Feb 02 10:38:31 crc systemd[1]: Starting Kubernetes Kubelet...
Feb 02 10:38:31 crc restorecon[4691]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:38:31 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:38:32 crc restorecon[4691]: 
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 
10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:38:32 crc 
restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 
10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:38:32 crc restorecon[4691]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 02 10:38:32 crc restorecon[4691]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 02 10:38:32 crc restorecon[4691]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Feb 02 10:38:33 crc kubenswrapper[4901]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 02 10:38:33 crc kubenswrapper[4901]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Feb 02 10:38:33 crc kubenswrapper[4901]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 02 10:38:33 crc kubenswrapper[4901]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
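The restorecon messages above are informational rather than errors: "not reset as customized by admin" means the file's current SELinux label matches a local customization, so restorecon deliberately leaves it in place instead of forcing the policy default (container_file_t is typically listed in the policy's customizable_types set, which is why the kubelet pod directories are skipped wholesale). A minimal Python sketch for inspecting the label restorecon saw, assuming an SELinux-enabled Linux host; selinux_label is an illustrative helper, not part of any tool appearing in this log:

    import os

    def selinux_label(path: str) -> str:
        """Return the SELinux label of path, read from the security.selinux
        extended attribute (the kernel stores it NUL-terminated)."""
        raw = os.getxattr(path, "security.selinux")
        return raw.rstrip(b"\x00").decode()

    if __name__ == "__main__":
        # Path taken from the log above; substitute any local file.
        p = "/var/lib/kubelet/plugins/csi-hostpath/csi.sock"
        print(p, "->", selinux_label(p))

On the node above this should print system_u:object_r:container_file_t:s0, matching the context restorecon reported it would not reset.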
Feb 02 10:38:33 crc kubenswrapper[4901]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 02 10:38:33 crc kubenswrapper[4901]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.379014 4901 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.383880 4901 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.383966 4901 feature_gate.go:330] unrecognized feature gate: Example Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.383977 4901 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.383988 4901 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384000 4901 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384011 4901 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384022 4901 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384030 4901 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384039 4901 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384048 4901 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384056 4901 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384077 4901 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384085 4901 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384093 4901 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384102 4901 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384115 4901 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
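Every "Flag ... has been deprecated" warning above points at the same remedy: move the setting into the KubeletConfiguration file passed via --config. A sketch of that mapping for the flags warned about here; the field names follow the upstream KubeletConfiguration v1beta1 API as best I recall and are assumptions to verify against the reference docs:

    # Deprecated kubelet flag -> assumed KubeletConfiguration (v1beta1) field,
    # or None where the warning says there is no direct replacement.
    DEPRECATED_FLAG_TO_CONFIG = {
        "--container-runtime-endpoint": "containerRuntimeEndpoint",
        "--volume-plugin-dir": "volumePluginDir",
        "--register-with-taints": "registerWithTaints",
        "--system-reserved": "systemReserved",
        "--pod-infra-container-image": None,  # CRI now reports the sandbox image
        "--minimum-container-ttl-duration": None,  # superseded by eviction settings
    }

    def migration_hint(flag: str) -> str:
        """Describe where a deprecated flag's value should move."""
        if flag not in DEPRECATED_FLAG_TO_CONFIG:
            return f"{flag}: not covered by this sketch"
        field = DEPRECATED_FLAG_TO_CONFIG[flag]
        if field is None:
            return f"{flag}: no direct config field; see the deprecation notice above"
        return f"{flag}: set '{field}' in the file passed via --config"

    for flag in DEPRECATED_FLAG_TO_CONFIG:
        print(migration_hint(flag))

Note the two flags with no one-to-one field: sandbox-image information now comes from the CRI, and the container TTL is superseded by the eviction-hard/eviction-soft settings, exactly as the warnings state.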
Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384127 4901 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384136 4901 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384147 4901 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384155 4901 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384165 4901 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384174 4901 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384184 4901 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384201 4901 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384212 4901 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384221 4901 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384235 4901 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384248 4901 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384261 4901 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384275 4901 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384284 4901 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384294 4901 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384304 4901 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384313 4901 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384331 4901 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384339 4901 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384348 4901 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384356 4901 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384365 4901 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384373 4901 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384382 4901 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384390 4901 feature_gate.go:330] unrecognized feature 
gate: ManagedBootImagesAWS Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384399 4901 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384407 4901 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384416 4901 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384424 4901 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384433 4901 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384455 4901 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384466 4901 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384475 4901 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384484 4901 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384493 4901 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384503 4901 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384512 4901 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384529 4901 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
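The unrecognized-gate warnings above reappear almost verbatim several times further down, apparently once per parsing pass over the gate set; a sketch that collapses them to one line per gate (kubelet-boot.log is an assumed file name; joining on whitespace also mends entries wrapped across lines in this dump):

import re
from collections import Counter

# Count each distinct "unrecognized feature gate" name across all of the
# repeated warning blocks in this excerpt.
text = " ".join(open("kubelet-boot.log").read().split())
gates = Counter(re.findall(r"unrecognized feature gate: (\S+)", text))
print(f"{len(gates)} distinct unrecognized gates, "
      f"{sum(gates.values())} warnings total")
for name, n in sorted(gates.items()):
    print(f"{n:3d}  {name}")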
Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384541 4901 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384551 4901 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384603 4901 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384618 4901 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384626 4901 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384636 4901 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384645 4901 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384654 4901 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384663 4901 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384673 4901 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384682 4901 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384693 4901 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384703 4901 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384712 4901 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384721 4901 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.384730 4901 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.386119 4901 flags.go:64] FLAG: --address="0.0.0.0" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.386650 4901 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.386681 4901 flags.go:64] FLAG: --anonymous-auth="true" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.386695 4901 flags.go:64] FLAG: --application-metrics-count-limit="100" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.386709 4901 flags.go:64] FLAG: --authentication-token-webhook="false" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.386721 4901 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.386739 4901 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.386755 4901 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.386768 4901 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.386780 4901 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.386793 4901 flags.go:64] FLAG: 
--bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.386803 4901 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.386813 4901 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.386823 4901 flags.go:64] FLAG: --cgroup-root="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.386832 4901 flags.go:64] FLAG: --cgroups-per-qos="true" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.386852 4901 flags.go:64] FLAG: --client-ca-file="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.386861 4901 flags.go:64] FLAG: --cloud-config="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.386871 4901 flags.go:64] FLAG: --cloud-provider="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.386879 4901 flags.go:64] FLAG: --cluster-dns="[]" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.386893 4901 flags.go:64] FLAG: --cluster-domain="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.386902 4901 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.386912 4901 flags.go:64] FLAG: --config-dir="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.386921 4901 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.386931 4901 flags.go:64] FLAG: --container-log-max-files="5" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.386974 4901 flags.go:64] FLAG: --container-log-max-size="10Mi" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.386983 4901 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.386993 4901 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387002 4901 flags.go:64] FLAG: --containerd-namespace="k8s.io" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387022 4901 flags.go:64] FLAG: --contention-profiling="false" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387032 4901 flags.go:64] FLAG: --cpu-cfs-quota="true" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387043 4901 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387055 4901 flags.go:64] FLAG: --cpu-manager-policy="none" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387064 4901 flags.go:64] FLAG: --cpu-manager-policy-options="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387080 4901 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387089 4901 flags.go:64] FLAG: --enable-controller-attach-detach="true" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387099 4901 flags.go:64] FLAG: --enable-debugging-handlers="true" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387110 4901 flags.go:64] FLAG: --enable-load-reader="false" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387119 4901 flags.go:64] FLAG: --enable-server="true" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387129 4901 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387143 4901 flags.go:64] FLAG: --event-burst="100" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387152 4901 flags.go:64] FLAG: --event-qps="50" 
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387161 4901 flags.go:64] FLAG: --event-storage-age-limit="default=0" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387170 4901 flags.go:64] FLAG: --event-storage-event-limit="default=0" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387179 4901 flags.go:64] FLAG: --eviction-hard="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387191 4901 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387200 4901 flags.go:64] FLAG: --eviction-minimum-reclaim="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387209 4901 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387218 4901 flags.go:64] FLAG: --eviction-soft="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387228 4901 flags.go:64] FLAG: --eviction-soft-grace-period="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387237 4901 flags.go:64] FLAG: --exit-on-lock-contention="false" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387246 4901 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387255 4901 flags.go:64] FLAG: --experimental-mounter-path="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387264 4901 flags.go:64] FLAG: --fail-cgroupv1="false" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387272 4901 flags.go:64] FLAG: --fail-swap-on="true" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387281 4901 flags.go:64] FLAG: --feature-gates="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387293 4901 flags.go:64] FLAG: --file-check-frequency="20s" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387303 4901 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387314 4901 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387323 4901 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387332 4901 flags.go:64] FLAG: --healthz-port="10248" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387342 4901 flags.go:64] FLAG: --help="false" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387351 4901 flags.go:64] FLAG: --hostname-override="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387360 4901 flags.go:64] FLAG: --housekeeping-interval="10s" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387370 4901 flags.go:64] FLAG: --http-check-frequency="20s" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387379 4901 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387392 4901 flags.go:64] FLAG: --image-credential-provider-config="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387401 4901 flags.go:64] FLAG: --image-gc-high-threshold="85" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387410 4901 flags.go:64] FLAG: --image-gc-low-threshold="80" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387419 4901 flags.go:64] FLAG: --image-service-endpoint="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387429 4901 flags.go:64] FLAG: --kernel-memcg-notification="false" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387439 4901 flags.go:64] FLAG: --kube-api-burst="100" Feb 02 10:38:33 crc 
kubenswrapper[4901]: I0202 10:38:33.387448 4901 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387459 4901 flags.go:64] FLAG: --kube-api-qps="50" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387469 4901 flags.go:64] FLAG: --kube-reserved="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387478 4901 flags.go:64] FLAG: --kube-reserved-cgroup="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387486 4901 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387496 4901 flags.go:64] FLAG: --kubelet-cgroups="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387505 4901 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387515 4901 flags.go:64] FLAG: --lock-file="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387523 4901 flags.go:64] FLAG: --log-cadvisor-usage="false" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387532 4901 flags.go:64] FLAG: --log-flush-frequency="5s" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387542 4901 flags.go:64] FLAG: --log-json-info-buffer-size="0" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387558 4901 flags.go:64] FLAG: --log-json-split-stream="false" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387603 4901 flags.go:64] FLAG: --log-text-info-buffer-size="0" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387612 4901 flags.go:64] FLAG: --log-text-split-stream="false" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387621 4901 flags.go:64] FLAG: --logging-format="text" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387631 4901 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387641 4901 flags.go:64] FLAG: --make-iptables-util-chains="true" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387650 4901 flags.go:64] FLAG: --manifest-url="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387662 4901 flags.go:64] FLAG: --manifest-url-header="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387687 4901 flags.go:64] FLAG: --max-housekeeping-interval="15s" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387697 4901 flags.go:64] FLAG: --max-open-files="1000000" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387710 4901 flags.go:64] FLAG: --max-pods="110" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387722 4901 flags.go:64] FLAG: --maximum-dead-containers="-1" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387735 4901 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387747 4901 flags.go:64] FLAG: --memory-manager-policy="None" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387758 4901 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387771 4901 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387781 4901 flags.go:64] FLAG: --node-ip="192.168.126.11" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387791 4901 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 
10:38:33.387820 4901 flags.go:64] FLAG: --node-status-max-images="50" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387829 4901 flags.go:64] FLAG: --node-status-update-frequency="10s" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387839 4901 flags.go:64] FLAG: --oom-score-adj="-999" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387848 4901 flags.go:64] FLAG: --pod-cidr="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387857 4901 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387871 4901 flags.go:64] FLAG: --pod-manifest-path="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387881 4901 flags.go:64] FLAG: --pod-max-pids="-1" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387890 4901 flags.go:64] FLAG: --pods-per-core="0" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387899 4901 flags.go:64] FLAG: --port="10250" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387909 4901 flags.go:64] FLAG: --protect-kernel-defaults="false" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387920 4901 flags.go:64] FLAG: --provider-id="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387929 4901 flags.go:64] FLAG: --qos-reserved="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387938 4901 flags.go:64] FLAG: --read-only-port="10255" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387948 4901 flags.go:64] FLAG: --register-node="true" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387957 4901 flags.go:64] FLAG: --register-schedulable="true" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387966 4901 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.387993 4901 flags.go:64] FLAG: --registry-burst="10" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.388003 4901 flags.go:64] FLAG: --registry-qps="5" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.388012 4901 flags.go:64] FLAG: --reserved-cpus="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.388023 4901 flags.go:64] FLAG: --reserved-memory="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.388035 4901 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.388044 4901 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.388054 4901 flags.go:64] FLAG: --rotate-certificates="false" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.388063 4901 flags.go:64] FLAG: --rotate-server-certificates="false" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.388072 4901 flags.go:64] FLAG: --runonce="false" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.388081 4901 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.388091 4901 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.388100 4901 flags.go:64] FLAG: --seccomp-default="false" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.388109 4901 flags.go:64] FLAG: --serialize-image-pulls="true" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.388118 4901 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Feb 02 10:38:33 crc 
kubenswrapper[4901]: I0202 10:38:33.388128 4901 flags.go:64] FLAG: --storage-driver-db="cadvisor" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.388137 4901 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.388147 4901 flags.go:64] FLAG: --storage-driver-password="root" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.388156 4901 flags.go:64] FLAG: --storage-driver-secure="false" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.388165 4901 flags.go:64] FLAG: --storage-driver-table="stats" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.388174 4901 flags.go:64] FLAG: --storage-driver-user="root" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.388183 4901 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.388193 4901 flags.go:64] FLAG: --sync-frequency="1m0s" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.388202 4901 flags.go:64] FLAG: --system-cgroups="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.388211 4901 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.388227 4901 flags.go:64] FLAG: --system-reserved-cgroup="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.388236 4901 flags.go:64] FLAG: --tls-cert-file="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.388245 4901 flags.go:64] FLAG: --tls-cipher-suites="[]" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.388258 4901 flags.go:64] FLAG: --tls-min-version="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.388267 4901 flags.go:64] FLAG: --tls-private-key-file="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.388275 4901 flags.go:64] FLAG: --topology-manager-policy="none" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.388284 4901 flags.go:64] FLAG: --topology-manager-policy-options="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.388293 4901 flags.go:64] FLAG: --topology-manager-scope="container" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.388303 4901 flags.go:64] FLAG: --v="2" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.388317 4901 flags.go:64] FLAG: --version="false" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.388329 4901 flags.go:64] FLAG: --vmodule="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.388340 4901 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.388350 4901 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.388711 4901 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.388727 4901 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.388742 4901 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.388754 4901 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.388764 4901 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.388774 4901 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 02 
10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.388782 4901 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.388789 4901 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.388797 4901 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.388805 4901 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.388813 4901 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.388821 4901 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.388832 4901 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.388841 4901 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.388849 4901 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.388856 4901 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.388865 4901 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.388875 4901 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.388884 4901 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.388894 4901 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.388904 4901 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.388914 4901 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.388924 4901 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.388933 4901 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.388941 4901 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.388949 4901 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.388958 4901 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.388971 4901 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.388981 4901 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.388991 4901 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.389000 4901 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.389014 4901 feature_gate.go:353] Setting GA 
feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.389028 4901 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.389039 4901 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.389092 4901 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.389102 4901 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.389112 4901 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.389123 4901 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.389135 4901 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.389144 4901 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.389154 4901 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.389165 4901 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.389173 4901 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.389181 4901 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.389193 4901 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.389203 4901 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.389212 4901 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.389222 4901 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.389233 4901 feature_gate.go:330] unrecognized feature gate: Example Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.389242 4901 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.389251 4901 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.389259 4901 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.389269 4901 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.389282 4901 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
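The flags.go:64 dump earlier in the boot (the FLAG: --name="value" lines) records the full effective command line. A sketch to fold it into a dict for inspection, again assuming the excerpt is saved as kubelet-boot.log:

import re

# Collect every FLAG: --name="value" entry into a single dict.
# kubelet-boot.log is an assumed file name holding this journal excerpt.
text = open("kubelet-boot.log").read()
flags = dict(re.findall(r'FLAG: --([\w-]+)="([^"]*)"', text))
print(f"{len(flags)} flags captured")
print("config file:", flags.get("config"))      # /etc/kubernetes/kubelet.conf
print("node IP:    ", flags.get("node-ip"))     # 192.168.126.11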
Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.389295 4901 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.389305 4901 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.389315 4901 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.389325 4901 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.389338 4901 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.389346 4901 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.389353 4901 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.389364 4901 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.389374 4901 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.389384 4901 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.389394 4901 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.389402 4901 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.389410 4901 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.389418 4901 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.389426 4901 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.389434 4901 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.389442 4901 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.389456 4901 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.401880 4901 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.401917 4901 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402004 4901 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402014 4901 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402019 4901 
feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402025 4901 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402031 4901 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402035 4901 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402040 4901 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402045 4901 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402050 4901 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402055 4901 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402060 4901 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402065 4901 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402071 4901 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402076 4901 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402083 4901 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402088 4901 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402094 4901 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402099 4901 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402105 4901 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402111 4901 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402116 4901 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402121 4901 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402126 4901 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402131 4901 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402137 4901 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402143 4901 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402149 4901 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402155 4901 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402161 4901 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402166 4901 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402171 4901 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402175 4901 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402180 4901 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402198 4901 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402212 4901 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402218 4901 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402222 4901 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402227 4901 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402232 4901 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402239 4901 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402245 4901 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402250 4901 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402255 4901 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402260 4901 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402265 4901 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402272 4901 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
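The effective gate set is logged above in Go map syntax (feature_gate.go:386: "feature gates: {map[...]}"). A sketch that parses the first such line into a Python dict; same assumed file name:

import re

# Parse the Go-syntax map of effective feature gates into a Python dict
# and split it into enabled/disabled names.
text = open("kubelet-boot.log").read()
m = re.search(r"feature gates: \{map\[([^\]]+)\]\}", text)
if m:
    gates = {
        name: value == "true"
        for name, value in (pair.split(":") for pair in m.group(1).split())
    }
    print("enabled: ", sorted(k for k, v in gates.items() if v))
    print("disabled:", sorted(k for k, v in gates.items() if not v))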
Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402278 4901 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402283 4901 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402289 4901 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402295 4901 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402301 4901 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402307 4901 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402314 4901 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402320 4901 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402326 4901 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402331 4901 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402336 4901 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402341 4901 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402347 4901 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402353 4901 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402358 4901 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402363 4901 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402368 4901 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402373 4901 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402378 4901 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402383 4901 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402388 4901 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402393 4901 feature_gate.go:330] unrecognized feature gate: Example Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402397 4901 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402402 4901 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402415 4901 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.402424 4901 
feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402595 4901 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402603 4901 feature_gate.go:330] unrecognized feature gate: Example Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402608 4901 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402613 4901 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402618 4901 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402623 4901 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402628 4901 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402633 4901 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402638 4901 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402643 4901 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402648 4901 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402652 4901 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402658 4901 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402663 4901 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402669 4901 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402674 4901 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402679 4901 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402685 4901 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402691 4901 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402696 4901 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402701 4901 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402706 4901 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402711 4901 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402716 4901 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402720 4901 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402725 4901 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402730 4901 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402737 4901 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402742 4901 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402747 4901 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402752 4901 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402756 4901 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402761 4901 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402766 4901 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402779 4901 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402784 4901 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402789 4901 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402794 4901 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402799 4901 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402805 4901 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
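Alongside the unrecognized gates, a few notices report gates that are set explicitly but are GA or deprecated and due for removal (KMSv1, CloudDualStackNodeIPs, DisableKubeletCloudCredentialProviders, ValidatingAdmissionPolicy in this excerpt). A sketch to pull just those out; same assumed file name:

import re
from collections import Counter

# Extract the "Setting GA/deprecated feature gate NAME=VALUE." notices,
# which flag settings scheduled for removal in a future release.
text = open("kubelet-boot.log").read()
notices = Counter(
    re.findall(r"Setting (GA|deprecated) feature gate (\S+?=\S+?)\.", text)
)
for (kind, setting), n in sorted(notices.items()):
    print(f"{kind:10s} {setting}  (logged {n}x)")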
Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402811 4901 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402817 4901 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402823 4901 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402828 4901 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402834 4901 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402839 4901 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402845 4901 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402850 4901 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402855 4901 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402860 4901 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402865 4901 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402870 4901 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402875 4901 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402879 4901 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402884 4901 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402889 4901 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402894 4901 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402898 4901 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402903 4901 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402910 4901 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402915 4901 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402920 4901 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402925 4901 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402929 4901 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402934 4901 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402940 4901 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402945 4901 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402950 4901 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402957 4901 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402964 4901 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.402981 4901 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.402989 4901 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.404159 4901 server.go:940] "Client rotation is on, will bootstrap in background"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.408724 4901 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.408823 4901 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.410888 4901 server.go:997] "Starting client certificate rotation"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.410967 4901 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.412148 4901 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-10 04:57:37.77906226 +0000 UTC
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.412349 4901 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.438397 4901 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 02 10:38:33 crc kubenswrapper[4901]: E0202 10:38:33.443973 4901 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.227:6443: connect: connection refused" logger="UnhandledError"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.445699 4901 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.466556 4901 log.go:25] "Validated CRI v1 runtime API"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.508828 4901 log.go:25] "Validated CRI v1 image API"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.512717 4901 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.519051 4901 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-02-10-29-00-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.519104 4901 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.554209 4901 manager.go:217] Machine: {Timestamp:2026-02-02 10:38:33.550129252 +0000 UTC m=+0.568469438 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:9f2b45ef-ead6-4cce-86c8-26b6d26ee095 BootID:f285d71e-6d99-440a-8549-56e6a3710e3e Filesystems:[{Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:f4:fb:68 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:f4:fb:68 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:58:a4:11 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:a7:77:87 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:10:60:ae Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:80:89:18 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:c2:e5:cb:fb:4b:46 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:f6:4b:61:31:14:d4 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.554805 4901 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.555080 4901 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.555601 4901 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.555936 4901 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.555990 4901 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.556393 4901 topology_manager.go:138] "Creating topology manager with none policy"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.556422 4901 container_manager_linux.go:303] "Creating device plugin manager"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.557059 4901 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.557113 4901 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.557470 4901 state_mem.go:36] "Initialized new in-memory state store"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.557664 4901 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.563718 4901 kubelet.go:418] "Attempting to sync node with API server"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.563775 4901 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.563839 4901 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.563863 4901 kubelet.go:324] "Adding apiserver pod source"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.563882 4901 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.568932 4901 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.569031 4901 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.227:6443: connect: connection refused
Feb 02 10:38:33 crc kubenswrapper[4901]: E0202 10:38:33.569148 4901 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.227:6443: connect: connection refused" logger="UnhandledError"
Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.569324 4901 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.227:6443: connect: connection refused
Feb 02 10:38:33 crc kubenswrapper[4901]: E0202 10:38:33.569443 4901 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.227:6443: connect: connection refused" logger="UnhandledError"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.570266 4901 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.573023 4901 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.575345 4901 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.575393 4901 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.575409 4901 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.575422 4901 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.575447 4901 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.575462 4901 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.575475 4901 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.575499 4901 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.575516 4901 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.575531 4901 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.575552 4901 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.575594 4901 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.576521 4901 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.577446 4901 server.go:1280] "Started kubelet"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.577500 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.227:6443: connect: connection refused
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.578791 4901 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.578793 4901 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.579957 4901 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Feb 02 10:38:33 crc systemd[1]: Started Kubernetes Kubelet.
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.581118 4901 server.go:460] "Adding debug handlers to kubelet server"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.581357 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.585015 4901 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.585625 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 02:16:02.027426223 +0000 UTC
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.587678 4901 volume_manager.go:287] "The desired_state_of_world populator starts"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.587705 4901 volume_manager.go:289] "Starting Kubelet Volume Manager"
Feb 02 10:38:33 crc kubenswrapper[4901]: E0202 10:38:33.588526 4901 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.588324 4901 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.590146 4901 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.227:6443: connect: connection refused
Feb 02 10:38:33 crc kubenswrapper[4901]: E0202 10:38:33.590201 4901 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.227:6443: connect: connection refused" interval="200ms"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.589772 4901 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.590469 4901 factory.go:55] Registering systemd factory
Feb 02 10:38:33 crc kubenswrapper[4901]: E0202 10:38:33.590288 4901 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.227:6443: connect: connection refused" logger="UnhandledError"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.590494 4901 factory.go:221] Registration of the systemd container factory successfully
Feb 02 10:38:33 crc kubenswrapper[4901]: E0202 10:38:33.589509 4901 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.227:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189067bcab05daf2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-02 10:38:33.57740517 +0000 UTC m=+0.595745296,LastTimestamp:2026-02-02 10:38:33.57740517 +0000 UTC m=+0.595745296,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.592473 4901 factory.go:153] Registering CRI-O factory
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.592503 4901 factory.go:221] Registration of the crio container factory successfully
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.592531 4901 factory.go:103] Registering Raw factory
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.592584 4901 manager.go:1196] Started watching for new ooms in manager
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.598003 4901 manager.go:319] Starting recovery of all containers
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.605960 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.606173 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.606193 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.606206 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.606220 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.606233 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.606260 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.606274 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.606289 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.606310 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.606323 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.606339 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.606382 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.606399 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.606411 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.606425 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.606437 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.606452 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.606466 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.606510 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.606525 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.606539 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.606551 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.606590 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.606603 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.606616 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.606652 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.606667 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.606691 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.606702 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.606713 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.606727 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.606739 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.606751 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.606764 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.606775 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.606831 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.606844 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.606856 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.606869 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.606881 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.606969 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.606984 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607000 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607016 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607036 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607050 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607062 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607075 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607088 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607100 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607114 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607139 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607160 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607176 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607189 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607204 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607216 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607228 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607243 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607255 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607267 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607280 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607292 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607305 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607317 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607330 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607342 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607355 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607369 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607382 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607394 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607405 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607417 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607428 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607443 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607455 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607467 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607478 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607492 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607504 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607515 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607597 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607609 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607621 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607634 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607646 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607657 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607669 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607682 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607693 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607707 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607718 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607729 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607742 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607755 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607767 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607781 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607794 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607835 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607848 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607869 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607882 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607893 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607916 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607931 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607944 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607958 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607971 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.607986 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.608000 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.608013 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.608026 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.608039 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.608051 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.608062 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.608074 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.608089 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.608100 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.608118 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.608133 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.608156 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.608173 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.608190 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.608201 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.608212 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.611011 4901 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.611546 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.612319 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.612341 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.612365 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.612390 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.612405 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.612421 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.612443 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.612460 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.612478 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.612493 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.612507 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.612524 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.612538 4901 reconstruct.go:130] "Volume is marked as
uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.612552 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.612587 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.612601 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.612615 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.612629 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.612646 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.612660 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.612680 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.612692 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.612715 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.612729 4901 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.612748 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.612767 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.612785 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.612806 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.612822 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.612839 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.612855 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.612867 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.612886 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.612902 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.612916 4901 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.612942 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.612957 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.612969 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.612982 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.612995 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.613007 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.613056 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.613078 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.613106 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.613139 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.613168 4901 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.613187 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.613208 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.613236 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.613257 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.613284 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.613305 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.613334 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.613353 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.613382 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.613401 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.613421 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.613450 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.613470 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.613492 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.613512 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.613535 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.613588 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.613609 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.613630 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.613651 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.613681 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.613709 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.613730 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.613757 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.613785 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.613805 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.613828 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.613856 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.613877 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.613902 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.613922 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.613941 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.613962 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.613981 4901 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.614002 4901 reconstruct.go:97] "Volume reconstruction finished" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.614017 4901 reconciler.go:26] "Reconciler: start to sync state" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.637877 4901 manager.go:324] Recovery completed Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.672141 4901 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.675358 4901 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.675419 4901 status_manager.go:217] "Starting to sync pod status with apiserver" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.675449 4901 kubelet.go:2335] "Starting kubelet main sync loop" Feb 02 10:38:33 crc kubenswrapper[4901]: E0202 10:38:33.675508 4901 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 02 10:38:33 crc kubenswrapper[4901]: W0202 10:38:33.676656 4901 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.227:6443: connect: connection refused Feb 02 10:38:33 crc kubenswrapper[4901]: E0202 10:38:33.676711 4901 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.227:6443: connect: connection refused" logger="UnhandledError" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.677732 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.679422 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.679453 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.679466 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.680061 4901 cpu_manager.go:225] "Starting CPU manager" policy="none" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.680086 4901 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.680109 4901 state_mem.go:36] "Initialized new in-memory state store" Feb 02 10:38:33 crc kubenswrapper[4901]: E0202 10:38:33.689498 4901 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 02 10:38:33 crc 
kubenswrapper[4901]: I0202 10:38:33.705269 4901 policy_none.go:49] "None policy: Start" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.705988 4901 memory_manager.go:170] "Starting memorymanager" policy="None" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.706018 4901 state_mem.go:35] "Initializing new in-memory state store" Feb 02 10:38:33 crc kubenswrapper[4901]: E0202 10:38:33.775766 4901 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.776317 4901 manager.go:334] "Starting Device Plugin manager" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.776430 4901 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.776471 4901 server.go:79] "Starting device plugin registration server" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.777244 4901 eviction_manager.go:189] "Eviction manager: starting control loop" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.777284 4901 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.777546 4901 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.777713 4901 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.777729 4901 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 02 10:38:33 crc kubenswrapper[4901]: E0202 10:38:33.787913 4901 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 02 10:38:33 crc kubenswrapper[4901]: E0202 10:38:33.790954 4901 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.227:6443: connect: connection refused" interval="400ms" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.878285 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.879903 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.879964 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.879981 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.880023 4901 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 02 10:38:33 crc kubenswrapper[4901]: E0202 10:38:33.880686 4901 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.227:6443: connect: connection refused" node="crc" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.976275 4901 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.976527 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.979008 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.979077 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.979096 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.979330 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.980615 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.980683 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.980838 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.980869 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.980885 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.981099 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.981310 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.981364 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.982556 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.982617 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.982632 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.982705 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.982658 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.982758 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.984011 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.984482 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.984725 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.985504 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.985623 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.985987 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.987418 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.987441 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.987449 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.988415 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.988463 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.988478 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.988700 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.988923 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.989001 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.991098 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.991124 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.991140 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.993212 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.993447 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.993679 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.994415 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.995325 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.997532 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.997781 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:38:33 crc kubenswrapper[4901]: I0202 10:38:33.998085 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.019585 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.019654 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.019796 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.019835 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.019863 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.019894 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.019961 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.020106 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.020183 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.020224 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.020259 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.020291 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.020345 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.020421 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.020470 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.080839 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.085133 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.085209 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 
10:38:34.085231 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.085285 4901 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 02 10:38:34 crc kubenswrapper[4901]: E0202 10:38:34.086765 4901 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.227:6443: connect: connection refused" node="crc" Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.122462 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.122550 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.122675 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.122687 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.122723 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.122755 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.122769 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.122776 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 
10:38:34.122814 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.122788 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.122879 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.122908 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.122907 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.122970 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.123042 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.123059 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.123090 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.123163 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.123209 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.123280 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.123314 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.123325 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.123348 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.123397 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.123417 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.123429 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.123443 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.123480 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.123488 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.123810 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 02 10:38:34 crc kubenswrapper[4901]: E0202 10:38:34.192251 4901 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.227:6443: connect: connection refused" interval="800ms"
Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.311995 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.334665 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.358824 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.374486 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.381936 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 02 10:38:34 crc kubenswrapper[4901]: W0202 10:38:34.387695 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-96e8e1bdc90a4b9612fdf92afb3922cb6d139140c649ed7f6625db43a33ff745 WatchSource:0}: Error finding container 96e8e1bdc90a4b9612fdf92afb3922cb6d139140c649ed7f6625db43a33ff745: Status 404 returned error can't find the container with id 96e8e1bdc90a4b9612fdf92afb3922cb6d139140c649ed7f6625db43a33ff745
Feb 02 10:38:34 crc kubenswrapper[4901]: W0202 10:38:34.410139 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-0bfc44234fd921747797b27909eeab561c2b53d220044ed063ce1b975f69e025 WatchSource:0}: Error finding container 0bfc44234fd921747797b27909eeab561c2b53d220044ed063ce1b975f69e025: Status 404 returned error can't find the container with id 0bfc44234fd921747797b27909eeab561c2b53d220044ed063ce1b975f69e025
Feb 02 10:38:34 crc kubenswrapper[4901]: W0202 10:38:34.414067 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-2ebe9a9b23e86504aa11eeeb6809acbe96fde7f603e375df6777f60ca26403fb WatchSource:0}: Error finding container 2ebe9a9b23e86504aa11eeeb6809acbe96fde7f603e375df6777f60ca26403fb: Status 404 returned error can't find the container with id 2ebe9a9b23e86504aa11eeeb6809acbe96fde7f603e375df6777f60ca26403fb
Feb 02 10:38:34 crc kubenswrapper[4901]: W0202 10:38:34.415721 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-e44d0874332de929ad0fc74272da96996e8dc778ce115cdb8065bd1fd31a6e2f WatchSource:0}: Error finding container e44d0874332de929ad0fc74272da96996e8dc778ce115cdb8065bd1fd31a6e2f: Status 404 returned error can't find the container with id e44d0874332de929ad0fc74272da96996e8dc778ce115cdb8065bd1fd31a6e2f
Feb 02 10:38:34 crc kubenswrapper[4901]: W0202 10:38:34.442208 4901 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.227:6443: connect: connection refused
Feb 02 10:38:34 crc kubenswrapper[4901]: E0202 10:38:34.442319 4901 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.227:6443: connect: connection refused" logger="UnhandledError"
Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.487869 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.490093 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.490154 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.490174 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.490219 4901 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 02 10:38:34 crc kubenswrapper[4901]: E0202 10:38:34.491010 4901 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.227:6443: connect: connection refused" node="crc"
Feb 02 10:38:34 crc kubenswrapper[4901]: W0202 10:38:34.554831 4901 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.227:6443: connect: connection refused
Feb 02 10:38:34 crc kubenswrapper[4901]: E0202 10:38:34.555011 4901 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.227:6443: connect: connection refused" logger="UnhandledError"
Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.578639 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.227:6443: connect: connection refused
Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.586781 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 23:55:44.899285127 +0000 UTC
Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.683043 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"96e8e1bdc90a4b9612fdf92afb3922cb6d139140c649ed7f6625db43a33ff745"}
Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.684591 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e4c1668ca7133aeba08dc740d3d65592d64ec8a85715bd385a769a7c6cf97752"}
Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.686041 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e44d0874332de929ad0fc74272da96996e8dc778ce115cdb8065bd1fd31a6e2f"}
Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.687341 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2ebe9a9b23e86504aa11eeeb6809acbe96fde7f603e375df6777f60ca26403fb"}
Feb 02 10:38:34 crc kubenswrapper[4901]: I0202 10:38:34.688871 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0bfc44234fd921747797b27909eeab561c2b53d220044ed063ce1b975f69e025"}
Feb 02 10:38:34 crc kubenswrapper[4901]: W0202 10:38:34.780094 4901 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.227:6443: connect: connection refused
Feb 02 10:38:34 crc kubenswrapper[4901]: E0202 10:38:34.780221 4901 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.227:6443: connect: connection refused" logger="UnhandledError"
Feb 02 10:38:34 crc kubenswrapper[4901]: E0202 10:38:34.993097 4901 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.227:6443: connect: connection refused" interval="1.6s"
Feb 02 10:38:35 crc kubenswrapper[4901]: W0202 10:38:35.180055 4901 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.227:6443: connect: connection refused
Feb 02 10:38:35 crc kubenswrapper[4901]: E0202 10:38:35.180225 4901 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.227:6443: connect: connection refused" logger="UnhandledError"
Feb 02 10:38:35 crc kubenswrapper[4901]: I0202 10:38:35.291133 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 02 10:38:35 crc kubenswrapper[4901]: I0202 10:38:35.292448 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:38:35 crc kubenswrapper[4901]: I0202 10:38:35.292484 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:38:35 crc kubenswrapper[4901]: I0202 10:38:35.292502 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:38:35 crc kubenswrapper[4901]: I0202 10:38:35.292532 4901 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 02 10:38:35 crc kubenswrapper[4901]: E0202 10:38:35.293214 4901 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.227:6443: connect: connection refused" node="crc"
Feb 02 10:38:35 crc kubenswrapper[4901]: I0202 10:38:35.561113 4901 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 02 10:38:35 crc kubenswrapper[4901]: E0202 10:38:35.563942 4901 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.227:6443: connect: connection refused" logger="UnhandledError"
Feb 02 10:38:35 crc kubenswrapper[4901]: I0202 10:38:35.579038 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.227:6443: connect: connection refused
Feb 02 10:38:35 crc kubenswrapper[4901]: I0202 10:38:35.587203 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 21:24:45.685048491 +0000 UTC
Feb 02 10:38:35 crc kubenswrapper[4901]: I0202 10:38:35.695259 4901 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7" exitCode=0
Feb 02 10:38:35 crc kubenswrapper[4901]: I0202 10:38:35.695344 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7"}
Feb 02 10:38:35 crc kubenswrapper[4901]: I0202 10:38:35.695548 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 02 10:38:35 crc kubenswrapper[4901]: I0202 10:38:35.697065 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:38:35 crc kubenswrapper[4901]: I0202 10:38:35.697125 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:38:35 crc kubenswrapper[4901]: I0202 10:38:35.697148 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:38:35 crc kubenswrapper[4901]: I0202 10:38:35.698386 4901 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="5dc67c5ad754da6f48d400f53cfd5e03ba021533ecca6d637253325024904e78" exitCode=0
Feb 02 10:38:35 crc kubenswrapper[4901]: I0202 10:38:35.698470 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"5dc67c5ad754da6f48d400f53cfd5e03ba021533ecca6d637253325024904e78"}
Feb 02 10:38:35 crc kubenswrapper[4901]: I0202 10:38:35.698475 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 02 10:38:35 crc kubenswrapper[4901]: I0202 10:38:35.700017 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:38:35 crc kubenswrapper[4901]: I0202 10:38:35.700106 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:38:35 crc kubenswrapper[4901]: I0202 10:38:35.700129 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:38:35 crc kubenswrapper[4901]: I0202 10:38:35.700341 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 02 10:38:35 crc kubenswrapper[4901]: I0202 10:38:35.701404 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:38:35 crc kubenswrapper[4901]: I0202 10:38:35.701450 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:38:35 crc kubenswrapper[4901]: I0202 10:38:35.701468 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:38:35 crc kubenswrapper[4901]: I0202 10:38:35.703042 4901 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="facb5cb4dfae77749307f248a75208528aedc677b507170b7657dac537ba35cb" exitCode=0
Feb 02 10:38:35 crc kubenswrapper[4901]: I0202 10:38:35.703171 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"facb5cb4dfae77749307f248a75208528aedc677b507170b7657dac537ba35cb"}
Feb 02 10:38:35 crc kubenswrapper[4901]: I0202 10:38:35.703212 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 02 10:38:35 crc kubenswrapper[4901]: I0202 10:38:35.704688 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:38:35 crc kubenswrapper[4901]: I0202 10:38:35.704735 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:38:35 crc kubenswrapper[4901]: I0202 10:38:35.704755 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:38:35 crc kubenswrapper[4901]: I0202 10:38:35.706216 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e3416be5e186cc0595234f36c8a59e1c2edadb59a8abe4121b274ea7dea70815"}
Feb 02 10:38:35 crc kubenswrapper[4901]: I0202 10:38:35.706269 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a"}
Feb 02 10:38:35 crc kubenswrapper[4901]: I0202 10:38:35.708602 4901 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="aaea00fb2c2d213c6ecf68803b14451f10a2d8cc7c345ef9286face6e17a967c" exitCode=0
Feb 02 10:38:35 crc kubenswrapper[4901]: I0202 10:38:35.708672 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"aaea00fb2c2d213c6ecf68803b14451f10a2d8cc7c345ef9286face6e17a967c"}
Feb 02 10:38:35 crc kubenswrapper[4901]: I0202 10:38:35.708833 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 02 10:38:35 crc kubenswrapper[4901]: I0202 10:38:35.710022 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:38:35 crc kubenswrapper[4901]: I0202 10:38:35.710082 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:38:35 crc kubenswrapper[4901]: I0202 10:38:35.710101 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:38:36 crc kubenswrapper[4901]: W0202 10:38:36.524053 4901 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.227:6443: connect: connection refused
Feb 02 10:38:36 crc kubenswrapper[4901]: E0202 10:38:36.524160 4901 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.227:6443: connect: connection refused" logger="UnhandledError"
Feb 02 10:38:36 crc kubenswrapper[4901]: I0202 10:38:36.579002 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.227:6443: connect: connection refused
Feb 02 10:38:36 crc kubenswrapper[4901]: I0202 10:38:36.588085 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 01:26:05.045091576 +0000 UTC
Feb 02 10:38:36 crc kubenswrapper[4901]: E0202 10:38:36.594086 4901 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.227:6443: connect: connection refused" interval="3.2s"
Feb 02 10:38:36 crc kubenswrapper[4901]: I0202 10:38:36.725834 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4e6c9912bb4f435743eaadda76c0ef0fbc6492b740836407b274a89801ec5082"}
Feb 02 10:38:36 crc kubenswrapper[4901]: I0202 10:38:36.725921 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7ac57745af3759e45361b7c59e94701e4b74dcbfa259eb2e8690578d5b2fa601"}
Feb 02 10:38:36 crc kubenswrapper[4901]: I0202 10:38:36.726044 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 02 10:38:36 crc kubenswrapper[4901]: I0202 10:38:36.727114 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:38:36 crc kubenswrapper[4901]: I0202 10:38:36.727147 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:38:36 crc kubenswrapper[4901]: I0202 10:38:36.727161 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:38:36 crc kubenswrapper[4901]: I0202 10:38:36.733629 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c40acb59180d4010eb77503b43609af90e6bde8e3348a6475bf5133ceff68cda"}
Feb 02 10:38:36 crc kubenswrapper[4901]: I0202 10:38:36.733673 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f1b664a2819886c06964f67db41f98a7dac531113f4263eaa8162c09a7d65f28"}
Feb 02 10:38:36 crc kubenswrapper[4901]: I0202 10:38:36.733687 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2828a61512e86689adb0a4e4f64eec20ab53ca87cdb2cf6c17e39a93879d9890"}
Feb 02 10:38:36 crc kubenswrapper[4901]: I0202 10:38:36.733786 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 02 10:38:36 crc kubenswrapper[4901]: I0202 10:38:36.734636 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:38:36 crc kubenswrapper[4901]: I0202 10:38:36.734666 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:38:36 crc kubenswrapper[4901]: I0202 10:38:36.734674 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:38:36 crc kubenswrapper[4901]: I0202 10:38:36.738696 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4"}
Feb 02 10:38:36 crc kubenswrapper[4901]: I0202 10:38:36.738725 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e"}
Feb 02 10:38:36 crc kubenswrapper[4901]: I0202 10:38:36.738736 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0"}
Feb 02 10:38:36 crc kubenswrapper[4901]: I0202 10:38:36.738748 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b"}
Feb 02 10:38:36 crc kubenswrapper[4901]: I0202 10:38:36.740966 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"5a7e50f1526f6acb88dc5fb3d0145fb2bc8a5a6d778aff7eb4d41c0cf920db9f"}
Feb 02 10:38:36 crc kubenswrapper[4901]: I0202 10:38:36.741037 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 02 10:38:36 crc kubenswrapper[4901]: I0202 10:38:36.742002 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:38:36 crc kubenswrapper[4901]: I0202 10:38:36.742026 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:38:36 crc kubenswrapper[4901]: I0202 10:38:36.742036 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:38:36 crc kubenswrapper[4901]: I0202 10:38:36.743512 4901 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="fa7fc8f80da50698937afa7529176b094aeafef1a69da78093ca0983b08e09c5" exitCode=0
Feb 02 10:38:36 crc kubenswrapper[4901]: I0202 10:38:36.743579 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"fa7fc8f80da50698937afa7529176b094aeafef1a69da78093ca0983b08e09c5"}
Feb 02 10:38:36 crc kubenswrapper[4901]: I0202 10:38:36.743658 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 02 10:38:36 crc kubenswrapper[4901]: I0202 10:38:36.744423 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:38:36 crc kubenswrapper[4901]: I0202 10:38:36.744448 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:38:36 crc kubenswrapper[4901]: I0202 10:38:36.744458 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:38:36 crc kubenswrapper[4901]: W0202 10:38:36.864322 4901 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.227:6443: connect: connection refused
Feb 02 10:38:36 crc kubenswrapper[4901]: E0202 10:38:36.864428 4901 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.227:6443: connect: connection refused" logger="UnhandledError"
Feb 02 10:38:36 crc kubenswrapper[4901]: I0202 10:38:36.894150 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 02 10:38:36 crc kubenswrapper[4901]: I0202 10:38:36.895959 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:38:36 crc kubenswrapper[4901]: I0202 10:38:36.895999 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:38:36 crc kubenswrapper[4901]: I0202 10:38:36.896012 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:38:36 crc kubenswrapper[4901]: I0202 10:38:36.896047 4901 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 02 10:38:36 crc kubenswrapper[4901]: E0202 10:38:36.896599 4901 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.227:6443: connect: connection refused" node="crc"
Feb 02 10:38:36 crc kubenswrapper[4901]: W0202 10:38:36.980999 4901 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.227:6443: connect: connection refused
Feb 02 10:38:36 crc kubenswrapper[4901]: E0202 10:38:36.981079 4901 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.227:6443: connect: connection refused" logger="UnhandledError"
Feb 02 10:38:37 crc kubenswrapper[4901]: W0202 10:38:37.138245 4901 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.227:6443: connect: connection refused
Feb 02 10:38:37 crc kubenswrapper[4901]: E0202 10:38:37.138350 4901 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.227:6443: connect: connection refused" logger="UnhandledError"
Feb 02 10:38:37 crc kubenswrapper[4901]: I0202 10:38:37.588716 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 16:46:29.194631209 +0000 UTC
Feb 02 10:38:37 crc kubenswrapper[4901]: I0202 10:38:37.749271 4901 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="4214343fba39f507e5b2beadcbfc8ccdd526abbcfc93f4dd115c7f27101974fe" exitCode=0
Feb 02 10:38:37 crc kubenswrapper[4901]: I0202 10:38:37.749354 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"4214343fba39f507e5b2beadcbfc8ccdd526abbcfc93f4dd115c7f27101974fe"}
Feb 02 10:38:37 crc kubenswrapper[4901]: I0202 10:38:37.749461 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 02 10:38:37 crc kubenswrapper[4901]: I0202 10:38:37.750745 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:38:37 crc kubenswrapper[4901]: I0202 10:38:37.750785 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:38:37 crc kubenswrapper[4901]: I0202 10:38:37.750798 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:38:37 crc kubenswrapper[4901]: I0202 10:38:37.757422 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595"}
Feb 02 10:38:37 crc kubenswrapper[4901]: I0202 10:38:37.757555 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 02 10:38:37 crc kubenswrapper[4901]: I0202 10:38:37.757628 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 02 10:38:37 crc kubenswrapper[4901]: I0202 10:38:37.757672 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 02 10:38:37 crc kubenswrapper[4901]: I0202 10:38:37.758853 4901 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 02 10:38:37 crc kubenswrapper[4901]: I0202 10:38:37.758895 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 02 10:38:37 crc kubenswrapper[4901]: I0202 10:38:37.763658 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:38:37 crc kubenswrapper[4901]: I0202 10:38:37.763699 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:38:37 crc kubenswrapper[4901]: I0202 10:38:37.763717 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:38:37 crc kubenswrapper[4901]: I0202 10:38:37.763743 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:38:37 crc kubenswrapper[4901]: I0202 10:38:37.763784 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:38:37 crc kubenswrapper[4901]: I0202 10:38:37.763801 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:38:37 crc kubenswrapper[4901]: I0202 10:38:37.763669 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:38:37 crc kubenswrapper[4901]: I0202 10:38:37.763748 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:38:37 crc kubenswrapper[4901]: I0202 10:38:37.763883 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:38:37 crc kubenswrapper[4901]: I0202 10:38:37.763857 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:38:37 crc kubenswrapper[4901]: I0202 10:38:37.763898 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:38:37 crc kubenswrapper[4901]: I0202 10:38:37.763915 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:38:38 crc kubenswrapper[4901]: I0202 10:38:38.589937 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 11:10:46.192169796 +0000 UTC
Feb 02 10:38:38 crc kubenswrapper[4901]: I0202 10:38:38.750821 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 02 10:38:38 crc kubenswrapper[4901]: I0202 10:38:38.768513 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"466c7307c1600599f0bd0987a526afbae3d2225104db39883ff281a95e2fb4d6"}
Feb 02 10:38:38 crc kubenswrapper[4901]: I0202 10:38:38.768643 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"599cc4f40cb3b64b89c9003cde53886e75b23e3a484c94adc7eec84795007662"}
Feb 02 10:38:38 crc kubenswrapper[4901]: I0202 10:38:38.768710 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 02 10:38:38 crc kubenswrapper[4901]: I0202 10:38:38.768750 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2cfcaa168a201d284d5e182e674eaccf7a6862cc79525c5db964334e403552af"}
Feb 02 10:38:38 crc kubenswrapper[4901]: I0202 10:38:38.768654 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 02 10:38:38 crc kubenswrapper[4901]: I0202 10:38:38.770409 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:38:38 crc kubenswrapper[4901]: I0202 10:38:38.770477 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:38:38 crc kubenswrapper[4901]: I0202 10:38:38.770503 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:38:39 crc kubenswrapper[4901]: I0202 10:38:39.353380 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 02 10:38:39 crc kubenswrapper[4901]: I0202 10:38:39.353723 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 02 10:38:39 crc kubenswrapper[4901]: I0202 10:38:39.355830 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:38:39 crc kubenswrapper[4901]: I0202 10:38:39.355908 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:38:39 crc kubenswrapper[4901]: I0202 10:38:39.355928 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:38:39 crc kubenswrapper[4901]: I0202 10:38:39.590907 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 15:39:38.435535074 +0000 UTC
Feb 02 10:38:39 crc kubenswrapper[4901]: I0202 10:38:39.779118 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c73e8307f02e603eac0fcc5c6414b53f779654684275da47fb71e8ef0d43bc0a"}
Feb 02 10:38:39 crc kubenswrapper[4901]: I0202 10:38:39.779173 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"56cfbeb6e611dadca9958df41f83a4bb8df5402e7bf02bcb233fce14ed72f715"}
Feb 02 10:38:39 crc kubenswrapper[4901]: I0202 10:38:39.779268 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 02 10:38:39 crc kubenswrapper[4901]: I0202 10:38:39.779392 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 02 10:38:39 crc kubenswrapper[4901]: I0202 10:38:39.780949 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:38:39 crc kubenswrapper[4901]: I0202 10:38:39.781010 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:38:39 crc kubenswrapper[4901]: I0202 10:38:39.781027 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:38:39 crc kubenswrapper[4901]: I0202 10:38:39.781314 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:38:39 crc kubenswrapper[4901]: I0202 10:38:39.781382 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:38:39 crc kubenswrapper[4901]: I0202 10:38:39.781410 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:38:39 crc kubenswrapper[4901]: I0202 10:38:39.934471 4901 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 02 10:38:40 crc kubenswrapper[4901]: I0202 10:38:40.096936 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 02 10:38:40 crc kubenswrapper[4901]: I0202 10:38:40.099002 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:38:40 crc kubenswrapper[4901]: I0202 10:38:40.099096 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:38:40 crc kubenswrapper[4901]: I0202 10:38:40.099132 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:38:40 crc kubenswrapper[4901]: I0202 10:38:40.099186 4901 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 02 10:38:40 crc kubenswrapper[4901]: I0202 10:38:40.284034 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 02 10:38:40 crc kubenswrapper[4901]: I0202 10:38:40.284864 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 02 10:38:40 crc kubenswrapper[4901]: I0202 10:38:40.287078 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:38:40 crc kubenswrapper[4901]: I0202 10:38:40.287142 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:38:40 crc kubenswrapper[4901]: I0202 10:38:40.287159 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:38:40 crc kubenswrapper[4901]: I0202 10:38:40.294508 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 02 10:38:40 crc kubenswrapper[4901]: I0202 10:38:40.383287 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 02 10:38:40 crc kubenswrapper[4901]: I0202 10:38:40.591058 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 18:43:17.599857145 +0000 UTC
Feb 02 10:38:40 crc kubenswrapper[4901]: I0202 10:38:40.783329 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 02 10:38:40 crc kubenswrapper[4901]: I0202 10:38:40.783434 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 02 10:38:40 crc kubenswrapper[4901]: I0202 10:38:40.783469 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 02 10:38:40 crc kubenswrapper[4901]: I0202 10:38:40.785670 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:38:40 crc kubenswrapper[4901]: I0202 10:38:40.785742 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:38:40 crc kubenswrapper[4901]: I0202 10:38:40.785766 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:38:40 crc kubenswrapper[4901]: I0202 10:38:40.786602 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:38:40 crc kubenswrapper[4901]: I0202 10:38:40.786648 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:38:40 crc kubenswrapper[4901]: I0202 10:38:40.786662 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:38:40 crc kubenswrapper[4901]: I0202 10:38:40.786693 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:38:40 crc kubenswrapper[4901]: I0202 10:38:40.786694 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:38:40 crc kubenswrapper[4901]: I0202 10:38:40.787520 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:38:41 crc kubenswrapper[4901]: I0202 10:38:41.591244 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 02:11:15.551890057 +0000 UTC
Feb 02 10:38:41 crc kubenswrapper[4901]: I0202 10:38:41.713362 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 02 10:38:41 crc kubenswrapper[4901]: I0202 10:38:41.786732 4901 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 02 10:38:41 crc kubenswrapper[4901]: I0202 10:38:41.786834 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 02 10:38:41 crc kubenswrapper[4901]: I0202 10:38:41.788367 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:38:41 crc kubenswrapper[4901]: I0202 10:38:41.788436 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:38:41 crc kubenswrapper[4901]: I0202 10:38:41.788459 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:38:42 crc kubenswrapper[4901]: I0202 10:38:42.116741 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 02 10:38:42 crc kubenswrapper[4901]: I0202 10:38:42.116997 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 02 10:38:42 crc kubenswrapper[4901]: I0202 10:38:42.118968 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:38:42 crc kubenswrapper[4901]: I0202 10:38:42.119021 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:38:42 crc kubenswrapper[4901]: I0202 10:38:42.119037 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:38:42 crc kubenswrapper[4901]: I0202 10:38:42.277271 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 02 10:38:42 crc kubenswrapper[4901]: I0202 10:38:42.591897 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 10:08:28.885847816 +0000 UTC
Feb 02 10:38:42 crc kubenswrapper[4901]: I0202 10:38:42.790204 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 02 10:38:42 crc kubenswrapper[4901]: I0202 10:38:42.791653 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:38:42 crc kubenswrapper[4901]: I0202 10:38:42.791702 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:38:42 crc kubenswrapper[4901]: I0202 10:38:42.791715 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:38:43 crc kubenswrapper[4901]: I0202 10:38:43.592872 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 00:46:28.677111482 +0000 UTC
Feb 02 10:38:43 crc kubenswrapper[4901]: E0202 10:38:43.788462 4901 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 02 10:38:44 crc kubenswrapper[4901]: I0202 10:38:44.227254 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Feb 02 10:38:44 crc kubenswrapper[4901]: I0202 10:38:44.227644 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 02 10:38:44 crc kubenswrapper[4901]: I0202 10:38:44.229212 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:38:44 crc kubenswrapper[4901]: I0202 10:38:44.229265 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:38:44 crc kubenswrapper[4901]: I0202 10:38:44.229285 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:38:44 crc kubenswrapper[4901]: I0202 10:38:44.593695 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 16:51:49.876259646 +0000 UTC
Feb 02 10:38:44 crc kubenswrapper[4901]: I0202 10:38:44.713894 4901 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 02 10:38:44 crc kubenswrapper[4901]: I0202 10:38:44.714004 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 02 10:38:45 crc kubenswrapper[4901]: I0202 10:38:45.594632 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 12:53:47.952127949 +0000 UTC
Feb 02 10:38:46 crc kubenswrapper[4901]: I0202 10:38:46.595367 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 07:10:24.780690967 +0000 UTC
Feb 02 10:38:47 crc kubenswrapper[4901]: I0202 10:38:47.247855 4901 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Feb 02 10:38:47 crc kubenswrapper[4901]: I0202 10:38:47.247921 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Feb 02 10:38:47 crc kubenswrapper[4901]: I0202 10:38:47.579852 4901 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Feb 02 10:38:47 crc kubenswrapper[4901]: I0202 10:38:47.596236 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 20:42:58.028742058 +0000 UTC
Feb 02 10:38:48 crc kubenswrapper[4901]: I0202 10:38:48.054706 4901 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Feb 02 10:38:48 crc kubenswrapper[4901]: I0202 10:38:48.054780 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Feb 02 10:38:48 crc kubenswrapper[4901]: I0202 10:38:48.061680 4901 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403}
Feb 02 10:38:48 crc kubenswrapper[4901]: I0202 10:38:48.061745 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Feb 02 10:38:48 crc kubenswrapper[4901]: I0202 10:38:48.597417 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 05:10:16.125644463 +0000 UTC
Feb 02 10:38:48 crc kubenswrapper[4901]: I0202 10:38:48.787579 4901 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Feb 02 10:38:48 crc kubenswrapper[4901]: [+]log ok
Feb 02 10:38:48 crc kubenswrapper[4901]: [+]etcd ok
Feb 02 10:38:48 crc kubenswrapper[4901]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Feb 02 10:38:48 crc kubenswrapper[4901]: [+]poststarthook/openshift.io-api-request-count-filter ok
Feb 02 10:38:48 crc kubenswrapper[4901]: [+]poststarthook/openshift.io-startkubeinformers ok
Feb 02 10:38:48 crc kubenswrapper[4901]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok
Feb 02 10:38:48 crc kubenswrapper[4901]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok
Feb 02 10:38:48 crc kubenswrapper[4901]: [+]poststarthook/start-apiserver-admission-initializer ok
Feb 02 10:38:48 crc kubenswrapper[4901]: [+]poststarthook/generic-apiserver-start-informers ok
Feb 02 10:38:48 crc kubenswrapper[4901]: [+]poststarthook/priority-and-fairness-config-consumer ok
Feb 02 10:38:48 crc kubenswrapper[4901]: [+]poststarthook/priority-and-fairness-filter ok
Feb 02 10:38:48 crc kubenswrapper[4901]: [+]poststarthook/storage-object-count-tracker-hook ok
Feb 02 10:38:48 crc kubenswrapper[4901]: [+]poststarthook/start-apiextensions-informers ok
Feb 02 10:38:48 crc kubenswrapper[4901]: [+]poststarthook/start-apiextensions-controllers ok
Feb 02 10:38:48 crc kubenswrapper[4901]: [+]poststarthook/crd-informer-synced ok
Feb 02 10:38:48 crc kubenswrapper[4901]: [+]poststarthook/start-system-namespaces-controller ok
Feb 02 10:38:48 crc kubenswrapper[4901]: [+]poststarthook/start-cluster-authentication-info-controller ok
Feb 02 10:38:48 crc kubenswrapper[4901]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok
Feb 02 10:38:48 crc kubenswrapper[4901]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
Feb 02 10:38:48 crc kubenswrapper[4901]: [+]poststarthook/start-legacy-token-tracking-controller ok
Feb 02 10:38:48 crc kubenswrapper[4901]: [+]poststarthook/start-service-ip-repair-controllers ok
Feb 02 10:38:48 crc kubenswrapper[4901]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld
Feb 02 10:38:48 crc kubenswrapper[4901]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
Feb 02 10:38:48 crc kubenswrapper[4901]: [+]poststarthook/priority-and-fairness-config-producer ok
Feb 02 10:38:48 crc kubenswrapper[4901]: [+]poststarthook/bootstrap-controller ok
Feb 02 10:38:48 crc kubenswrapper[4901]: [+]poststarthook/aggregator-reload-proxy-client-cert ok
Feb 02 10:38:48 crc kubenswrapper[4901]: [+]poststarthook/start-kube-aggregator-informers ok
Feb 02 10:38:48 crc kubenswrapper[4901]: [+]poststarthook/apiservice-status-local-available-controller ok
Feb 02 10:38:48 crc kubenswrapper[4901]: [+]poststarthook/apiservice-status-remote-available-controller ok
Feb 02 10:38:48 crc kubenswrapper[4901]: [+]poststarthook/apiservice-registration-controller ok
Feb 02 10:38:48 crc kubenswrapper[4901]: [+]poststarthook/apiservice-wait-for-first-sync ok
Feb 02 10:38:48 crc kubenswrapper[4901]: [+]poststarthook/apiservice-discovery-controller ok
Feb 02 10:38:48 crc kubenswrapper[4901]: [+]poststarthook/kube-apiserver-autoregistration ok
Feb 02 10:38:48 crc kubenswrapper[4901]: [+]autoregister-completion ok
Feb 02 10:38:48 crc kubenswrapper[4901]: [+]poststarthook/apiservice-openapi-controller ok
Feb 02 10:38:48 crc kubenswrapper[4901]: [+]poststarthook/apiservice-openapiv3-controller ok
Feb 02 10:38:48 crc kubenswrapper[4901]: livez check failed
Feb 02 10:38:48 crc kubenswrapper[4901]: I0202 10:38:48.787636 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 02 10:38:49 crc kubenswrapper[4901]: I0202 10:38:49.427203 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Feb 02 10:38:49 crc kubenswrapper[4901]: I0202 10:38:49.427438 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 02 10:38:49 crc kubenswrapper[4901]: I0202 10:38:49.429078 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:38:49 crc kubenswrapper[4901]: I0202 10:38:49.429135 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:38:49 crc kubenswrapper[4901]: I0202 10:38:49.429152 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:38:49 crc kubenswrapper[4901]: I0202 10:38:49.463529 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Feb 02 10:38:49 crc kubenswrapper[4901]: I0202 10:38:49.597960 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 18:08:49.791297451 +0000 UTC
Feb 02 10:38:49 crc kubenswrapper[4901]: I0202 10:38:49.807529 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 02 10:38:49 crc kubenswrapper[4901]: I0202 10:38:49.808617 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:38:49 crc kubenswrapper[4901]: I0202 10:38:49.808644 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:38:49 crc kubenswrapper[4901]: I0202 10:38:49.808652 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:38:49 crc kubenswrapper[4901]: I0202 10:38:49.821107 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Feb 02 10:38:50 crc kubenswrapper[4901]: I0202 10:38:50.598498 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 14:50:58.535525808 +0000 UTC
Feb 02 10:38:50 crc kubenswrapper[4901]: I0202 10:38:50.809522 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 02 10:38:50 crc kubenswrapper[4901]: I0202 10:38:50.810656 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:38:50 crc kubenswrapper[4901]: I0202 10:38:50.810705 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:38:50 crc kubenswrapper[4901]: I0202 10:38:50.810714 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:38:51 crc kubenswrapper[4901]: I0202 10:38:51.598704 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 23:59:08.765072769 +0000 UTC
Feb 02 10:38:52 crc kubenswrapper[4901]: I0202 10:38:52.285212 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 02 10:38:52 crc kubenswrapper[4901]: I0202 10:38:52.285462 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 02 10:38:52 crc kubenswrapper[4901]: I0202 10:38:52.287243 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:38:52 crc kubenswrapper[4901]: I0202 10:38:52.287314 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:38:52 crc kubenswrapper[4901]: I0202 10:38:52.287333 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:38:52 crc kubenswrapper[4901]: I0202 10:38:52.599085 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 13:17:49.694824638 +0000 UTC
Feb 02 10:38:53 crc kubenswrapper[4901]: E0202 10:38:53.058410 4901 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s"
Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.060935 4901 trace.go:236] Trace[574225083]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Feb-2026 10:38:41.998) (total time: 11062ms):
Feb 02 10:38:53 crc kubenswrapper[4901]: Trace[574225083]: ---"Objects listed" error: 11062ms (10:38:53.060)
Feb 02 10:38:53 crc kubenswrapper[4901]: Trace[574225083]: [11.062400748s] [11.062400748s] END
Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.060952 4901 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.061943 4901 trace.go:236] Trace[856945099]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Feb-2026 10:38:40.864) (total time: 12197ms):
Feb 02 10:38:53 crc kubenswrapper[4901]: Trace[856945099]: ---"Objects listed" error: 12197ms (10:38:53.061)
Feb 02 10:38:53 crc kubenswrapper[4901]: Trace[856945099]: [12.197236428s] [12.197236428s] END
Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.061968 4901 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Feb 02 10:38:53 crc kubenswrapper[4901]: E0202 10:38:53.061924 4901 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.062142 4901 trace.go:236] Trace[132854091]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Feb-2026 10:38:42.863) (total time: 10198ms):
Feb 02 10:38:53 crc kubenswrapper[4901]: Trace[132854091]: ---"Objects listed" error: 10198ms (10:38:53.062)
Feb 02 10:38:53 crc kubenswrapper[4901]: Trace[132854091]: [10.19884578s] [10.19884578s] END
Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.062164 4901 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.063153 4901 trace.go:236] Trace[726306171]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Feb-2026 10:38:41.287) (total time: 11775ms):
Feb 02
10:38:53 crc kubenswrapper[4901]: Trace[726306171]: ---"Objects listed" error: 11775ms (10:38:53.063) Feb 02 10:38:53 crc kubenswrapper[4901]: Trace[726306171]: [11.775152148s] [11.775152148s] END Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.063177 4901 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.064767 4901 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.076616 4901 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.092583 4901 csr.go:261] certificate signing request csr-v6pbk is approved, waiting to be issued Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.103818 4901 csr.go:257] certificate signing request csr-v6pbk is issued Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.411116 4901 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 02 10:38:53 crc kubenswrapper[4901]: W0202 10:38:53.411432 4901 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Node ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Feb 02 10:38:53 crc kubenswrapper[4901]: W0202 10:38:53.411465 4901 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.RuntimeClass ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Feb 02 10:38:53 crc kubenswrapper[4901]: W0202 10:38:53.411505 4901 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.CSIDriver ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Feb 02 10:38:53 crc kubenswrapper[4901]: W0202 10:38:53.411530 4901 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Service ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Feb 02 10:38:53 crc kubenswrapper[4901]: E0202 10:38:53.411432 4901 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/events\": read tcp 38.102.83.227:56608->38.102.83.227:6443: use of closed network connection" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189067bcdc76d76c openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-02 10:38:34.40689342 +0000 UTC m=+1.425233526,LastTimestamp:2026-02-02 10:38:34.40689342 +0000 UTC m=+1.425233526,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.577140 4901 apiserver.go:52] "Watching apiserver" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.584913 4901 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.585328 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.586108 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.586115 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.586216 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:38:53 crc kubenswrapper[4901]: E0202 10:38:53.586322 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:38:53 crc kubenswrapper[4901]: E0202 10:38:53.586397 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.586512 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.586611 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.586575 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:38:53 crc kubenswrapper[4901]: E0202 10:38:53.586870 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.589403 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.589541 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.589750 4901 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.589799 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.589888 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.589981 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.590035 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.590072 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.590297 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.590407 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.599585 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 00:14:12.127127027 +0000 UTC Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.620925 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.639899 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.652529 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.667180 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.669454 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.669491 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.669519 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.669537 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.669558 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.669605 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.669627 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.669650 4901 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.669678 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.669703 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.669724 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.669750 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.669777 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.669802 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.669829 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.669853 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.669915 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 02 
10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.669939 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.669964 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.669985 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670011 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670035 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670055 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670080 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670105 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670124 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670145 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 02 10:38:53 crc 
kubenswrapper[4901]: I0202 10:38:53.670166 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670187 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670219 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670240 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670266 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670290 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670314 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670341 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670396 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670428 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 02 10:38:53 crc 
kubenswrapper[4901]: I0202 10:38:53.670453 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670478 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670506 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670532 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670555 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670599 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670622 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670648 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670675 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670699 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670722 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670743 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670766 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670790 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670820 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670845 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670895 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670923 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670950 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670979 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.671004 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.671029 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.671086 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.671109 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.671132 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.671155 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.671181 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.671204 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.671227 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.671246 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.671268 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.671294 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.671317 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.671372 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.671400 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.671424 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.671446 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.671468 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.671492 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.671512 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.671531 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.671554 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.671627 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.671663 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.669909 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.672001 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.672012 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.672063 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.671686 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.672102 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.672301 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.672556 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.672622 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.669933 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.669974 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670168 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670223 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670242 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670279 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670397 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670489 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670488 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670558 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670607 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670703 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670745 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670915 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.672840 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.670959 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.671307 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.671316 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.672882 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.671459 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.671526 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.671739 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.671961 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.671986 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.671981 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.673066 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.673173 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.673158 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.673212 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.673240 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.673274 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.673354 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.673695 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.673738 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.673828 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.673867 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.673913 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.673948 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.673973 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.674000 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.674027 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.674120 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.674120 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.674153 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.674153 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.674180 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.674205 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.674232 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.674259 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.674268 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.674283 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.674311 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.674339 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.674366 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.674390 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.674416 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.674441 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.674464 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.674487 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.674513 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: 
\"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.674539 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.674581 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.674608 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.674631 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.674652 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.674674 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.674553 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.674698 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.674822 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.674874 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.674902 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.674918 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.674962 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.674998 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.675032 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.675088 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.675338 4901 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.675598 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.675637 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.675605 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.675691 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.675864 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.675884 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.676081 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.676095 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.676118 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.676155 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.676973 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.677351 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.677621 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.677723 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.677801 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.677860 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.678087 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.678190 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.678652 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.678678 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.678835 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.678933 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.678955 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.679116 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.679285 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.679415 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.679758 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.679778 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.681442 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). 
InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: E0202 10:38:53.681557 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:38:54.181535613 +0000 UTC m=+21.199875709 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.681946 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.682044 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.682263 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.682454 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.684122 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.684233 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.684246 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.693167 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.693366 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.693474 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.684332 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.684587 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.685503 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.685764 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.691984 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.693650 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.695214 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.695436 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.696063 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.696168 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.696701 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.696803 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.696861 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.696904 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.696940 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.696980 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.697021 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.697055 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.697100 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: 
\"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.697137 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.697199 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.697256 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.697322 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.697335 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.697383 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.698308 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.698384 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.698449 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.698459 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.698497 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.698592 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.698680 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.698730 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.698772 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.698811 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.698854 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.698901 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.698941 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.698981 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.699056 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.701045 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.701071 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.701396 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.701159 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.701299 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.702281 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.699082 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.701504 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.702180 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.699108 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.702328 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.702355 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.702400 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.702438 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.702472 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.702499 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.702529 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.702554 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.702586 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.702599 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.702626 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.702650 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.702674 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.702703 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.702705 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.702732 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.702755 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.702779 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.702804 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.702833 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.702856 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.702880 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.702907 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.702932 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.702957 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.702985 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.703009 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.703057 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.703083 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.703104 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.703130 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.703152 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.703737 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.703768 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.702756 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.702998 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.703011 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.703045 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.703087 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.703199 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.703693 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.704348 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.705260 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.705299 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.705323 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.705340 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.705394 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.705435 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.705727 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.705750 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.705940 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.705861 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.706233 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.706242 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.706494 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.706501 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.706490 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.706604 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.706539 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.706715 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.706778 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.706833 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.706885 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.706935 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.706984 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.707049 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.707060 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.707116 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.707133 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.707154 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.707207 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.707222 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.707425 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.707467 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.707778 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.707865 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.707975 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.708288 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.708400 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.708765 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.709166 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.709615 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). 
InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.710026 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.710212 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.710391 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.710517 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.710987 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.711308 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.711407 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.712168 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.712377 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.712587 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.712754 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.712891 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.712938 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.712952 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.712978 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.713356 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.713491 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.713916 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.714437 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.714435 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.714765 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.714823 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.715004 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.715693 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.715857 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.716045 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.716398 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.716617 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.716670 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.716695 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.717086 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.717249 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.717415 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.717549 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.717549 4901 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.717677 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.717816 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.718127 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.718468 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.718760 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.718963 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.725213 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.725259 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.725291 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.725344 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.725369 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.725466 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.725502 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.725508 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.725779 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.725814 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.725857 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.725875 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.725895 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.725945 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.725958 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 02 10:38:53 crc kubenswrapper[4901]: E0202 10:38:53.726167 4901 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.726201 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.726223 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.726243 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 02 10:38:53 crc kubenswrapper[4901]: E0202 10:38:53.726555 4901 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.726643 4901 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 02 10:38:53 crc kubenswrapper[4901]: E0202 10:38:53.726774 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:38:54.226635557 +0000 UTC m=+21.244975813 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.727214 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.730172 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:38:53 crc kubenswrapper[4901]: E0202 10:38:53.735285 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:38:54.235259812 +0000 UTC m=+21.253599928 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.737724 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738062 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738076 4901 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738088 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738098 4901 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738106 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738117 4901 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738125 4901 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738134 4901 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738143 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738153 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738162 4901 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738172 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738181 4901 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") 
on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738191 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738200 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738209 4901 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738218 4901 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738227 4901 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738237 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738247 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738256 4901 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738264 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738273 4901 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738281 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738290 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738299 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 
10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738308 4901 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738317 4901 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738325 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738334 4901 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738342 4901 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738352 4901 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738362 4901 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738376 4901 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738386 4901 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738396 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738406 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738414 4901 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738423 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" 
DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738432 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738440 4901 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738448 4901 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738457 4901 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738466 4901 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738475 4901 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738483 4901 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738491 4901 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738502 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738510 4901 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738520 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738530 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738538 4901 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc 
kubenswrapper[4901]: I0202 10:38:53.738546 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738554 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738577 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738586 4901 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738597 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738605 4901 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738614 4901 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738623 4901 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738633 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738643 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738652 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738661 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738669 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 02 
10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738677 4901 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738686 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738695 4901 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738705 4901 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738713 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738724 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738735 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738745 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738755 4901 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738765 4901 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738776 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738786 4901 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738799 4901 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc 
kubenswrapper[4901]: I0202 10:38:53.738809 4901 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738827 4901 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738838 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738848 4901 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738859 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738867 4901 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738876 4901 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738884 4901 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738896 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738907 4901 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738917 4901 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738927 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738937 4901 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 
10:38:53.738950 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738960 4901 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738970 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738978 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738987 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.738998 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.739028 4901 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.739038 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.739050 4901 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.739061 4901 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.739073 4901 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.739085 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.739096 4901 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 
crc kubenswrapper[4901]: I0202 10:38:53.739104 4901 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.739112 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.739120 4901 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.739129 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.739137 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.739149 4901 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.739160 4901 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.739170 4901 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.739182 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.739193 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.739205 4901 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.739218 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.739230 4901 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.739241 4901 reconciler_common.go:293] "Volume 
detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.739253 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.739266 4901 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.739279 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.739291 4901 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.739310 4901 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.739321 4901 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.739334 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.739344 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.739355 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.739366 4901 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.739376 4901 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.739387 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.739398 4901 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.739411 4901 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.740028 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.743091 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.743654 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.744437 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.745043 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.745193 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.745195 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.745407 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.747792 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.748459 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.749629 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.750079 
4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.750932 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.751131 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.751995 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.752880 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.753699 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.754977 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.756455 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.756995 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.758438 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.759089 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.760416 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.760963 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.762663 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.763186 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.764013 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 02 10:38:53 crc kubenswrapper[4901]: E0202 10:38:53.764176 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 10:38:53 crc kubenswrapper[4901]: E0202 10:38:53.764200 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:38:53 crc kubenswrapper[4901]: E0202 10:38:53.764233 4901 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:38:53 crc kubenswrapper[4901]: E0202 10:38:53.764330 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-02 10:38:54.264289983 +0000 UTC m=+21.282630079 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.764444 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.764908 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.765446 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.765536 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.766614 4901 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.766663 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.768089 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.768329 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.773371 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:38:53 crc kubenswrapper[4901]: E0202 10:38:53.768463 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 10:38:53 crc kubenswrapper[4901]: E0202 10:38:53.773545 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:38:53 crc kubenswrapper[4901]: E0202 10:38:53.773588 4901 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.769965 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: 
"09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.769970 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.770270 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.770283 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.770464 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.770859 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: E0202 10:38:53.773663 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-02 10:38:54.273640845 +0000 UTC m=+21.291980941 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.774668 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.774977 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.775590 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.780449 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.821289 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.821441 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.821493 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.822386 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.827327 4901 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595" exitCode=255 Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.827520 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595"} Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.836915 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.840873 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.841023 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.841099 4901 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.841118 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.841131 4901 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.841144 4901 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.841183 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.841264 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.841277 4901 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.841287 4901 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.841296 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.841307 4901 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.841336 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.841346 4901 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.841355 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.841365 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.841374 4901 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.841384 4901 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.841412 4901 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.841424 4901 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.841433 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc 
kubenswrapper[4901]: I0202 10:38:53.841443 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.841452 4901 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.841654 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.841664 4901 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.841797 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.841870 4901 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.841886 4901 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.841903 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.841943 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.841956 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.842077 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.842096 4901 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.842112 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 
10:38:53.842126 4901 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.842142 4901 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.842155 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.842169 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.842180 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.842190 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.842200 4901 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.842214 4901 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.842226 4901 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.842238 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.842249 4901 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.842260 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.842272 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.842283 4901 
reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.842275 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.842320 4901 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.842338 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.842354 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.842368 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.842381 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.842393 4901 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.842405 4901 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.842418 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.842430 4901 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.842442 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.842455 4901 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on 
node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.842470 4901 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.842486 4901 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.842500 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.842512 4901 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.842524 4901 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.842758 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.857250 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.892940 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.900029 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.905725 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.914764 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.925733 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:38:53 crc kubenswrapper[4901]: W0202 10:38:53.928333 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-0b9fe878459075c6fda5fbd3fb5bca71c8ae8c4694b6e1524cd1c76967849724 WatchSource:0}: Error finding container 0b9fe878459075c6fda5fbd3fb5bca71c8ae8c4694b6e1524cd1c76967849724: Status 404 returned error can't find the container with id 0b9fe878459075c6fda5fbd3fb5bca71c8ae8c4694b6e1524cd1c76967849724 Feb 02 10:38:53 crc kubenswrapper[4901]: W0202 10:38:53.929463 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-5a059ded921536a0d8896b83b1cee1e112d39bce7abb20ada12b7de051e3989a WatchSource:0}: Error finding container 5a059ded921536a0d8896b83b1cee1e112d39bce7abb20ada12b7de051e3989a: Status 404 returned error can't find the container with id 5a059ded921536a0d8896b83b1cee1e112d39bce7abb20ada12b7de051e3989a Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.948604 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.962742 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:38:53 crc kubenswrapper[4901]: I0202 10:38:53.982830 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:38:54 crc kubenswrapper[4901]: I0202 10:38:54.003527 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:38:54 crc kubenswrapper[4901]: I0202 10:38:54.017733 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a1308eb-aead-41f7-b2b1-c92d80970304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resourc
e-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:38:54 crc kubenswrapper[4901]: I0202 10:38:54.036192 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edf90eee-a04d-48ad-957f-8710016e2388\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3416be5e186cc0595234f36c8a59e1c2edadb59a8abe4121b274ea7dea70815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac57745af3759e45361b7c59e94701e4b74dcbfa259eb2e8690578d5b2fa601\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6c9912bb4f435743eaadda76c0ef0fbc6492b740836407b274a89801ec5082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:38:54 crc kubenswrapper[4901]: I0202 10:38:54.051604 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:38:54 crc kubenswrapper[4901]: I0202 10:38:54.105335 4901 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-02 10:33:53 +0000 UTC, rotation deadline is 2026-10-25 03:27:01.987493571 +0000 UTC Feb 02 10:38:54 crc kubenswrapper[4901]: I0202 10:38:54.105397 4901 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6352h48m7.882098827s for next certificate rotation Feb 02 10:38:54 crc kubenswrapper[4901]: I0202 10:38:54.246109 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:38:54 crc kubenswrapper[4901]: I0202 10:38:54.246218 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:38:54 crc kubenswrapper[4901]: I0202 10:38:54.246264 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:38:54 crc kubenswrapper[4901]: E0202 10:38:54.246340 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:38:55.24630873 +0000 UTC m=+22.264648866 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:38:54 crc kubenswrapper[4901]: E0202 10:38:54.246351 4901 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:38:54 crc kubenswrapper[4901]: E0202 10:38:54.246442 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:38:55.246404132 +0000 UTC m=+22.264744348 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:38:54 crc kubenswrapper[4901]: E0202 10:38:54.246544 4901 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:38:54 crc kubenswrapper[4901]: E0202 10:38:54.246716 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:38:55.246684839 +0000 UTC m=+22.265024945 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:38:54 crc kubenswrapper[4901]: I0202 10:38:54.347861 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:38:54 crc kubenswrapper[4901]: I0202 10:38:54.347910 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:38:54 crc kubenswrapper[4901]: E0202 10:38:54.348103 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 10:38:54 crc kubenswrapper[4901]: E0202 10:38:54.348131 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:38:54 crc kubenswrapper[4901]: E0202 10:38:54.348149 4901 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:38:54 crc kubenswrapper[4901]: E0202 10:38:54.348144 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 10:38:54 crc kubenswrapper[4901]: E0202 10:38:54.348196 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:38:54 crc kubenswrapper[4901]: E0202 10:38:54.348210 4901 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:38:54 crc kubenswrapper[4901]: E0202 10:38:54.348236 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-02 10:38:55.348215516 +0000 UTC m=+22.366555612 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:38:54 crc kubenswrapper[4901]: E0202 10:38:54.348279 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-02 10:38:55.348254937 +0000 UTC m=+22.366595033 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:38:54 crc kubenswrapper[4901]: I0202 10:38:54.600016 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 05:15:50.624793716 +0000 UTC Feb 02 10:38:54 crc kubenswrapper[4901]: I0202 10:38:54.676680 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:38:54 crc kubenswrapper[4901]: E0202 10:38:54.676812 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:38:54 crc kubenswrapper[4901]: I0202 10:38:54.831964 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"5c224c1fffe2b027a09fb4cc3a832cf7f1131b8159636dbe8823dfec7816c271"} Feb 02 10:38:54 crc kubenswrapper[4901]: I0202 10:38:54.833780 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"b226219c2a55a743db37b0dd98c110a54a0d2a1b2602aa122c7d9418198fb9bf"} Feb 02 10:38:54 crc kubenswrapper[4901]: I0202 10:38:54.833873 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"0b9fe878459075c6fda5fbd3fb5bca71c8ae8c4694b6e1524cd1c76967849724"} Feb 02 10:38:54 crc kubenswrapper[4901]: I0202 10:38:54.836236 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"eb836b18e6cb07844e8dfb32c28798089fc4fbc5b7f89ac3c1d595c8dd6de6a8"} Feb 02 10:38:54 crc kubenswrapper[4901]: I0202 10:38:54.836296 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"aa3b8a5db87f56250b50a5c703beff1e6b11e10f1e11271ea4c7363b262d52d2"} Feb 02 10:38:54 crc kubenswrapper[4901]: I0202 10:38:54.836314 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"5a059ded921536a0d8896b83b1cee1e112d39bce7abb20ada12b7de051e3989a"} Feb 02 10:38:54 crc kubenswrapper[4901]: E0202 10:38:54.845048 4901 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:38:54 crc kubenswrapper[4901]: I0202 10:38:54.845452 4901 scope.go:117] "RemoveContainer" containerID="ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595" Feb 02 10:38:54 crc kubenswrapper[4901]: I0202 10:38:54.849388 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-mcs8s"] Feb 02 10:38:54 crc kubenswrapper[4901]: I0202 10:38:54.849826 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-mcs8s" Feb 02 10:38:54 crc kubenswrapper[4901]: I0202 10:38:54.859721 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 02 10:38:54 crc kubenswrapper[4901]: I0202 10:38:54.868197 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 02 10:38:54 crc kubenswrapper[4901]: I0202 10:38:54.868308 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 02 10:38:54 crc kubenswrapper[4901]: I0202 10:38:54.869178 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 02 10:38:54 crc kubenswrapper[4901]: I0202 10:38:54.881908 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:54Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:54 crc kubenswrapper[4901]: I0202 10:38:54.907988 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:54Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:54 crc kubenswrapper[4901]: I0202 10:38:54.934423 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:54Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:54 crc kubenswrapper[4901]: I0202 10:38:54.952231 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8281e736-98f0-4282-b362-b55fd3d2810f-host\") pod \"node-ca-mcs8s\" (UID: \"8281e736-98f0-4282-b362-b55fd3d2810f\") " pod="openshift-image-registry/node-ca-mcs8s" Feb 02 10:38:54 crc kubenswrapper[4901]: I0202 10:38:54.952283 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8281e736-98f0-4282-b362-b55fd3d2810f-serviceca\") pod \"node-ca-mcs8s\" (UID: \"8281e736-98f0-4282-b362-b55fd3d2810f\") " pod="openshift-image-registry/node-ca-mcs8s" Feb 02 10:38:54 crc kubenswrapper[4901]: I0202 10:38:54.952328 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pwxl\" (UniqueName: \"kubernetes.io/projected/8281e736-98f0-4282-b362-b55fd3d2810f-kube-api-access-9pwxl\") pod \"node-ca-mcs8s\" (UID: \"8281e736-98f0-4282-b362-b55fd3d2810f\") " pod="openshift-image-registry/node-ca-mcs8s" Feb 02 10:38:54 crc kubenswrapper[4901]: I0202 10:38:54.965521 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b226219c2a55a743db37b0dd98c110a54a0d2a1b2602aa122c7d9418198fb9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:54Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:54 crc kubenswrapper[4901]: I0202 10:38:54.998599 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a1308eb-aead-41f7-b2b1-c92d80970304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resourc
e-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:54Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.014868 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edf90eee-a04d-48ad-957f-8710016e2388\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3416be5e186cc0595234f36c8a59e1c2edadb59a8abe4121b274ea7dea70815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac57745af3759e45361b7c59e94701e4b74dcbfa259eb2e8690578d5b2fa601\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6c9912bb4f435743eaadda76c0ef0fbc6492b740836407b274a89801ec5082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.047051 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.053182 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8281e736-98f0-4282-b362-b55fd3d2810f-host\") pod \"node-ca-mcs8s\" (UID: \"8281e736-98f0-4282-b362-b55fd3d2810f\") " pod="openshift-image-registry/node-ca-mcs8s" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.053246 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8281e736-98f0-4282-b362-b55fd3d2810f-serviceca\") pod \"node-ca-mcs8s\" (UID: \"8281e736-98f0-4282-b362-b55fd3d2810f\") " pod="openshift-image-registry/node-ca-mcs8s" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.053297 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pwxl\" (UniqueName: \"kubernetes.io/projected/8281e736-98f0-4282-b362-b55fd3d2810f-kube-api-access-9pwxl\") pod \"node-ca-mcs8s\" (UID: \"8281e736-98f0-4282-b362-b55fd3d2810f\") " pod="openshift-image-registry/node-ca-mcs8s" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.053319 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8281e736-98f0-4282-b362-b55fd3d2810f-host\") pod \"node-ca-mcs8s\" (UID: \"8281e736-98f0-4282-b362-b55fd3d2810f\") " 
pod="openshift-image-registry/node-ca-mcs8s" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.054423 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8281e736-98f0-4282-b362-b55fd3d2810f-serviceca\") pod \"node-ca-mcs8s\" (UID: \"8281e736-98f0-4282-b362-b55fd3d2810f\") " pod="openshift-image-registry/node-ca-mcs8s" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.064182 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.078856 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pwxl\" (UniqueName: \"kubernetes.io/projected/8281e736-98f0-4282-b362-b55fd3d2810f-kube-api-access-9pwxl\") pod \"node-ca-mcs8s\" (UID: \"8281e736-98f0-4282-b362-b55fd3d2810f\") " pod="openshift-image-registry/node-ca-mcs8s" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.091752 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a1308eb-aead-41f7-b2b1-c92d80970304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02
T10:38:53Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0202 10:38:47.369602 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:38:47.370901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-430650264/tls.crt::/tmp/serving-cert-430650264/tls.key\\\\\\\"\\\\nI0202 10:38:53.040764 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:38:53.043210 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:38:53.043233 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:38:53.043255 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:38:53.043260 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:38:53.047991 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0202 10:38:53.048006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 10:38:53.048019 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:38:53.048038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:38:53.048042 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:38:53.048046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 10:38:53.050356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.123427 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edf90eee-a04d-48ad-957f-8710016e2388\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3416be5e186cc0595234f36c8a59e1c2edadb59a8abe4121b274ea7dea70815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac57745af3759e45361b7c59e94701e4b74dcbfa259eb2e8690578d5b2fa601\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6c9912bb4f435743eaadda76c0ef0fbc6492b740836407b274a89801ec5082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.146632 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb836b18e6cb07844e8dfb32c28798089fc4fbc5b7f89ac3c1d595c8dd6de6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa3b8a5db87f56250b50a5c703beff1e6b11e10f1e11271ea4c7363b262d52d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.174620 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-mcs8s" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.179680 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: W0202 10:38:55.185280 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8281e736_98f0_4282_b362_b55fd3d2810f.slice/crio-6a274c96c45dce380c2466dee8d3aac71690522e7d2e7164eb53d33acc104ca1 WatchSource:0}: Error finding container 6a274c96c45dce380c2466dee8d3aac71690522e7d2e7164eb53d33acc104ca1: Status 404 returned error can't find the container with id 6a274c96c45dce380c2466dee8d3aac71690522e7d2e7164eb53d33acc104ca1 Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.205527 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mcs8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8281e736-98f0-4282-b362-b55fd3d2810f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pwxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mcs8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.223908 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.240494 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.255434 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.255947 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.256008 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:38:55 crc kubenswrapper[4901]: E0202 10:38:55.256068 4901 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:38:55 crc kubenswrapper[4901]: E0202 10:38:55.256173 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:38:57.256093182 +0000 UTC m=+24.274433288 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:38:55 crc kubenswrapper[4901]: E0202 10:38:55.256214 4901 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:38:55 crc kubenswrapper[4901]: E0202 10:38:55.256240 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:38:57.256230835 +0000 UTC m=+24.274570931 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:38:55 crc kubenswrapper[4901]: E0202 10:38:55.256302 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:38:57.256275856 +0000 UTC m=+24.274616162 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.261899 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.285026 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b226219c2a55a743db37b0dd98c110a54a0d2a1b2602aa122c7d9418198fb9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.303456 4901 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-machine-config-operator/machine-config-daemon-f29d8"] Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.304322 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-5xj56"] Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.304500 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.304663 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5xj56" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.306880 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.307113 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.307149 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.307686 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.307802 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.308000 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.308072 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.308968 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.324122 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a1308eb-aead-41f7-b2b1-c92d80970304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02
T10:38:53Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0202 10:38:47.369602 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:38:47.370901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-430650264/tls.crt::/tmp/serving-cert-430650264/tls.key\\\\\\\"\\\\nI0202 10:38:53.040764 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:38:53.043210 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:38:53.043233 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:38:53.043255 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:38:53.043260 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:38:53.047991 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0202 10:38:53.048006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 10:38:53.048019 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:38:53.048038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:38:53.048042 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:38:53.048046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 10:38:53.050356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.344961 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edf90eee-a04d-48ad-957f-8710016e2388\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3416be5e186cc0595234f36c8a59e1c2edadb59a8abe4121b274ea7dea70815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac57745af3759e45361b7c59e94701e4b74dcbfa259eb2e8690578d5b2fa601\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6c9912bb4f435743eaadda76c0ef0fbc6492b740836407b274a89801ec5082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.356982 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.357032 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/756c113d-5d5e-424e-bdf5-494b7774def6-mcd-auth-proxy-config\") pod \"machine-config-daemon-f29d8\" (UID: \"756c113d-5d5e-424e-bdf5-494b7774def6\") " pod="openshift-machine-config-operator/machine-config-daemon-f29d8" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.357062 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e34f1db7-7f2a-4252-af0b-49fa172495f9-hosts-file\") pod \"node-resolver-5xj56\" (UID: \"e34f1db7-7f2a-4252-af0b-49fa172495f9\") " pod="openshift-dns/node-resolver-5xj56" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.357081 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/756c113d-5d5e-424e-bdf5-494b7774def6-rootfs\") pod \"machine-config-daemon-f29d8\" (UID: \"756c113d-5d5e-424e-bdf5-494b7774def6\") " pod="openshift-machine-config-operator/machine-config-daemon-f29d8" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.357096 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/756c113d-5d5e-424e-bdf5-494b7774def6-proxy-tls\") pod \"machine-config-daemon-f29d8\" (UID: \"756c113d-5d5e-424e-bdf5-494b7774def6\") " pod="openshift-machine-config-operator/machine-config-daemon-f29d8" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.357122 4901 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.357139 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twpmm\" (UniqueName: \"kubernetes.io/projected/e34f1db7-7f2a-4252-af0b-49fa172495f9-kube-api-access-twpmm\") pod \"node-resolver-5xj56\" (UID: \"e34f1db7-7f2a-4252-af0b-49fa172495f9\") " pod="openshift-dns/node-resolver-5xj56" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.357154 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jllc\" (UniqueName: \"kubernetes.io/projected/756c113d-5d5e-424e-bdf5-494b7774def6-kube-api-access-8jllc\") pod \"machine-config-daemon-f29d8\" (UID: \"756c113d-5d5e-424e-bdf5-494b7774def6\") " pod="openshift-machine-config-operator/machine-config-daemon-f29d8" Feb 02 10:38:55 crc kubenswrapper[4901]: E0202 10:38:55.357263 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 10:38:55 crc kubenswrapper[4901]: E0202 10:38:55.357278 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:38:55 crc kubenswrapper[4901]: E0202 10:38:55.357288 4901 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:38:55 crc kubenswrapper[4901]: E0202 10:38:55.357320 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-02 10:38:57.357309823 +0000 UTC m=+24.375649919 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:38:55 crc kubenswrapper[4901]: E0202 10:38:55.357400 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 10:38:55 crc kubenswrapper[4901]: E0202 10:38:55.357409 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:38:55 crc kubenswrapper[4901]: E0202 10:38:55.357416 4901 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:38:55 crc kubenswrapper[4901]: E0202 10:38:55.357434 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-02 10:38:57.357427966 +0000 UTC m=+24.375768062 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.360504 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mcs8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8281e736-98f0-4282-b362-b55fd3d2810f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pwxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mcs8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.385353 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb836b18e6cb07844e8dfb32c28798089fc4fbc5b7f89ac3c1d595c8dd6de6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa3b8a5db87f56250b50a5c703beff1e6b11e10f1e11271ea4c7363b262d52d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc
32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.405012 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.419227 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"756c113d-5d5e-424e-bdf5-494b7774def6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f29d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.431405 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e34f1db7-7f2a-4252-af0b-49fa172495f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.446471 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.457769 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/756c113d-5d5e-424e-bdf5-494b7774def6-mcd-auth-proxy-config\") pod \"machine-config-daemon-f29d8\" (UID: \"756c113d-5d5e-424e-bdf5-494b7774def6\") " pod="openshift-machine-config-operator/machine-config-daemon-f29d8" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.458153 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e34f1db7-7f2a-4252-af0b-49fa172495f9-hosts-file\") pod \"node-resolver-5xj56\" (UID: \"e34f1db7-7f2a-4252-af0b-49fa172495f9\") " pod="openshift-dns/node-resolver-5xj56" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.458344 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/756c113d-5d5e-424e-bdf5-494b7774def6-rootfs\") pod \"machine-config-daemon-f29d8\" (UID: \"756c113d-5d5e-424e-bdf5-494b7774def6\") " pod="openshift-machine-config-operator/machine-config-daemon-f29d8" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.458467 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/756c113d-5d5e-424e-bdf5-494b7774def6-proxy-tls\") pod \"machine-config-daemon-f29d8\" (UID: \"756c113d-5d5e-424e-bdf5-494b7774def6\") " pod="openshift-machine-config-operator/machine-config-daemon-f29d8" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.458641 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twpmm\" (UniqueName: \"kubernetes.io/projected/e34f1db7-7f2a-4252-af0b-49fa172495f9-kube-api-access-twpmm\") pod \"node-resolver-5xj56\" (UID: \"e34f1db7-7f2a-4252-af0b-49fa172495f9\") " pod="openshift-dns/node-resolver-5xj56" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.458476 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/756c113d-5d5e-424e-bdf5-494b7774def6-rootfs\") pod \"machine-config-daemon-f29d8\" (UID: \"756c113d-5d5e-424e-bdf5-494b7774def6\") " pod="openshift-machine-config-operator/machine-config-daemon-f29d8" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.458386 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e34f1db7-7f2a-4252-af0b-49fa172495f9-hosts-file\") pod 
\"node-resolver-5xj56\" (UID: \"e34f1db7-7f2a-4252-af0b-49fa172495f9\") " pod="openshift-dns/node-resolver-5xj56" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.458675 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/756c113d-5d5e-424e-bdf5-494b7774def6-mcd-auth-proxy-config\") pod \"machine-config-daemon-f29d8\" (UID: \"756c113d-5d5e-424e-bdf5-494b7774def6\") " pod="openshift-machine-config-operator/machine-config-daemon-f29d8" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.458763 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jllc\" (UniqueName: \"kubernetes.io/projected/756c113d-5d5e-424e-bdf5-494b7774def6-kube-api-access-8jllc\") pod \"machine-config-daemon-f29d8\" (UID: \"756c113d-5d5e-424e-bdf5-494b7774def6\") " pod="openshift-machine-config-operator/machine-config-daemon-f29d8" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.462429 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/756c113d-5d5e-424e-bdf5-494b7774def6-proxy-tls\") pod \"machine-config-daemon-f29d8\" (UID: \"756c113d-5d5e-424e-bdf5-494b7774def6\") " pod="openshift-machine-config-operator/machine-config-daemon-f29d8" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.465400 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.482275 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twpmm\" (UniqueName: \"kubernetes.io/projected/e34f1db7-7f2a-4252-af0b-49fa172495f9-kube-api-access-twpmm\") pod \"node-resolver-5xj56\" (UID: \"e34f1db7-7f2a-4252-af0b-49fa172495f9\") " pod="openshift-dns/node-resolver-5xj56" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.483090 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jllc\" (UniqueName: \"kubernetes.io/projected/756c113d-5d5e-424e-bdf5-494b7774def6-kube-api-access-8jllc\") pod \"machine-config-daemon-f29d8\" (UID: \"756c113d-5d5e-424e-bdf5-494b7774def6\") " pod="openshift-machine-config-operator/machine-config-daemon-f29d8" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.484280 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.499475 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b226219c2a55a743db37b0dd98c110a54a0d2a1b2602aa122c7d9418198fb9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.513044 4901 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb836b18e6cb07844e8dfb32c28798089fc4fbc5b7f89ac3c1d595c8dd6de6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa3b8a5db87f56250b50a5c703beff1e6b11e10f1e11271ea4c7363b262d52d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.526426 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.540793 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.561556 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.575302 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.595850 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b226219c2a55a743db37b0dd98c110a54a0d2a1b2602aa122c7d9418198fb9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.600162 4901 certificate_manager.go:356] 
kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 05:36:03.129989083 +0000 UTC Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.612878 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"756c113d-5d5e-424e-bdf5-494b7774def6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f29d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.620600 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.627870 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5xj56" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.632834 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e34f1db7-7f2a-4252-af0b-49fa172495f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.649495 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a1308eb-aead-41f7-b2b1-c92d80970304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0202 10:38:47.369602 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:38:47.370901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-430650264/tls.crt::/tmp/serving-cert-430650264/tls.key\\\\\\\"\\\\nI0202 10:38:53.040764 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:38:53.043210 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:38:53.043233 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:38:53.043255 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:38:53.043260 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:38:53.047991 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0202 10:38:53.048006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 10:38:53.048019 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:38:53.048038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:38:53.048042 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:38:53.048046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 10:38:53.050356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.677300 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:38:55 crc kubenswrapper[4901]: E0202 10:38:55.677431 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.677833 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:38:55 crc kubenswrapper[4901]: E0202 10:38:55.677895 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.682474 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.683518 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.684662 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.685194 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.685770 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.687038 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.687622 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edf90eee-a04d-48ad-957f-8710016e2388\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3416be5e186cc0595234f36c8a59e1c2edadb59a8abe4121b274ea7dea70815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac57745af3759e45361b7c59e94701e4b74dcbfa259eb2e8690578d5b2fa601\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6c9912bb4f435743eaadda76c0ef0fbc6492b740836407b274a89801ec5082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.687940 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.691087 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.691852 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.692753 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.693282 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.694353 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.695095 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.696247 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.698859 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.699451 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.700614 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.701078 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.702649 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.703346 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.703986 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.705401 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.706209 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.707128 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.707643 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.708132 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.712746 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mcs8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8281e736-98f0-4282-b362-b55fd3d2810f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pwxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mcs8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.713256 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.713922 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.718961 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.720439 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 
10:38:55.720937 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-flw48"] Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.721517 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-5q92h"] Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.721740 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.722828 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-flw48" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.728993 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.729172 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.729265 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.729354 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.729463 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.729543 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.742592 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.761690 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-cnibin\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.761736 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-os-release\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.761754 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-cni-binary-copy\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.761775 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-host-var-lib-cni-bin\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.761794 4901 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-hostroot\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.761810 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-host-run-k8s-cni-cncf-io\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.761859 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q2pl\" (UniqueName: \"kubernetes.io/projected/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-kube-api-access-7q2pl\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.761907 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/afbdc53a-f67a-44f7-a5bb-f446fb3706fb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-flw48\" (UID: \"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\") " pod="openshift-multus/multus-additional-cni-plugins-flw48" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.761932 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-multus-cni-dir\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.761951 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-host-var-lib-cni-multus\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.761965 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-host-var-lib-kubelet\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.762012 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/afbdc53a-f67a-44f7-a5bb-f446fb3706fb-os-release\") pod \"multus-additional-cni-plugins-flw48\" (UID: \"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\") " pod="openshift-multus/multus-additional-cni-plugins-flw48" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.762028 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-system-cni-dir\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.762092 
4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-multus-socket-dir-parent\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.762130 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-multus-conf-dir\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.762155 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/afbdc53a-f67a-44f7-a5bb-f446fb3706fb-system-cni-dir\") pod \"multus-additional-cni-plugins-flw48\" (UID: \"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\") " pod="openshift-multus/multus-additional-cni-plugins-flw48" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.762175 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-host-run-multus-certs\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.762221 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-etc-kubernetes\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.762241 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-host-run-netns\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.762257 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/afbdc53a-f67a-44f7-a5bb-f446fb3706fb-cni-binary-copy\") pod \"multus-additional-cni-plugins-flw48\" (UID: \"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\") " pod="openshift-multus/multus-additional-cni-plugins-flw48" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.762273 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/afbdc53a-f67a-44f7-a5bb-f446fb3706fb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-flw48\" (UID: \"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\") " pod="openshift-multus/multus-additional-cni-plugins-flw48" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.762289 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnls4\" (UniqueName: \"kubernetes.io/projected/afbdc53a-f67a-44f7-a5bb-f446fb3706fb-kube-api-access-wnls4\") pod \"multus-additional-cni-plugins-flw48\" (UID: 
\"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\") " pod="openshift-multus/multus-additional-cni-plugins-flw48" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.762316 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/afbdc53a-f67a-44f7-a5bb-f446fb3706fb-cnibin\") pod \"multus-additional-cni-plugins-flw48\" (UID: \"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\") " pod="openshift-multus/multus-additional-cni-plugins-flw48" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.762333 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-multus-daemon-config\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.763364 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a1308eb-aead-41f7-b2b1-c92d80970304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02
T10:38:53Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0202 10:38:47.369602 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:38:47.370901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-430650264/tls.crt::/tmp/serving-cert-430650264/tls.key\\\\\\\"\\\\nI0202 10:38:53.040764 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:38:53.043210 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:38:53.043233 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:38:53.043255 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:38:53.043260 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:38:53.047991 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0202 10:38:53.048006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 10:38:53.048019 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:38:53.048038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:38:53.048042 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:38:53.048046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 10:38:53.050356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.781881 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edf90eee-a04d-48ad-957f-8710016e2388\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3416be5e186cc0595234f36c8a59e1c2edadb59a8abe4121b274ea7dea70815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac57745af3759e45361b7c59e94701e4b74dcbfa259eb2e8690578d5b2fa601\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6c9912bb4f435743eaadda76c0ef0fbc6492b740836407b274a89801ec5082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.794123 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mcs8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8281e736-98f0-4282-b362-b55fd3d2810f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pwxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mcs8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.804718 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb836b18e6cb07844e8dfb32c28798089fc4fbc5b7f89ac3c1d595c8dd6de6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa3b8a5db87f56250b50a5c703beff1e6b11e10f1e11271ea4c7363b262d52d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc
32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.817457 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.833644 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e34f1db7-7f2a-4252-af0b-49fa172495f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.845378 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mcs8s" event={"ID":"8281e736-98f0-4282-b362-b55fd3d2810f","Type":"ContainerStarted","Data":"0cdfc1733f1758eb8435b80dd92fe579c318124c1a99c1611ef2d66300eca26b"} Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.845435 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mcs8s" event={"ID":"8281e736-98f0-4282-b362-b55fd3d2810f","Type":"ContainerStarted","Data":"6a274c96c45dce380c2466dee8d3aac71690522e7d2e7164eb53d33acc104ca1"} Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.847339 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.850222 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096"} Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.850953 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.851366 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5q92h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q2pl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5q92h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.852294 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5xj56" event={"ID":"e34f1db7-7f2a-4252-af0b-49fa172495f9","Type":"ContainerStarted","Data":"70eb3a6b0506b417f7b67f1e34e430729b30cc5e873cc004f8c4cbb16cd9d378"} Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.854112 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" event={"ID":"756c113d-5d5e-424e-bdf5-494b7774def6","Type":"ContainerStarted","Data":"4b772f8291c7ba36ce56d88dfbde16bf9344276419544da4c966fb2bbee6e04d"} Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.854140 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-f29d8" event={"ID":"756c113d-5d5e-424e-bdf5-494b7774def6","Type":"ContainerStarted","Data":"8b0838b6def036c1b26ad17dc7d42031f5478915ebec365b3ce9d59850c67bf2"} Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.863715 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-host-run-k8s-cni-cncf-io\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.863760 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7q2pl\" (UniqueName: \"kubernetes.io/projected/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-kube-api-access-7q2pl\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.863781 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/afbdc53a-f67a-44f7-a5bb-f446fb3706fb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-flw48\" (UID: \"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\") " pod="openshift-multus/multus-additional-cni-plugins-flw48" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.863801 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-host-var-lib-kubelet\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.863820 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-multus-cni-dir\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.863840 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-host-var-lib-cni-multus\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.863861 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-system-cni-dir\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.863886 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/afbdc53a-f67a-44f7-a5bb-f446fb3706fb-os-release\") pod \"multus-additional-cni-plugins-flw48\" (UID: \"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\") " pod="openshift-multus/multus-additional-cni-plugins-flw48" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.863904 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-multus-socket-dir-parent\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.863933 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-multus-conf-dir\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.863951 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-host-run-multus-certs\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.863970 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/afbdc53a-f67a-44f7-a5bb-f446fb3706fb-system-cni-dir\") pod \"multus-additional-cni-plugins-flw48\" (UID: \"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\") " pod="openshift-multus/multus-additional-cni-plugins-flw48" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.863987 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-etc-kubernetes\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.864011 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-host-run-netns\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.864028 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/afbdc53a-f67a-44f7-a5bb-f446fb3706fb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-flw48\" (UID: \"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\") " pod="openshift-multus/multus-additional-cni-plugins-flw48" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.864047 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnls4\" (UniqueName: \"kubernetes.io/projected/afbdc53a-f67a-44f7-a5bb-f446fb3706fb-kube-api-access-wnls4\") pod \"multus-additional-cni-plugins-flw48\" (UID: \"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\") " pod="openshift-multus/multus-additional-cni-plugins-flw48" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.864065 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/afbdc53a-f67a-44f7-a5bb-f446fb3706fb-cni-binary-copy\") pod \"multus-additional-cni-plugins-flw48\" (UID: \"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\") " pod="openshift-multus/multus-additional-cni-plugins-flw48" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.864121 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/afbdc53a-f67a-44f7-a5bb-f446fb3706fb-cnibin\") pod \"multus-additional-cni-plugins-flw48\" (UID: \"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\") " pod="openshift-multus/multus-additional-cni-plugins-flw48" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.864141 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-multus-daemon-config\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.864159 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-cni-binary-copy\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.864176 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-host-var-lib-cni-bin\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.864191 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-hostroot\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.864209 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-cnibin\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.864224 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-os-release\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.864436 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-os-release\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.864478 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-host-run-k8s-cni-cncf-io\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.864788 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-host-run-netns\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.864845 4901 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-etc-kubernetes\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.864933 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-system-cni-dir\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.865023 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/afbdc53a-f67a-44f7-a5bb-f446fb3706fb-os-release\") pod \"multus-additional-cni-plugins-flw48\" (UID: \"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\") " pod="openshift-multus/multus-additional-cni-plugins-flw48" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.865237 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-multus-socket-dir-parent\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.865248 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-multus-cni-dir\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.865287 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-multus-conf-dir\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.865285 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-host-var-lib-cni-multus\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.865352 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-host-run-multus-certs\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.865393 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/afbdc53a-f67a-44f7-a5bb-f446fb3706fb-system-cni-dir\") pod \"multus-additional-cni-plugins-flw48\" (UID: \"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\") " pod="openshift-multus/multus-additional-cni-plugins-flw48" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.865400 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/afbdc53a-f67a-44f7-a5bb-f446fb3706fb-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-flw48\" (UID: \"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\") " pod="openshift-multus/multus-additional-cni-plugins-flw48" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.865410 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-host-var-lib-kubelet\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.865457 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-hostroot\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.865451 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.865501 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-host-var-lib-cni-bin\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.865618 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/afbdc53a-f67a-44f7-a5bb-f446fb3706fb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-flw48\" (UID: \"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\") " pod="openshift-multus/multus-additional-cni-plugins-flw48" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.865659 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-cnibin\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.865720 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/afbdc53a-f67a-44f7-a5bb-f446fb3706fb-cnibin\") pod \"multus-additional-cni-plugins-flw48\" (UID: \"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\") " pod="openshift-multus/multus-additional-cni-plugins-flw48" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.865973 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-multus-daemon-config\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.866274 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/afbdc53a-f67a-44f7-a5bb-f446fb3706fb-cni-binary-copy\") pod \"multus-additional-cni-plugins-flw48\" (UID: \"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\") " pod="openshift-multus/multus-additional-cni-plugins-flw48" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.866553 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-cni-binary-copy\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " 
pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.887841 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnls4\" (UniqueName: \"kubernetes.io/projected/afbdc53a-f67a-44f7-a5bb-f446fb3706fb-kube-api-access-wnls4\") pod \"multus-additional-cni-plugins-flw48\" (UID: \"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\") " pod="openshift-multus/multus-additional-cni-plugins-flw48" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.888222 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.899554 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q2pl\" (UniqueName: \"kubernetes.io/projected/19eb421a-49aa-4cde-ae5e-3aba70ee67f4-kube-api-access-7q2pl\") pod \"multus-5q92h\" (UID: \"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\") " pod="openshift-multus/multus-5q92h" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.905207 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.918731 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b226219c2a55a743db37b0dd98c110a54a0d2a1b2602aa122c7d9418198fb9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.930742 4901 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"756c113d-5d5e-424e-bdf5-494b7774def6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f29d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.945111 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni
/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-flw48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.957608 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb836b18e6cb07844e8dfb32c28798089fc4fbc5b7f89ac3c1d595c8dd6de6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa3b8a5db87f56250b50a5c703beff1e6b11e10f1e11271ea4c7363b262d52d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\
\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.974140 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:55 crc kubenswrapper[4901]: I0202 10:38:55.986268 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.002824 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b226219c2a55a743db37b0dd98c110a54a0d2a1b2602aa122c7d9418198fb9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.019600 4901 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"756c113d-5d5e-424e-bdf5-494b7774def6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f29d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.031773 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xj56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e34f1db7-7f2a-4252-af0b-49fa172495f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.048810 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5q92h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q2pl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5q92h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.061154 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-5q92h" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.063603 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.068811 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-flw48" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.084705 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.100784 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mcs8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8281e736-98f0-4282-b362-b55fd3d2810f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cdfc1733f1758eb8435b80dd92fe579c318124c1a99c1611ef2d66300eca26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pwxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mcs8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.126441 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a1308eb-aead-41f7-b2b1-c92d80970304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0202 10:38:47.369602 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:38:47.370901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-430650264/tls.crt::/tmp/serving-cert-430650264/tls.key\\\\\\\"\\\\nI0202 10:38:53.040764 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:38:53.043210 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:38:53.043233 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:38:53.043255 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:38:53.043260 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:38:53.047991 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0202 10:38:53.048006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 10:38:53.048019 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:38:53.048038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:38:53.048042 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:38:53.048046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 10:38:53.050356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.128623 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vm8h5"] Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.129450 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.142919 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.143473 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.143881 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.144124 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.145317 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.146031 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.146277 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.164637 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edf90eee-a04d-48ad-957f-8710016e2388\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3416be5e186cc0595234f36c8a59e1c2edadb59a8abe4121b274ea7dea70815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac57745af3759e45361b7c59e94701e4b74dcbfa259eb2e8690578d5b2fa601\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6c9912bb4f435743eaadda76c0ef0fbc6492b740836407b274a89801ec5082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.168111 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-host-kubelet\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.168168 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-etc-openvswitch\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.168199 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-run-openvswitch\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.168222 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-host-cni-bin\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.168269 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-host-run-netns\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.168297 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-node-log\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.168324 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwwwn\" (UniqueName: \"kubernetes.io/projected/a3390481-846a-4742-9eae-0796b667897f-kube-api-access-fwwwn\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.168356 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.168386 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-var-lib-openvswitch\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.168413 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a3390481-846a-4742-9eae-0796b667897f-ovnkube-script-lib\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 
10:38:56.168455 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-run-systemd\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.168490 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a3390481-846a-4742-9eae-0796b667897f-ovn-node-metrics-cert\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.168517 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-host-run-ovn-kubernetes\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.168544 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-host-cni-netd\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.168614 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-log-socket\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.168640 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-systemd-units\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.168668 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-host-slash\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.168697 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a3390481-846a-4742-9eae-0796b667897f-env-overrides\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.168730 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-run-ovn\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.168775 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a3390481-846a-4742-9eae-0796b667897f-ovnkube-config\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.184012 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb836b18e6cb07844e8dfb32c28798089fc4fbc5b7f89ac3c1d595c8dd6de6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa3b8a5db87f56250b50a5c703beff1e6b11e10f1e11271ea4c7363b262d52d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.206967 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.223758 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5q92h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q2pl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5q92h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.238407 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.254327 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.269715 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-node-log\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.269774 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwwwn\" (UniqueName: \"kubernetes.io/projected/a3390481-846a-4742-9eae-0796b667897f-kube-api-access-fwwwn\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.269823 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.269852 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-var-lib-openvswitch\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.269876 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a3390481-846a-4742-9eae-0796b667897f-ovnkube-script-lib\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.269918 4901 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-run-systemd\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.269909 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-node-log\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.269939 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.269948 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a3390481-846a-4742-9eae-0796b667897f-ovn-node-metrics-cert\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.270078 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-log-socket\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.270111 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-host-run-ovn-kubernetes\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.270135 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-host-cni-netd\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.270156 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-systemd-units\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.270176 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-host-slash\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.270210 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" 
(UniqueName: \"kubernetes.io/configmap/a3390481-846a-4742-9eae-0796b667897f-env-overrides\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.270236 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-run-ovn\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.270277 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a3390481-846a-4742-9eae-0796b667897f-ovnkube-config\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.270306 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-host-kubelet\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.270311 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-run-systemd\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.270355 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-run-openvswitch\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.270329 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-run-openvswitch\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.270395 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-log-socket\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.270422 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-host-cni-bin\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.270428 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-host-run-ovn-kubernetes\") pod \"ovnkube-node-vm8h5\" (UID: 
\"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.270464 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-etc-openvswitch\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.270503 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-host-cni-netd\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.270508 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-host-run-netns\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.270537 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-host-run-netns\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.270629 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-var-lib-openvswitch\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.270663 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-host-cni-bin\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.270662 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-systemd-units\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.270706 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-etc-openvswitch\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.270740 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-host-slash\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.270762 
4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-host-kubelet\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.270839 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a3390481-846a-4742-9eae-0796b667897f-ovnkube-script-lib\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.270862 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-run-ovn\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.271153 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a3390481-846a-4742-9eae-0796b667897f-env-overrides\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.271392 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a3390481-846a-4742-9eae-0796b667897f-ovnkube-config\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.273531 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a3390481-846a-4742-9eae-0796b667897f-ovn-node-metrics-cert\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.300230 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.311872 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwwwn\" (UniqueName: \"kubernetes.io/projected/a3390481-846a-4742-9eae-0796b667897f-kube-api-access-fwwwn\") pod \"ovnkube-node-vm8h5\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.346622 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b226219c2a55a743db37b0dd98c110a54a0d2a1b2602aa122c7d9418198fb9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.389810 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"756c113d-5d5e-424e-bdf5-494b7774def6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f29d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.430845 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e34f1db7-7f2a-4252-af0b-49fa172495f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.466866 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a1308eb-aead-41f7-b2b1-c92d80970304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0202 10:38:47.369602 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:38:47.370901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-430650264/tls.crt::/tmp/serving-cert-430650264/tls.key\\\\\\\"\\\\nI0202 10:38:53.040764 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:38:53.043210 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:38:53.043233 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:38:53.043255 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:38:53.043260 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:38:53.047991 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0202 10:38:53.048006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 10:38:53.048019 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:38:53.048038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:38:53.048042 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:38:53.048046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 10:38:53.050356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.468933 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:38:56 crc kubenswrapper[4901]: W0202 10:38:56.481343 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3390481_846a_4742_9eae_0796b667897f.slice/crio-0054f47c122e18b40aa2c7c792663c19c5a9d2a27c75c0e3a58723113244d24a WatchSource:0}: Error finding container 0054f47c122e18b40aa2c7c792663c19c5a9d2a27c75c0e3a58723113244d24a: Status 404 returned error can't find the container with id 0054f47c122e18b40aa2c7c792663c19c5a9d2a27c75c0e3a58723113244d24a Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.511351 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edf90eee-a04d-48ad-957f-8710016e2388\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3416be5e186cc0595234f36c8a59e1c2edadb59a8abe4121b274ea7dea70815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac57745af3759e45361b7c59e94701e4b74dcbfa259eb2e8690578d5b2fa601\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18
fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6c9912bb4f435743eaadda76c0ef0fbc6492b740836407b274a89801ec5082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.545410 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mcs8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8281e736-98f0-4282-b362-b55fd3d2810f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cdfc1733f1758eb8435b80dd92fe579c318124c1a99c1611ef2d66300eca26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pwxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mcs8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.595852 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-flw48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-02T10:38:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.601318 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 05:53:03.914659981 +0000 UTC Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.633922 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3390481-846a-4742-9eae-0796b667897f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vm8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.676373 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:38:56 crc kubenswrapper[4901]: E0202 10:38:56.676516 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.858523 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5q92h" event={"ID":"19eb421a-49aa-4cde-ae5e-3aba70ee67f4","Type":"ContainerStarted","Data":"c29de6223de90bbfe98956d36ce20b9b57b319a5a825118f8eeabf13148e4c9f"} Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.858586 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5q92h" event={"ID":"19eb421a-49aa-4cde-ae5e-3aba70ee67f4","Type":"ContainerStarted","Data":"89fc15a54f6da8ecb12576c6d4ea35a870c3711848ccda836bbf9b7432d4de00"} Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.859623 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"43352b63ae7151aacd58a3134e927aa7494db450772763086a8b783025a3e3ab"} Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.861214 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5xj56" event={"ID":"e34f1db7-7f2a-4252-af0b-49fa172495f9","Type":"ContainerStarted","Data":"4eefab588023c2d4a4d15786ba41c1b66a121eeef18edda4bdcc02d162975b6b"} Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.862929 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" event={"ID":"756c113d-5d5e-424e-bdf5-494b7774def6","Type":"ContainerStarted","Data":"d8607482cdf03bfd2b59dcac8300e55292f3bb5b3661f11eddc30a5face19194"} Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.864583 4901 generic.go:334] "Generic (PLEG): container finished" podID="a3390481-846a-4742-9eae-0796b667897f" containerID="1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5" exitCode=0 Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.864630 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" event={"ID":"a3390481-846a-4742-9eae-0796b667897f","Type":"ContainerDied","Data":"1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5"} Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.864648 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" event={"ID":"a3390481-846a-4742-9eae-0796b667897f","Type":"ContainerStarted","Data":"0054f47c122e18b40aa2c7c792663c19c5a9d2a27c75c0e3a58723113244d24a"} Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.866935 4901 generic.go:334] "Generic (PLEG): container finished" podID="afbdc53a-f67a-44f7-a5bb-f446fb3706fb" containerID="075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3" exitCode=0 Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.867450 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" 
event={"ID":"afbdc53a-f67a-44f7-a5bb-f446fb3706fb","Type":"ContainerDied","Data":"075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3"} Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.867477 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" event={"ID":"afbdc53a-f67a-44f7-a5bb-f446fb3706fb","Type":"ContainerStarted","Data":"bbefc4b39ef85a3c301f1c98d4b672e77833363f5f6fa89201d670caf205b30d"} Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.880884 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.904441 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb836b18e6cb07844e8dfb32c28798089fc4fbc5b7f89ac3c1d595c8dd6de6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa3b8a5db87f56250b50a5c703beff1e6b11e10f1e11271ea4c7363b262d52d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.925775 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.951584 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.970079 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.983888 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b226219c2a55a743db37b0dd98c110a54a0d2a1b2602aa122c7d9418198fb9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:56 crc kubenswrapper[4901]: I0202 10:38:56.995583 4901 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"756c113d-5d5e-424e-bdf5-494b7774def6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f29d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:57 crc kubenswrapper[4901]: I0202 10:38:57.016885 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xj56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e34f1db7-7f2a-4252-af0b-49fa172495f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:57Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:57 crc kubenswrapper[4901]: I0202 10:38:57.032672 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5q92h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29de6223de90bbfe98956d36ce20b9b57b319a5a825118f8eeabf13148e4c9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q2pl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5q92h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:57Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:57 crc kubenswrapper[4901]: I0202 10:38:57.047553 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edf90eee-a04d-48ad-957f-8710016e2388\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3416be5e186cc0595234f36c8a59e1c2edadb59a8abe4121b274ea7dea70815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac57745af3759e45361b7c59e94701e4b74dcbfa259eb2e8690578d5b2fa601\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6c9912bb4f435743eaadda76c0ef0fbc6492b740836407b274a89801ec5082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:57Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:57 crc kubenswrapper[4901]: I0202 10:38:57.064713 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mcs8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8281e736-98f0-4282-b362-b55fd3d2810f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cdfc1733f1758eb8435b80dd92fe579c318124c1a99c1611ef2d66300eca26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pwxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mcs8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:57Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:57 crc kubenswrapper[4901]: I0202 10:38:57.107187 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a1308eb-aead-41f7-b2b1-c92d80970304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0202 10:38:47.369602 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:38:47.370901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-430650264/tls.crt::/tmp/serving-cert-430650264/tls.key\\\\\\\"\\\\nI0202 10:38:53.040764 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:38:53.043210 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:38:53.043233 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:38:53.043255 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:38:53.043260 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:38:53.047991 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0202 10:38:53.048006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 10:38:53.048019 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:38:53.048038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:38:53.048042 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:38:53.048046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 10:38:53.050356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:57Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:57 crc kubenswrapper[4901]: I0202 10:38:57.148223 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa9308
9f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-flw48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:57Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:57 crc kubenswrapper[4901]: I0202 10:38:57.195815 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3390481-846a-4742-9eae-0796b667897f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-ove
rrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw
wwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vm8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:57Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:57 crc kubenswrapper[4901]: I0202 10:38:57.228871 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mcs8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8281e736-98f0-4282-b362-b55fd3d2810f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cdfc1733f1758eb8435b80dd92fe579c318124c1a99c1611ef2d66300eca26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pwxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"
phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mcs8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:57Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:57 crc kubenswrapper[4901]: I0202 10:38:57.269892 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a1308eb-aead-41f7-b2b1-c92d80970304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"contain
erID\\\":\\\"cri-o://7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0202 10:38:47.369602 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:38:47.370901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-430650264/tls.crt::/tmp/serving-cert-430650264/tls.key\\\\\\\"\\\\nI0202 10:38:53.040764 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:38:53.043210 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:38:53.043233 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:38:53.043255 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:38:53.043260 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:38:53.047991 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0202 10:38:53.048006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 10:38:53.048019 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:38:53.048038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:38:53.048042 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:38:53.048046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 10:38:53.050356 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:57Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:57 crc kubenswrapper[4901]: I0202 10:38:57.281832 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:38:57 crc kubenswrapper[4901]: I0202 10:38:57.281938 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:38:57 crc kubenswrapper[4901]: I0202 10:38:57.281976 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:38:57 crc kubenswrapper[4901]: E0202 10:38:57.282027 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:39:01.281996708 +0000 UTC m=+28.300336844 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:38:57 crc kubenswrapper[4901]: E0202 10:38:57.282062 4901 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:38:57 crc kubenswrapper[4901]: E0202 10:38:57.282115 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:39:01.28210163 +0000 UTC m=+28.300441726 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:38:57 crc kubenswrapper[4901]: E0202 10:38:57.282251 4901 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:38:57 crc kubenswrapper[4901]: E0202 10:38:57.282421 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:39:01.282382337 +0000 UTC m=+28.300722433 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:38:57 crc kubenswrapper[4901]: I0202 10:38:57.304552 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edf90eee-a04d-48ad-957f-8710016e2388\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3416be5e186cc0595234f36c8a59e1c2edadb59a8abe4121b274ea7dea70815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac57745af3759e45361b7c59e94701e4b74dcbfa259eb2e8690578d5b2fa601\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\"
:\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6c9912bb4f435743eaadda76c0ef0fbc6492b740836407b274a89801ec5082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:57Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:57 crc kubenswrapper[4901]: I0202 10:38:57.353499 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3390481-846a-4742-9eae-0796b667897f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vm8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:57Z 
is after 2025-08-24T17:21:41Z" Feb 02 10:38:57 crc kubenswrapper[4901]: I0202 10:38:57.382721 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:38:57 crc kubenswrapper[4901]: I0202 10:38:57.382806 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:38:57 crc kubenswrapper[4901]: E0202 10:38:57.382994 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 10:38:57 crc kubenswrapper[4901]: E0202 10:38:57.383025 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:38:57 crc kubenswrapper[4901]: E0202 10:38:57.383028 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 10:38:57 crc kubenswrapper[4901]: E0202 10:38:57.383082 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:38:57 crc kubenswrapper[4901]: E0202 10:38:57.383041 4901 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:38:57 crc kubenswrapper[4901]: E0202 10:38:57.383104 4901 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:38:57 crc kubenswrapper[4901]: E0202 10:38:57.383187 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-02 10:39:01.383161616 +0000 UTC m=+28.401501902 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:38:57 crc kubenswrapper[4901]: E0202 10:38:57.383212 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-02 10:39:01.383201117 +0000 UTC m=+28.401541423 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:38:57 crc kubenswrapper[4901]: I0202 10:38:57.388503 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-flw48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:57Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:57 crc 
kubenswrapper[4901]: I0202 10:38:57.425029 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb836b18e6cb07844e8dfb32c28798089fc4fbc5b7f89ac3c1d595c8dd6de6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa3b8a5db87f56250b50a5c703beff1e6b11e10f1e11271ea4c7363b262d52d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:57Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:57 crc kubenswrapper[4901]: I0202 10:38:57.467455 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:57Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:57 crc kubenswrapper[4901]: I0202 10:38:57.507801 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:57Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:57 crc kubenswrapper[4901]: I0202 10:38:57.544674 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43352b63ae7151aacd58a3134e927aa7494db450772763086a8b783025a3e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:57Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:57 crc kubenswrapper[4901]: I0202 10:38:57.586964 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b226219c2a55a743db37b0dd98c110a54a0d2a1b2602aa122c7d9418198fb9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:57Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:57 crc kubenswrapper[4901]: I0202 10:38:57.602179 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 04:38:49.273983906 +0000 UTC Feb 02 10:38:57 crc kubenswrapper[4901]: I0202 10:38:57.624222 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"756c113d-5d5e-424e-bdf5-494b7774def6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8607482cdf03bfd2b59dcac8300e55292f3bb5b3661f11eddc30a5face19194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b772f8291c7ba36ce56d88dfbde16bf9344276419544da4c966fb2bbee6e04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f29d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:57Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:57 crc kubenswrapper[4901]: I0202 10:38:57.663979 4901 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-5xj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e34f1db7-7f2a-4252-af0b-49fa172495f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eefab588023c2d4a4d15786ba41c1b66a121eeef18edda4bdcc02d162975b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:57Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:57 crc kubenswrapper[4901]: I0202 10:38:57.707888 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5q92h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29de6223de90bbfe98956d36ce20b9b57b319a5a825118f8eeabf13148e4c9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q2pl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5q92h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:57Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:57 crc kubenswrapper[4901]: I0202 10:38:57.710716 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:38:57 crc kubenswrapper[4901]: I0202 10:38:57.710758 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:38:57 crc kubenswrapper[4901]: E0202 10:38:57.710918 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:38:57 crc kubenswrapper[4901]: E0202 10:38:57.711076 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:38:57 crc kubenswrapper[4901]: I0202 10:38:57.710677 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:38:57 crc kubenswrapper[4901]: E0202 10:38:57.711490 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:38:57 crc kubenswrapper[4901]: I0202 10:38:57.746995 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:57Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:57 crc kubenswrapper[4901]: I0202 10:38:57.872821 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" event={"ID":"a3390481-846a-4742-9eae-0796b667897f","Type":"ContainerStarted","Data":"61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa"} Feb 02 10:38:57 crc kubenswrapper[4901]: I0202 10:38:57.873284 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" event={"ID":"a3390481-846a-4742-9eae-0796b667897f","Type":"ContainerStarted","Data":"18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3"} Feb 02 10:38:57 crc kubenswrapper[4901]: I0202 10:38:57.873294 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" event={"ID":"a3390481-846a-4742-9eae-0796b667897f","Type":"ContainerStarted","Data":"3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3"} Feb 02 10:38:57 crc kubenswrapper[4901]: I0202 10:38:57.874939 4901 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" event={"ID":"afbdc53a-f67a-44f7-a5bb-f446fb3706fb","Type":"ContainerStarted","Data":"b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f"} Feb 02 10:38:57 crc kubenswrapper[4901]: I0202 10:38:57.897992 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3390481-846a-4742-9eae-0796b667897f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef
0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mount
Path\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\
\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vm8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:57Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:57 crc kubenswrapper[4901]: I0202 10:38:57.911916 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64
b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-flw48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:57Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:57 crc kubenswrapper[4901]: I0202 10:38:57.928899 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb836b18e6cb07844e8dfb32c28798089fc4fbc5b7f89ac3c1d595c8dd6de6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa3b8a5db87f56250b50a5c703beff1e6b11e10f1e11271ea4c7363b262d52d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:57Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:57 crc kubenswrapper[4901]: I0202 
10:38:57.942201 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:57Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:57 crc kubenswrapper[4901]: I0202 10:38:57.954458 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:57Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:58 crc kubenswrapper[4901]: I0202 10:38:58.602557 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 02:26:18.216774109 +0000 UTC Feb 02 10:38:58 crc kubenswrapper[4901]: I0202 10:38:58.795773 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43352b63ae7151aacd58a3134e927aa7494db450772763086a8b783025a3e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:58Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:58 crc kubenswrapper[4901]: I0202 10:38:58.817214 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b226219c2a55a743db37b0dd98c110a54a0d2a1b2602aa122c7d9418198fb9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:58Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:58 crc kubenswrapper[4901]: I0202 10:38:58.881793 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" event={"ID":"a3390481-846a-4742-9eae-0796b667897f","Type":"ContainerStarted","Data":"2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d"} Feb 02 10:38:58 crc kubenswrapper[4901]: I0202 10:38:58.885149 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"756c113d-5d5e-424e-bdf5-494b7774def6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8607482cdf03bfd2b59dcac8300e55292f3bb5b3661f11eddc30a5face19194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b772f8291c7ba36ce56d88dfbde16bf9344276419544da4c966fb2bbee6e04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f29d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:58Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:58 crc kubenswrapper[4901]: I0202 10:38:58.910258 4901 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-5xj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e34f1db7-7f2a-4252-af0b-49fa172495f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eefab588023c2d4a4d15786ba41c1b66a121eeef18edda4bdcc02d162975b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:58Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:58 crc kubenswrapper[4901]: I0202 10:38:58.945219 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5q92h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29de6223de90bbfe98956d36ce20b9b57b319a5a825118f8eeabf13148e4c9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q2pl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5q92h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:58Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:58 crc kubenswrapper[4901]: I0202 10:38:58.964641 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:58Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.006478 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mcs8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8281e736-98f0-4282-b362-b55fd3d2810f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cdfc1733f1758eb8435b80dd92fe579c318124c1a99c1611ef2d66300eca26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pwxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mcs8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:58Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.030676 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a1308eb-aead-41f7-b2b1-c92d80970304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0202 10:38:47.369602 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:38:47.370901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-430650264/tls.crt::/tmp/serving-cert-430650264/tls.key\\\\\\\"\\\\nI0202 10:38:53.040764 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:38:53.043210 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:38:53.043233 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:38:53.043255 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:38:53.043260 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:38:53.047991 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0202 10:38:53.048006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 10:38:53.048019 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:38:53.048038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:38:53.048042 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:38:53.048046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 10:38:53.050356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:59Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.068935 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edf90eee-a04d-48ad-957f-8710016e2388\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3416be5e186cc0595234f36c8a59e1c2edadb59a8abe4121b274ea7dea70815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac57745af3759e45361b7c59e94701e4b74dcbfa259eb2e8690578d5b2fa601\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6c9912bb4f435743eaadda76c0ef0fbc6492b740836407b274a89801ec5082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:59Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.462053 4901 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.464387 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.464423 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.464434 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.464596 4901 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.475543 4901 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.476138 4901 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.478156 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.478221 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.478245 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.478280 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.478309 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:38:59Z","lastTransitionTime":"2026-02-02T10:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:38:59 crc kubenswrapper[4901]: E0202 10:38:59.504856 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f285d71e-6d99-440a-8549-56e6a3710e3e\\\",\\\"systemUUID\\\":\\\"9f2b45ef-ead6-4cce-86c8-26b6d26ee095\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:59Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.509486 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.509547 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.509558 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.509597 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.509609 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:38:59Z","lastTransitionTime":"2026-02-02T10:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:38:59 crc kubenswrapper[4901]: E0202 10:38:59.524713 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f285d71e-6d99-440a-8549-56e6a3710e3e\\\",\\\"systemUUID\\\":\\\"9f2b45ef-ead6-4cce-86c8-26b6d26ee095\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:59Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.529362 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.529425 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.529441 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.529464 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.529481 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:38:59Z","lastTransitionTime":"2026-02-02T10:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:38:59 crc kubenswrapper[4901]: E0202 10:38:59.542476 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f285d71e-6d99-440a-8549-56e6a3710e3e\\\",\\\"systemUUID\\\":\\\"9f2b45ef-ead6-4cce-86c8-26b6d26ee095\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:59Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.546804 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.546879 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.546893 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.546920 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.546932 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:38:59Z","lastTransitionTime":"2026-02-02T10:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:38:59 crc kubenswrapper[4901]: E0202 10:38:59.565369 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
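The x509 failure repeated above can be confirmed independently of the kubelet by probing the webhook endpoint named in the Post URL (https://127.0.0.1:9743). A minimal Go sketch, assuming local access to the node; this is diagnostic scaffolding, not part of the log:

package main

import (
	"crypto/tls"
	"fmt"
	"time"
)

func main() {
	// Endpoint taken from the Post URL in the webhook error above.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{
		InsecureSkipVerify: true, // read the certificate without trusting it
	})
	if err != nil {
		fmt.Println("dial:", err)
		return
	}
	defer conn.Close()

	leaf := conn.ConnectionState().PeerCertificates[0]
	fmt.Printf("NotBefore=%s NotAfter=%s expired=%t\n",
		leaf.NotBefore.Format(time.RFC3339),
		leaf.NotAfter.Format(time.RFC3339),
		time.Now().After(leaf.NotAfter))
}

Run on the node itself; a NotAfter of 2025-08-24T17:21:41Z with expired=true would match the "certificate has expired or is not yet valid" error above.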
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f285d71e-6d99-440a-8549-56e6a3710e3e\\\",\\\"systemUUID\\\":\\\"9f2b45ef-ead6-4cce-86c8-26b6d26ee095\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:59Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.570087 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.570146 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
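The condition={...} payload that setters.go prints above is plain JSON with the shape of a Kubernetes NodeCondition. A small Go sketch that decodes the exact condition string from the entry above (the struct is a hand-written subset, chosen for illustration):

package main

import (
	"encoding/json"
	"fmt"
)

// Field names follow the condition={...} payload in the log;
// this mirrors a subset of the Kubernetes NodeCondition type.
type nodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	// Verbatim condition payload from the setters.go:603 entry above.
	raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:38:59Z","lastTransitionTime":"2026-02-02T10:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`

	var c nodeCondition
	if err := json.Unmarshal([]byte(raw), &c); err != nil {
		fmt.Println("unmarshal:", err)
		return
	}
	fmt.Printf("%s=%s reason=%s\n", c.Type, c.Status, c.Reason) // Ready=False reason=KubeletNotReady
}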
event="NodeHasNoDiskPressure" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.570158 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.570178 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.570188 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:38:59Z","lastTransitionTime":"2026-02-02T10:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:38:59 crc kubenswrapper[4901]: E0202 10:38:59.582374 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f285d71e-6d99-440a-8549-56e6a3710e3e\\\",\\\"systemUUID\\\":\\\"9f2b45ef-ead6-4cce-86c8-26b6d26ee095\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:59Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:59 crc kubenswrapper[4901]: E0202 10:38:59.582492 4901 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.584583 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
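The progression above, several "Error updating node status, will retry" entries followed by "update node status exceeds retry count", reflects a bounded retry loop in the kubelet; in the upstream sources I am aware of the bound is the nodeStatusUpdateRetry constant (5). A hedged Go sketch of that pattern, with the webhook rejection stubbed in:

package main

import (
	"errors"
	"fmt"
)

// Assumption: upstream kubelet caps node-status patch attempts at 5.
const nodeStatusUpdateRetry = 5

// Stand-in for the PATCH that the expired webhook certificate rejects above.
func tryUpdateNodeStatus(attempt int) error {
	return errors.New("failed calling webhook: certificate has expired")
}

func updateNodeStatus() error {
	for i := 0; i < nodeStatusUpdateRetry; i++ {
		if err := tryUpdateNodeStatus(i); err != nil {
			fmt.Println("Error updating node status, will retry:", err)
			continue
		}
		return nil
	}
	return errors.New("update node status exceeds retry count")
}

func main() { fmt.Println(updateNodeStatus()) }

Because every attempt hits the same expired certificate, the loop can never succeed, which is why the same giant patch payload recurs in the log until the retry budget is spent.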
event="NodeHasSufficientMemory" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.584616 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.584627 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.584643 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.584654 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:38:59Z","lastTransitionTime":"2026-02-02T10:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.603018 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 04:18:27.522500703 +0000 UTC Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.676749 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.676749 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.676764 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:38:59 crc kubenswrapper[4901]: E0202 10:38:59.676975 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:38:59 crc kubenswrapper[4901]: E0202 10:38:59.677112 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:38:59 crc kubenswrapper[4901]: E0202 10:38:59.677267 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.686844 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.686906 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.686921 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.686942 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.686957 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:38:59Z","lastTransitionTime":"2026-02-02T10:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.790817 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.790865 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.790875 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.790895 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.790908 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:38:59Z","lastTransitionTime":"2026-02-02T10:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.869236 4901 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.892200 4901 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.897297 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.897355 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.897369 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.897391 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.897406 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:38:59Z","lastTransitionTime":"2026-02-02T10:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.898465 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" event={"ID":"a3390481-846a-4742-9eae-0796b667897f","Type":"ContainerStarted","Data":"b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81"} Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.898515 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" event={"ID":"a3390481-846a-4742-9eae-0796b667897f","Type":"ContainerStarted","Data":"1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79"} Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.900752 4901 generic.go:334] "Generic (PLEG): container finished" podID="afbdc53a-f67a-44f7-a5bb-f446fb3706fb" containerID="b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f" exitCode=0 Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.900801 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" event={"ID":"afbdc53a-f67a-44f7-a5bb-f446fb3706fb","Type":"ContainerDied","Data":"b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f"} Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.918522 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb836b18e6cb07844e8dfb32c28798089fc4fbc5b7f89ac3c1d595c8dd6de6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa3b8a5db87f56250b50a5c703beff1e6b11e10f1e11271ea4c7363b262d52d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:59Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.946667 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:59Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.963048 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xj56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e34f1db7-7f2a-4252-af0b-49fa172495f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eefab588023c2d4a4d15786ba41c1b66a121eeef18edda4bdcc02d162975b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:59Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.977400 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5q92h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29de6223de90bbfe98956d36ce20b9b57b319a5a825118f8eeabf13148e4c9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q2pl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5q92h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:59Z is after 2025-08-24T17:21:41Z" Feb 02 10:38:59 crc kubenswrapper[4901]: I0202 10:38:59.993449 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:38:59Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.000780 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.000834 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.000849 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.000868 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.000883 4901 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:00Z","lastTransitionTime":"2026-02-02T10:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.009175 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:00Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.026756 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43352b63ae7151aacd58a3134e927aa7494db450772763086a8b783025a3e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:00Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.045816 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b226219c2a55a743db37b0dd98c110a54a0d2a1b2602aa122c7d9418198fb9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:00Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.061203 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"756c113d-5d5e-424e-bdf5-494b7774def6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8607482cdf03bfd2b59dcac8300e55292f3bb5b3661f11eddc30a5face19194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b772f8291c7ba36ce56d88dfbde16bf9344276419544da4c966fb2bbee6e04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f29d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:00Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.080948 4901 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a1308eb-aead-41f7-b2b1-c92d80970304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0202 10:38:47.369602 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:38:47.370901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-430650264/tls.crt::/tmp/serving-cert-430650264/tls.key\\\\\\\"\\\\nI0202 10:38:53.040764 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:38:53.043210 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:38:53.043233 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:38:53.043255 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:38:53.043260 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:38:53.047991 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0202 10:38:53.048006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 10:38:53.048019 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:38:53.048038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:38:53.048042 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:38:53.048046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 10:38:53.050356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:00Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.098308 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edf90eee-a04d-48ad-957f-8710016e2388\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3416be5e186cc0595234f36c8a59e1c2edadb59a8abe4121b274ea7dea70815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac57745af3759e45361b7c59e94701e4b74dcbfa259eb2e8690578d5b2fa601\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6c9912bb4f435743eaadda76c0ef0fbc6492b740836407b274a89801ec5082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:00Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.104970 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.105004 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.105016 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.105031 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.105041 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:00Z","lastTransitionTime":"2026-02-02T10:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
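
The payloads that kubenswrapper quotes after "failed to patch status" are ordinary JSON strategic-merge patches, wrapped in several layers of backslash escaping by the journal quoting. A hedged Python sketch that peels those layers until the patch parses; the regex and the unescape loop are best-effort assumptions about the quoting seen above, not a kubelet-defined format:

    #!/usr/bin/env python3
    # Best-effort recovery of an escaped status patch from a journal line
    # on stdin, e.g.:
    #   journalctl -u kubelet | grep 'failed to patch status' | head -1 | python3 decode_patch.py
    import codecs, json, re, sys

    def unescape_until_json(s, max_rounds=4):
        # Peel one layer of backslash escaping per round until the string
        # parses as JSON; the number of layers depends on how the line was
        # quoted, so we simply iterate.
        for _ in range(max_rounds):
            try:
                return json.loads(s)
            except json.JSONDecodeError:
                s = codecs.decode(s, "unicode_escape")
        return None

    line = sys.stdin.read()
    m = re.search(r'failed to patch status \\"(.*?)\\" for pod', line, re.S)
    if m:
        patch = unescape_until_json(m.group(1))
        if patch is not None:
            json.dump(patch, sys.stdout, indent=2)
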
Has your network provider started?"} Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.114834 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mcs8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8281e736-98f0-4282-b362-b55fd3d2810f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cdfc1733f1758eb8435b80dd92fe579c318124c1a99c1611ef2d66300eca26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pwxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mcs8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:00Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.134321 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"moun
tPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-flw48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:00Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.159795 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3390481-846a-4742-9eae-0796b667897f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vm8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:00Z 
is after 2025-08-24T17:21:41Z" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.209706 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.209777 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.209791 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.209812 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.209828 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:00Z","lastTransitionTime":"2026-02-02T10:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.313132 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.313203 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.313222 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.313252 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.313287 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:00Z","lastTransitionTime":"2026-02-02T10:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.417063 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.417125 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.417141 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.417165 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.417181 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:00Z","lastTransitionTime":"2026-02-02T10:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.521211 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.521285 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.521304 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.521334 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.521355 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:00Z","lastTransitionTime":"2026-02-02T10:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.603850 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 13:06:46.312841594 +0000 UTC Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.630829 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.630948 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.630969 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.631005 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.631024 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:00Z","lastTransitionTime":"2026-02-02T10:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
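Note the certificate_manager line above: the kubelet-serving certificate expires 2026-02-24 05:53:03 but its rotation deadline, 2025-12-25 13:06:46, is already in the past relative to the node clock, so the kubelet treats the serving cert as due for immediate rotation. The sketch below shows how such a deadline plausibly arises, assuming client-go's certificate manager behavior of picking a uniformly jittered point at 70-90% of the certificate's lifetime; the one-year issue date is a hypothetical placeholder, not taken from the log.

    # rotation_deadline.py -- sketch of the jittered-deadline rule (assumed
    # from k8s.io/client-go's certificate manager; not quoted from this log).
    import random
    from datetime import datetime, timedelta

    def rotation_deadline(not_before: datetime, not_after: datetime) -> datetime:
        """deadline = notBefore + jitter * lifetime, jitter uniform in [0.7, 0.9)."""
        lifetime = (not_after - not_before).total_seconds()
        jitter = 0.7 + random.random() * 0.2
        return not_before + timedelta(seconds=lifetime * jitter)

    not_after = datetime(2026, 2, 24, 5, 53, 3)
    not_before = not_after - timedelta(days=365)  # hypothetical issue date
    print("rotation deadline:", rotation_deadline(not_before, not_after))
    # The logged deadline of 2025-12-25 falls inside this 70-90% window.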
Has your network provider started?"} Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.734633 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.734985 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.735095 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.735201 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.735299 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:00Z","lastTransitionTime":"2026-02-02T10:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.838407 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.838800 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.838819 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.838844 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.838864 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:00Z","lastTransitionTime":"2026-02-02T10:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
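The NodeNotReady condition repeated throughout these entries comes from the runtime network check: no CNI configuration file exists yet in /etc/kubernetes/cni/net.d/ because ovnkube-node is still initializing (see the PodInitializing container statuses earlier in the log). A quick sketch of that readiness condition follows; it assumes the standard libcni config extensions (.conf, .conflist, .json), which is how CRI-O/ocicni scans the directory, though the exact match rules here are an assumption rather than quoted from this cluster's config.

    # cni_conf_check.py -- sketch of the condition kubelet keeps reporting:
    # "no CNI configuration file in /etc/kubernetes/cni/net.d/".
    from pathlib import Path

    def cni_ready(net_d: str = "/etc/kubernetes/cni/net.d/") -> bool:
        exts = {".conf", ".conflist", ".json"}  # assumed libcni extensions
        found = [p for p in sorted(Path(net_d).glob("*")) if p.suffix in exts]
        for p in found:
            print("found CNI config:", p)
        return bool(found)

    if __name__ == "__main__":
        # While ovnkube-node is still initializing, this prints nothing and
        # returns False, matching NetworkReady=false in the log above.
        print("NetworkReady would be:", cni_ready())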
Has your network provider started?"} Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.911008 4901 generic.go:334] "Generic (PLEG): container finished" podID="afbdc53a-f67a-44f7-a5bb-f446fb3706fb" containerID="fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944" exitCode=0 Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.911096 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" event={"ID":"afbdc53a-f67a-44f7-a5bb-f446fb3706fb","Type":"ContainerDied","Data":"fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944"} Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.936884 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb836b18e6cb07844e8dfb32c28798089fc4fbc5b7f89ac3c1d595c8dd6de6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa3b8a5db87f56250b50a5c703beff1e6b11e10f1e11271ea4c7363b262d52d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:00Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.942453 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.942502 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.942520 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.942546 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.942589 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:00Z","lastTransitionTime":"2026-02-02T10:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.957622 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:00Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.978329 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b226219c2a55a743db37b0dd98c110a54a0d2a1b2602aa122c7d9418198fb9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:00Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:00 crc kubenswrapper[4901]: I0202 10:39:00.998394 4901 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-f29d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"756c113d-5d5e-424e-bdf5-494b7774def6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8607482cdf03bfd2b59dcac8300e55292f3bb5b3661f11eddc30a5face19194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b772f8291c7ba36ce56d88dfbde16bf9344276419544da4c966fb2bbee6e04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f29d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:00Z is after 2025-08-24T17:21:41Z" Feb 02 
10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.012997 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e34f1db7-7f2a-4252-af0b-49fa172495f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eefab588023c2d4a4d15786ba41c1b66a121eeef18edda4bdcc02d162975b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:01Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.029834 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5q92h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29de6223de90bbfe98956d36ce20b9b57b319a5a825118f8eeabf13148e4c9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q2pl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5q92h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:01Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.045181 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:01Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.046351 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.046404 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.046422 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.046446 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.046461 4901 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:01Z","lastTransitionTime":"2026-02-02T10:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.056936 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:01Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.069318 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43352b63ae7151aacd58a3134e927aa7494db450772763086a8b783025a3e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:01Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.086433 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a1308eb-aead-41f7-b2b1-c92d80970304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0202 10:38:47.369602 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:38:47.370901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-430650264/tls.crt::/tmp/serving-cert-430650264/tls.key\\\\\\\"\\\\nI0202 10:38:53.040764 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:38:53.043210 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:38:53.043233 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:38:53.043255 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:38:53.043260 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:38:53.047991 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0202 10:38:53.048006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 10:38:53.048019 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:38:53.048038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:38:53.048042 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:38:53.048046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 10:38:53.050356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:01Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.101993 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edf90eee-a04d-48ad-957f-8710016e2388\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3416be5e186cc0595234f36c8a59e1c2edadb59a8abe4121b274ea7dea70815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac57745af3759e45361b7c59e94701e4b74dcbfa259eb2e8690578d5b2fa601\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6c9912bb4f435743eaadda76c0ef0fbc6492b740836407b274a89801ec5082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:01Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.114646 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mcs8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8281e736-98f0-4282-b362-b55fd3d2810f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cdfc1733f1758eb8435b80dd92fe579c318124c1a99c1611ef2d66300eca26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pwxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mcs8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:01Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.135348 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-flw48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:01Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.173584 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.173624 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.173636 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.173653 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.173665 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:01Z","lastTransitionTime":"2026-02-02T10:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.183026 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3390481-846a-4742-9eae-0796b667897f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955
686524799f8dd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vm8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:01Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.275992 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.276035 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.276047 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.276067 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.276079 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:01Z","lastTransitionTime":"2026-02-02T10:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.374273 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.374428 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:39:01 crc kubenswrapper[4901]: E0202 10:39:01.374455 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:39:09.374426367 +0000 UTC m=+36.392766453 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:39:01 crc kubenswrapper[4901]: E0202 10:39:01.374528 4901 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.374629 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:39:01 crc kubenswrapper[4901]: E0202 10:39:01.374637 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:39:09.374616491 +0000 UTC m=+36.392956587 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:39:01 crc kubenswrapper[4901]: E0202 10:39:01.374828 4901 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:39:01 crc kubenswrapper[4901]: E0202 10:39:01.374897 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:39:09.374887368 +0000 UTC m=+36.393227464 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.378729 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.378760 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.378770 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.378790 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.378800 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:01Z","lastTransitionTime":"2026-02-02T10:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.462668 4901 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.475406 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.475474 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:39:01 crc kubenswrapper[4901]: E0202 10:39:01.475666 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 10:39:01 crc kubenswrapper[4901]: E0202 10:39:01.475713 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:39:01 crc kubenswrapper[4901]: E0202 10:39:01.475726 4901 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:39:01 crc kubenswrapper[4901]: E0202 10:39:01.475735 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 10:39:01 crc kubenswrapper[4901]: E0202 10:39:01.475780 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-02 10:39:09.47576502 +0000 UTC m=+36.494105116 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:39:01 crc kubenswrapper[4901]: E0202 10:39:01.475783 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:39:01 crc kubenswrapper[4901]: E0202 10:39:01.475805 4901 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:39:01 crc kubenswrapper[4901]: E0202 10:39:01.475881 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-02 10:39:09.475857172 +0000 UTC m=+36.494197268 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.481776 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.481811 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.481823 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.481841 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.481853 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:01Z","lastTransitionTime":"2026-02-02T10:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.584275 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.584327 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.584338 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.584355 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.584366 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:01Z","lastTransitionTime":"2026-02-02T10:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.604922 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 00:38:33.525732811 +0000 UTC Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.676449 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.676613 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:39:01 crc kubenswrapper[4901]: E0202 10:39:01.676711 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.676637 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:39:01 crc kubenswrapper[4901]: E0202 10:39:01.677032 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:39:01 crc kubenswrapper[4901]: E0202 10:39:01.677219 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.690838 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.690896 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.690914 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.690940 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.690957 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:01Z","lastTransitionTime":"2026-02-02T10:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.794338 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.794433 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.794461 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.794498 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.794525 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:01Z","lastTransitionTime":"2026-02-02T10:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.898243 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.898289 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.898303 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.898324 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.898339 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:01Z","lastTransitionTime":"2026-02-02T10:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.921178 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" event={"ID":"a3390481-846a-4742-9eae-0796b667897f","Type":"ContainerStarted","Data":"23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a"} Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.924779 4901 generic.go:334] "Generic (PLEG): container finished" podID="afbdc53a-f67a-44f7-a5bb-f446fb3706fb" containerID="1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6" exitCode=0 Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.924856 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" event={"ID":"afbdc53a-f67a-44f7-a5bb-f446fb3706fb","Type":"ContainerDied","Data":"1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6"} Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.951630 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the 
pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:01Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.967018 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:01Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:01 crc kubenswrapper[4901]: I0202 10:39:01.986667 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43352b63ae7151aacd58a3134e927aa7494db450772763086a8b783025a3e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:01Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.001196 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.001266 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.001282 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.001307 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.001323 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:02Z","lastTransitionTime":"2026-02-02T10:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.008840 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b226219c2a55a743db37b0dd98c110a54a0d2a1b2602aa122c7d9418198fb9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:02Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.024893 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"756c113d-5d5e-424e-bdf5-494b7774def6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8607482cdf03bfd2b59dcac8300e55292f3bb5b3661f11eddc30a5face19194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b772f8291c7ba36ce56d88dfbde16bf9344276419544da4c966fb2bbee6e04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f29d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:02Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.037904 4901 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-5xj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e34f1db7-7f2a-4252-af0b-49fa172495f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eefab588023c2d4a4d15786ba41c1b66a121eeef18edda4bdcc02d162975b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:02Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.050022 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5q92h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29de6223de90bbfe98956d36ce20b9b57b319a5a825118f8eeabf13148e4c9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q2pl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5q92h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:02Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.065168 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edf90eee-a04d-48ad-957f-8710016e2388\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3416be5e186cc0595234f36c8a59e1c2edadb59a8abe4121b274ea7dea70815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac57745af3759e45361b7c59e94701e4b74dcbfa259eb2e8690578d5b2fa601\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6c9912bb4f435743eaadda76c0ef0fbc6492b740836407b274a89801ec5082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:02Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.079749 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mcs8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8281e736-98f0-4282-b362-b55fd3d2810f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cdfc1733f1758eb8435b80dd92fe579c318124c1a99c1611ef2d66300eca26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pwxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mcs8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:02Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.097011 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a1308eb-aead-41f7-b2b1-c92d80970304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0202 10:38:47.369602 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:38:47.370901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-430650264/tls.crt::/tmp/serving-cert-430650264/tls.key\\\\\\\"\\\\nI0202 10:38:53.040764 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:38:53.043210 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:38:53.043233 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:38:53.043255 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:38:53.043260 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:38:53.047991 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0202 10:38:53.048006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 10:38:53.048019 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:38:53.048038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:38:53.048042 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:38:53.048046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 10:38:53.050356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:02Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.104441 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.104483 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.104494 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.104515 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.104529 4901 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:02Z","lastTransitionTime":"2026-02-02T10:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.112719 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-flw48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:02Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.133031 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3390481-846a-4742-9eae-0796b667897f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vm8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:02Z 
is after 2025-08-24T17:21:41Z" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.149870 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:02Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.164307 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb836b18e6cb07844e8dfb32c28798089fc4fbc5b7f89ac3c1d595c8dd6de6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa3b8a5db87f56250b50a5c703beff1e6b11e10f1e11271ea4c7363b262d52d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:02Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.207555 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.207640 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.207650 4901 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.207669 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.207680 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:02Z","lastTransitionTime":"2026-02-02T10:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.311665 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.311716 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.311725 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.311743 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.311754 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:02Z","lastTransitionTime":"2026-02-02T10:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.414493 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.414591 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.414606 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.414630 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.414650 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:02Z","lastTransitionTime":"2026-02-02T10:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.518055 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.518149 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.518169 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.518187 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.518198 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:02Z","lastTransitionTime":"2026-02-02T10:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.605320 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 20:24:44.217730122 +0000 UTC Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.620913 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.620986 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.621005 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.621034 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.621053 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:02Z","lastTransitionTime":"2026-02-02T10:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.723587 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.723638 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.723649 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.723667 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.723680 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:02Z","lastTransitionTime":"2026-02-02T10:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.829115 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.829174 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.829185 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.829202 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.829215 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:02Z","lastTransitionTime":"2026-02-02T10:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.933717 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.933792 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.933815 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.933846 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.933871 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:02Z","lastTransitionTime":"2026-02-02T10:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.938022 4901 generic.go:334] "Generic (PLEG): container finished" podID="afbdc53a-f67a-44f7-a5bb-f446fb3706fb" containerID="45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9" exitCode=0 Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.938088 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" event={"ID":"afbdc53a-f67a-44f7-a5bb-f446fb3706fb","Type":"ContainerDied","Data":"45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9"} Feb 02 10:39:02 crc kubenswrapper[4901]: I0202 10:39:02.962827 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-flw48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:02Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.007374 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3390481-846a-4742-9eae-0796b667897f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vm8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:03Z 
is after 2025-08-24T17:21:41Z" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.026113 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.037797 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.037866 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.037881 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.037906 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.037926 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:03Z","lastTransitionTime":"2026-02-02T10:39:03Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.044530 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb836b18e6cb07844e8dfb32c28798089fc4fbc5b7f89ac3c1d595c8dd6de6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa3b8a5db87f56250b50a5c703beff1e6b11e10f1e11271ea4c7363b262d52d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.062453 
4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.079378 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.102987 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43352b63ae7151aacd58a3134e927aa7494db450772763086a8b783025a3e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.146048 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.146097 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.146107 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.146127 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.146141 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:03Z","lastTransitionTime":"2026-02-02T10:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.158630 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b226219c2a55a743db37b0dd98c110a54a0d2a1b2602aa122c7d9418198fb9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.175857 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"756c113d-5d5e-424e-bdf5-494b7774def6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8607482cdf03bfd2b59dcac8300e55292f3bb5b3661f11eddc30a5face19194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b772f8291c7ba36ce56d88dfbde16bf9344276419544da4c966fb2bbee6e04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f29d8\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.189136 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e34f1db7-7f2a-4252-af0b-49fa172495f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eefab588023c2d4a4d15786ba41c1b66a121eeef18edda4bdcc02d162975b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.203323 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5q92h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29de6223de90bbfe98956d36ce20b9b57b319a5a825118f8eeabf13148e4c9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q2pl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5q92h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.217357 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edf90eee-a04d-48ad-957f-8710016e2388\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3416be5e186cc0595234f36c8a59e1c2edadb59a8abe4121b274ea7dea70815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac57745af3759e45361b7c59e94701e4b74dcbfa259eb2e8690578d5b2fa601\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6c9912bb4f435743eaadda76c0ef0fbc6492b740836407b274a89801ec5082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.231418 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mcs8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8281e736-98f0-4282-b362-b55fd3d2810f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cdfc1733f1758eb8435b80dd92fe579c318124c1a99c1611ef2d66300eca26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pwxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mcs8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.243549 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a1308eb-aead-41f7-b2b1-c92d80970304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0202 10:38:47.369602 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:38:47.370901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-430650264/tls.crt::/tmp/serving-cert-430650264/tls.key\\\\\\\"\\\\nI0202 10:38:53.040764 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:38:53.043210 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:38:53.043233 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:38:53.043255 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:38:53.043260 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:38:53.047991 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0202 10:38:53.048006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 10:38:53.048019 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:38:53.048038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:38:53.048042 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:38:53.048046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 10:38:53.050356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.248580 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.248628 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.248640 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.248662 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.248680 4901 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:03Z","lastTransitionTime":"2026-02-02T10:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.351469 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.351551 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.351613 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.351653 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.351678 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:03Z","lastTransitionTime":"2026-02-02T10:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.455001 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.455299 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.455310 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.455332 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.455346 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:03Z","lastTransitionTime":"2026-02-02T10:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.559535 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.559628 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.559646 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.559674 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.559698 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:03Z","lastTransitionTime":"2026-02-02T10:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.606254 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 01:48:53.030085451 +0000 UTC Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.663149 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.663216 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.663232 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.663258 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.663275 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:03Z","lastTransitionTime":"2026-02-02T10:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.676664 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.676672 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:39:03 crc kubenswrapper[4901]: E0202 10:39:03.676860 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:39:03 crc kubenswrapper[4901]: E0202 10:39:03.677113 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.677841 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:39:03 crc kubenswrapper[4901]: E0202 10:39:03.677995 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.693844 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb836b18e6cb07844e8dfb32c28798089fc4fbc5b7f89ac3c1d595c8dd6de6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa3b8a5db87f56250b50a5c703beff1e6b11e10f1e11271ea4c7363b262d52d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.713250 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.730507 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43352b63ae7151aacd58a3134e927aa7494db450772763086a8b783025a3e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.750096 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b226219c2a55a743db37b0dd98c110a54a0d2a1b2602aa122c7d9418198fb9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.766484 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"756c113d-5d5e-424e-bdf5-494b7774def6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8607482cdf03bfd2b59dcac8300e55292f3bb5b3661f11eddc30a5face19194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b772f8291c7ba36ce56d88dfbde16bf9344276419544da4c966fb2bbee6e04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f29d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.766752 4901 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.766797 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.766809 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.766829 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.766848 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:03Z","lastTransitionTime":"2026-02-02T10:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.782585 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e34f1db7-7f2a-4252-af0b-49fa172495f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eefab588023c2d4a4d15786ba41c1b66a121eeef18edda4bdcc02d162975b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.804448 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5q92h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29de6223de90bbfe98956d36ce20b9b57b319a5a825118f8eeabf13148e4c9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q2pl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5q92h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.819224 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.838625 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.851979 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mcs8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8281e736-98f0-4282-b362-b55fd3d2810f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cdfc1733f1758eb8435b80dd92fe579c318124c1a99c1611ef2d66300eca26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pwxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mcs8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.867889 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a1308eb-aead-41f7-b2b1-c92d80970304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0202 10:38:47.369602 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:38:47.370901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-430650264/tls.crt::/tmp/serving-cert-430650264/tls.key\\\\\\\"\\\\nI0202 10:38:53.040764 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:38:53.043210 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:38:53.043233 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:38:53.043255 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:38:53.043260 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:38:53.047991 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0202 10:38:53.048006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 10:38:53.048019 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:38:53.048038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:38:53.048042 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:38:53.048046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 10:38:53.050356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.872025 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.872080 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.872091 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.872110 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.872125 4901 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:03Z","lastTransitionTime":"2026-02-02T10:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.887381 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edf90eee-a04d-48ad-957f-8710016e2388\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3416be5e186cc0595234f36c8a59e1c2edadb59a8abe4121b274ea7dea70815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac57745af3759e45361b7c59e94701e4b74dcbfa259eb2e8690578d5b2fa601\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6c9912bb4f435743eaadda76c0ef0fbc6492b740836407b274a89801ec5082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.903861 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-flw48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.924686 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3390481-846a-4742-9eae-0796b667897f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vm8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:03Z 
is after 2025-08-24T17:21:41Z" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.948359 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" event={"ID":"a3390481-846a-4742-9eae-0796b667897f","Type":"ContainerStarted","Data":"3f9e19e65c6e9ad94f61cc5ea201053315506c0ce2bea6653fb00f9fcbbdbc8d"} Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.948815 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.953204 4901 generic.go:334] "Generic (PLEG): container finished" podID="afbdc53a-f67a-44f7-a5bb-f446fb3706fb" containerID="259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b" exitCode=0 Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.953238 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" event={"ID":"afbdc53a-f67a-44f7-a5bb-f446fb3706fb","Type":"ContainerDied","Data":"259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b"} Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.965007 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.976436 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.976480 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.976489 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.976505 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.976515 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:03Z","lastTransitionTime":"2026-02-02T10:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.981074 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:39:03 crc kubenswrapper[4901]: I0202 10:39:03.986362 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.005743 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43352b63ae7151aacd58a3134e927aa7494db450772763086a8b783025a3e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.023448 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b226219c2a55a743db37b0dd98c110a54a0d2a1b2602aa122c7d9418198fb9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.036490 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"756c113d-5d5e-424e-bdf5-494b7774def6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8607482cdf03bfd2b59dcac8300e55292f3bb5b3661f11eddc30a5face19194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b772f8291c7ba36ce56d88dfbde16bf9344276419544da4c966fb2bbee6e04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f29d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.047542 4901 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-5xj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e34f1db7-7f2a-4252-af0b-49fa172495f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eefab588023c2d4a4d15786ba41c1b66a121eeef18edda4bdcc02d162975b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.064865 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5q92h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29de6223de90bbfe98956d36ce20b9b57b319a5a825118f8eeabf13148e4c9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q2pl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5q92h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.079432 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edf90eee-a04d-48ad-957f-8710016e2388\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3416be5e186cc0595234f36c8a59e1c2edadb59a8abe4121b274ea7dea70815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac57745af3759e45361b7c59e94701e4b74dcbfa259eb2e8690578d5b2fa601\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6c9912bb4f435743eaadda76c0ef0fbc6492b740836407b274a89801ec5082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.079989 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.080021 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.080034 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.080053 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.080067 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:04Z","lastTransitionTime":"2026-02-02T10:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.090151 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mcs8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8281e736-98f0-4282-b362-b55fd3d2810f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cdfc1733f1758eb8435b80dd92fe579c318124c1a99c1611ef2d66300eca26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pwxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mcs8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.104311 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a1308eb-aead-41f7-b2b1-c92d80970304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0202 10:38:47.369602 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:38:47.370901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-430650264/tls.crt::/tmp/serving-cert-430650264/tls.key\\\\\\\"\\\\nI0202 10:38:53.040764 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:38:53.043210 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:38:53.043233 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:38:53.043255 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:38:53.043260 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:38:53.047991 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0202 10:38:53.048006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 10:38:53.048019 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:38:53.048038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:38:53.048042 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:38:53.048046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 10:38:53.050356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.120837 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-flw48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.140876 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3390481-846a-4742-9eae-0796b667897f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9e19e65c6e9ad94f61cc5ea201053315506c0c
e2bea6653fb00f9fcbbdbc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vm8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.154812 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.170350 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb836b18e6cb07844e8dfb32c28798089fc4fbc5b7f89ac3c1d595c8dd6de6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa3b8a5db87f56250b50a5c703beff1e6b11e10f1e11271ea4c7363b262d52d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.183293 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.183323 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.183333 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.183350 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.183360 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:04Z","lastTransitionTime":"2026-02-02T10:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.184024 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a1308eb-aead-41f7-b2b1-c92d80970304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0202 10:38:47.369602 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:38:47.370901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-430650264/tls.crt::/tmp/serving-cert-430650264/tls.key\\\\\\\"\\\\nI0202 10:38:53.040764 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:38:53.043210 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:38:53.043233 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:38:53.043255 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:38:53.043260 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:38:53.047991 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0202 10:38:53.048006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 10:38:53.048019 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:38:53.048038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:38:53.048042 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:38:53.048046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 10:38:53.050356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.199728 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edf90eee-a04d-48ad-957f-8710016e2388\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3416be5e186cc0595234f36c8a59e1c2edadb59a8abe4121b274ea7dea70815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac57745af3759e45361b7c59e94701e4b74dcbfa259eb2e8690578d5b2fa601\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6c9912bb4f435743eaadda76c0ef0fbc6492b740836407b274a89801ec5082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.213105 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mcs8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8281e736-98f0-4282-b362-b55fd3d2810f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cdfc1733f1758eb8435b80dd92fe579c318124c1a99c1611ef2d66300eca26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pwxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mcs8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.230573 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-flw48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.254770 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3390481-846a-4742-9eae-0796b667897f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9e19e65c6e9ad94f61cc5ea201053315506c0c
e2bea6653fb00f9fcbbdbc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vm8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.268689 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb836b18e6cb07844e8dfb32c28798089fc4fbc5b7f89ac3c1d595c8dd6de6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa3b8a5db87f56250b50a5c703beff1e6b11e10f1e11271ea4c7363b262d52d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.282044 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.286243 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.286278 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.286288 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.286333 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.286349 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:04Z","lastTransitionTime":"2026-02-02T10:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.297122 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.314153 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.327809 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43352b63ae7151aacd58a3134e927aa7494db450772763086a8b783025a3e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.344343 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b226219c2a55a743db37b0dd98c110a54a0d2a1b2602aa122c7d9418198fb9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.356447 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"756c113d-5d5e-424e-bdf5-494b7774def6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8607482cdf03bfd2b59dcac8300e55292f3bb5b3661f11eddc30a5face19194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b772f8291c7ba36ce56d88dfbde16bf9344276419544da4c966fb2bbee6e04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f29d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.367531 4901 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-5xj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e34f1db7-7f2a-4252-af0b-49fa172495f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eefab588023c2d4a4d15786ba41c1b66a121eeef18edda4bdcc02d162975b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.378363 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5q92h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29de6223de90bbfe98956d36ce20b9b57b319a5a825118f8eeabf13148e4c9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q2pl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5q92h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.390270 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.390326 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.390335 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.390352 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.390362 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:04Z","lastTransitionTime":"2026-02-02T10:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.493518 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.493550 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.493558 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.493590 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.493598 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:04Z","lastTransitionTime":"2026-02-02T10:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.597507 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.598135 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.598150 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.598174 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.598190 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:04Z","lastTransitionTime":"2026-02-02T10:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.607428 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 20:46:56.716830268 +0000 UTC Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.701551 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.701652 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.701674 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.701711 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.701734 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:04Z","lastTransitionTime":"2026-02-02T10:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.805276 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.805341 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.805365 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.805397 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.805423 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:04Z","lastTransitionTime":"2026-02-02T10:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.821254 4901 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.908498 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.908542 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.908558 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.908644 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.908675 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:04Z","lastTransitionTime":"2026-02-02T10:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.960400 4901 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.961959 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" event={"ID":"afbdc53a-f67a-44f7-a5bb-f446fb3706fb","Type":"ContainerStarted","Data":"4a172847249ecfe8e0ac94127f58173e760dfe5210cc51a12ca593b2882b46e0"} Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.962038 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.983327 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a1308eb-aead-41f7-b2b1-c92d80970304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T
10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0202 10:38:47.369602 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:38:47.370901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-430650264/tls.crt::/tmp/serving-cert-430650264/tls.key\\\\\\\"\\\\nI0202 10:38:53.040764 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:38:53.043210 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:38:53.043233 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:38:53.043255 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:38:53.043260 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:38:53.047991 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0202 10:38:53.048006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 10:38:53.048019 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:38:53.048038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:38:53.048042 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' 
detected.\\\\nW0202 10:38:53.048046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 10:38:53.050356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:04Z is after 2025-08-24T17:21:41Z"
Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.989781 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5"
Feb 02 10:39:04 crc kubenswrapper[4901]: I0202 10:39:04.997198 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edf90eee-a04d-48ad-957f-8710016e2388\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3416be5e186cc0595234f36c8a59e1c2edadb59a8abe4121b274ea7dea70815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac57745af3759e45361b7c59e94701e4b74dcbfa259eb2e8690578d5b2fa601\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6c9912bb4f435743eaadda76c0ef0fbc6492b740836407b274a89801ec5082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.007790 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mcs8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8281e736-98f0-4282-b362-b55fd3d2810f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cdfc1733f1758eb8435b80dd92fe579c318124c1a99c1611ef2d66300eca26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pwxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mcs8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.011337 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.011363 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.011372 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.011385 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.011394 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:05Z","lastTransitionTime":"2026-02-02T10:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.022275 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a172847249ecfe8e0ac94127f58173e760dfe5210cc51a12ca593b2882b46e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-flw48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.041544 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3390481-846a-4742-9eae-0796b667897f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"im
ageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\
\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9e19e65c6e9ad94f61cc5ea201053315506c0ce2bea6653fb00f9fcbbdbc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vm8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.054645 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb836b18e6cb07844e8dfb32c28798089fc4fbc5b7f89ac3c1d595c8dd6de6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa3b8a5db87f56250b50a5c703beff1e6b11e10f1e11271ea4c7363b262d52d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.066201 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.077770 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xj56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e34f1db7-7f2a-4252-af0b-49fa172495f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eefab588023c2d4a4d15786ba41c1b66a121eeef18edda4bdcc02d162975b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.090695 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5q92h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29de6223de90bbfe98956d36ce20b9b57b319a5a825118f8eeabf13148e4c9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q2pl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5q92h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.102679 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.119877 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.119919 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.120056 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.120085 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.120098 4901 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:05Z","lastTransitionTime":"2026-02-02T10:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.121210 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.136321 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43352b63ae7151aacd58a3134e927aa7494db450772763086a8b783025a3e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.152332 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b226219c2a55a743db37b0dd98c110a54a0d2a1b2602aa122c7d9418198fb9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.168717 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"756c113d-5d5e-424e-bdf5-494b7774def6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8607482cdf03bfd2b59dcac8300e55292f3bb5b3661f11eddc30a5face19194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b772f8291c7ba36ce56d88dfbde16bf9344276419544da4c966fb2bbee6e04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f29d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.191340 4901 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a172847249ecfe8e0ac94127f58173e760dfe5210cc51a12ca593b2882b46e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-flw48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.217399 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3390481-846a-4742-9eae-0796b667897f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9e19e65c6e9ad94f61cc5ea201053315506c0ce2bea6653fb00f9fcbbdbc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vm8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.223399 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.223438 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.223449 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.223466 4901 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.223476 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:05Z","lastTransitionTime":"2026-02-02T10:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.238007 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb836b18e6cb07844e8dfb32c28798089fc4fbc5b7f89ac3c1d595c8dd6de6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa3b8a5db87f56250b50a5c703beff1e6b11e10f1e11271ea4c7363b262d52d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.255515 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.271851 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xj56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e34f1db7-7f2a-4252-af0b-49fa172495f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eefab588023c2d4a4d15786ba41c1b66a121eeef18edda4bdcc02d162975b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.285657 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5q92h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29de6223de90bbfe98956d36ce20b9b57b319a5a825118f8eeabf13148e4c9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q2pl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5q92h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.300799 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.314831 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.327608 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.327675 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.327686 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.327706 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.327716 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:05Z","lastTransitionTime":"2026-02-02T10:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.328077 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43352b63ae7151aacd58a3134e927aa7494db450772763086a8b783025a3e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.356653 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b226219c2a55a743db37b0dd98c110a54a0d2a1b2602aa122c7d9418198fb9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.369788 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"756c113d-5d5e-424e-bdf5-494b7774def6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8607482cdf03bfd2b59dcac8300e55292f3bb5b3661f11eddc30a5face19194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b772f8291c7ba36ce56d88dfbde16bf9344276419544da4c966fb2bbee6e04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f29d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.387115 4901 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a1308eb-aead-41f7-b2b1-c92d80970304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0202 10:38:47.369602 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:38:47.370901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-430650264/tls.crt::/tmp/serving-cert-430650264/tls.key\\\\\\\"\\\\nI0202 10:38:53.040764 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:38:53.043210 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:38:53.043233 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:38:53.043255 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:38:53.043260 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:38:53.047991 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0202 10:38:53.048006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 10:38:53.048019 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:38:53.048038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:38:53.048042 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:38:53.048046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 10:38:53.050356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.407709 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edf90eee-a04d-48ad-957f-8710016e2388\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3416be5e186cc0595234f36c8a59e1c2edadb59a8abe4121b274ea7dea70815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac57745af3759e45361b7c59e94701e4b74dcbfa259eb2e8690578d5b2fa601\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6c9912bb4f435743eaadda76c0ef0fbc6492b740836407b274a89801ec5082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.420350 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mcs8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8281e736-98f0-4282-b362-b55fd3d2810f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cdfc1733f1758eb8435b80dd92fe579c318124c1a99c1611ef2d66300eca26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pwxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mcs8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.430911 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.430980 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.430995 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.431025 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.431042 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:05Z","lastTransitionTime":"2026-02-02T10:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.534621 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.534677 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.534696 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.534719 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.534738 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:05Z","lastTransitionTime":"2026-02-02T10:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.608455 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 08:31:26.104117678 +0000 UTC Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.637923 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.637975 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.637989 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.638012 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.638029 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:05Z","lastTransitionTime":"2026-02-02T10:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.676460 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.676529 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.676615 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:39:05 crc kubenswrapper[4901]: E0202 10:39:05.676674 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:39:05 crc kubenswrapper[4901]: E0202 10:39:05.676761 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:39:05 crc kubenswrapper[4901]: E0202 10:39:05.676932 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.741364 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.741403 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.741414 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.741431 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.741442 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:05Z","lastTransitionTime":"2026-02-02T10:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.843812 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.843890 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.843909 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.843934 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.843952 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:05Z","lastTransitionTime":"2026-02-02T10:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.947548 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.947631 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.947645 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.947668 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.947679 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:05Z","lastTransitionTime":"2026-02-02T10:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:05 crc kubenswrapper[4901]: I0202 10:39:05.963447 4901 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.049976 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.050016 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.050025 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.050039 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.050049 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:06Z","lastTransitionTime":"2026-02-02T10:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.152745 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.152797 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.152809 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.152828 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.152843 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:06Z","lastTransitionTime":"2026-02-02T10:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.255351 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.255408 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.255421 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.255440 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.255456 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:06Z","lastTransitionTime":"2026-02-02T10:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.358548 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.358645 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.358657 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.358678 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.358691 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:06Z","lastTransitionTime":"2026-02-02T10:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.461382 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.461457 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.461476 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.461507 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.461528 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:06Z","lastTransitionTime":"2026-02-02T10:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.563890 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.563930 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.563939 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.563970 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.563981 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:06Z","lastTransitionTime":"2026-02-02T10:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.609456 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 08:59:23.857039168 +0000 UTC Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.667493 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.667862 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.668115 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.668381 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.668646 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:06Z","lastTransitionTime":"2026-02-02T10:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.773263 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.773301 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.773309 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.773323 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.773333 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:06Z","lastTransitionTime":"2026-02-02T10:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.875957 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.876032 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.876058 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.876148 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.876173 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:06Z","lastTransitionTime":"2026-02-02T10:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.971172 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vm8h5_a3390481-846a-4742-9eae-0796b667897f/ovnkube-controller/0.log" Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.978057 4901 generic.go:334] "Generic (PLEG): container finished" podID="a3390481-846a-4742-9eae-0796b667897f" containerID="3f9e19e65c6e9ad94f61cc5ea201053315506c0ce2bea6653fb00f9fcbbdbc8d" exitCode=1 Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.978148 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" event={"ID":"a3390481-846a-4742-9eae-0796b667897f","Type":"ContainerDied","Data":"3f9e19e65c6e9ad94f61cc5ea201053315506c0ce2bea6653fb00f9fcbbdbc8d"} Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.978658 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.978724 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.978747 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.978779 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.978804 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:06Z","lastTransitionTime":"2026-02-02T10:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.979893 4901 scope.go:117] "RemoveContainer" containerID="3f9e19e65c6e9ad94f61cc5ea201053315506c0ce2bea6653fb00f9fcbbdbc8d" Feb 02 10:39:06 crc kubenswrapper[4901]: I0202 10:39:06.999550 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a172847249ecfe8e0ac94127f58173e760dfe5210cc51a12ca593b2882b46e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acce
ss-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:01Z\\\",\\\"rea
son\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-flw48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-02T10:39:06Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.031842 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3390481-846a-4742-9eae-0796b667897f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9e19e65c6e9ad94f61cc5ea201053315506c0ce2bea6653fb00f9fcbbdbc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f9e19e65c6e9ad94f61cc5ea201053315506c0ce2bea6653fb00f9fcbbdbc8d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:39:06Z\\\",\\\"message\\\":\\\"or *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0202 10:39:06.616188 6175 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0202 10:39:06.616517 6175 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0202 10:39:06.616590 6175 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0202 10:39:06.616629 6175 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 10:39:06.616625 6175 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0202 10:39:06.616653 6175 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0202 10:39:06.616667 6175 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0202 10:39:06.616721 6175 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 10:39:06.616767 6175 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0202 10:39:06.616783 6175 factory.go:656] Stopping watch factory\\\\nI0202 10:39:06.616784 6175 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0202 10:39:06.616804 6175 ovnkube.go:599] Stopped ovnkube\\\\nI0202 
10:39:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0
d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vm8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:07Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.052640 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb836b18e6cb07844e8dfb32c28798089fc4fbc5b7f89ac3c1d595c8dd6de6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa3b8a5db87f56250b50a5c703beff1e6b11e10f1e11271ea4c7363b262d52d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:07Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.071185 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:07Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.081630 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.081679 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.081693 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.081711 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.081724 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:07Z","lastTransitionTime":"2026-02-02T10:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.089067 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:07Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.102026 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:07Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.115276 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43352b63ae7151aacd58a3134e927aa7494db450772763086a8b783025a3e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:07Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.132482 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b226219c2a55a743db37b0dd98c110a54a0d2a1b2602aa122c7d9418198fb9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:07Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.148576 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"756c113d-5d5e-424e-bdf5-494b7774def6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8607482cdf03bfd2b59dcac8300e55292f3bb5b3661f11eddc30a5face19194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b772f8291c7ba36ce56d88dfbde16bf9344276419544da4c966fb2bbee6e04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f29d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:07Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.160308 4901 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-5xj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e34f1db7-7f2a-4252-af0b-49fa172495f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eefab588023c2d4a4d15786ba41c1b66a121eeef18edda4bdcc02d162975b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:07Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.174255 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5q92h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29de6223de90bbfe98956d36ce20b9b57b319a5a825118f8eeabf13148e4c9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q2pl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5q92h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:07Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.184742 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.184791 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.184801 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.184820 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.184832 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:07Z","lastTransitionTime":"2026-02-02T10:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.189671 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a1308eb-aead-41f7-b2b1-c92d80970304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0202 10:38:47.369602 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:38:47.370901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-430650264/tls.crt::/tmp/serving-cert-430650264/tls.key\\\\\\\"\\\\nI0202 10:38:53.040764 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:38:53.043210 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:38:53.043233 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:38:53.043255 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:38:53.043260 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:38:53.047991 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0202 10:38:53.048006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 10:38:53.048019 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:38:53.048038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:38:53.048042 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:38:53.048046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 10:38:53.050356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:07Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.207105 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edf90eee-a04d-48ad-957f-8710016e2388\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3416be5e186cc0595234f36c8a59e1c2edadb59a8abe4121b274ea7dea70815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac57745af3759e45361b7c59e94701e4b74dcbfa259eb2e8690578d5b2fa601\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6c9912bb4f435743eaadda76c0ef0fbc6492b740836407b274a89801ec5082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:07Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.218532 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mcs8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8281e736-98f0-4282-b362-b55fd3d2810f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cdfc1733f1758eb8435b80dd92fe579c318124c1a99c1611ef2d66300eca26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pwxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mcs8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:07Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.252434 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.268859 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb836b18e6cb07844e8dfb32c28798089fc4fbc5b7f89ac3c1d595c8dd6de6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa3b8a5db87f56250b50a5c703beff1e6b11e10f1e11271ea4c7363b262d52d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:07Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.283409 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:07Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.289285 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.289338 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.289357 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.289384 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.289404 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:07Z","lastTransitionTime":"2026-02-02T10:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.298385 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"756c113d-5d5e-424e-bdf5-494b7774def6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8607482cdf03bfd2b59dcac8300e55292f3bb5b3661f11eddc30a5face19194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b772f8291c7ba36ce56d88dfbde16bf9344276419544da4c966fb2bbee6e04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f29d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:07Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.315031 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e34f1db7-7f2a-4252-af0b-49fa172495f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eefab588023c2d4a4d15786ba41c1b66a121eeef18edda4bdcc02d162975b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:07Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.335695 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5q92h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29de6223de90bbfe98956d36ce20b9b57b319a5a825118f8eeabf13148e4c9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q2pl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5q92h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:07Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.353749 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:07Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.373545 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:07Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.387875 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43352b63ae7151aacd58a3134e927aa7494db450772763086a8b783025a3e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:07Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.392751 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.392787 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.392803 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.392823 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.392835 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:07Z","lastTransitionTime":"2026-02-02T10:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.405288 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b226219c2a55a743db37b0dd98c110a54a0d2a1b2602aa122c7d9418198fb9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:07Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.419859 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a1308eb-aead-41f7-b2b1-c92d80970304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0202 10:38:47.369602 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:38:47.370901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-430650264/tls.crt::/tmp/serving-cert-430650264/tls.key\\\\\\\"\\\\nI0202 10:38:53.040764 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:38:53.043210 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:38:53.043233 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:38:53.043255 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:38:53.043260 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:38:53.047991 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0202 10:38:53.048006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 10:38:53.048019 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:38:53.048038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:38:53.048042 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:38:53.048046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 10:38:53.050356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:07Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.434484 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edf90eee-a04d-48ad-957f-8710016e2388\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3416be5e186cc0595234f36c8a59e1c2edadb59a8abe4121b274ea7dea70815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac57745af3759e45361b7c59e94701e4b74dcbfa259eb2e8690578d5b2fa601\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6c9912bb4f435743eaadda76c0ef0fbc6492b740836407b274a89801ec5082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:07Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.445468 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mcs8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8281e736-98f0-4282-b362-b55fd3d2810f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cdfc1733f1758eb8435b80dd92fe579c318124c1a99c1611ef2d66300eca26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pwxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mcs8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:07Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.458374 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a172847249ecfe8e0ac94127f58173e760dfe5210cc51a12ca593b2882b46e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-flw48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:07Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.476922 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3390481-846a-4742-9eae-0796b667897f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/
etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9e19e65c6e9ad94f61cc5ea201053315506c0ce2bea6653fb00f9fcbbdbc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f9e19e65c6e9ad94f61cc5ea201053315506c0ce2bea6653fb00f9fcbbdbc8d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:39:06Z\\\",\\\"message\\\":\\\"or *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0202 10:39:06.616188 6175 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0202 10:39:06.616517 6175 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0202 10:39:06.616590 6175 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0202 10:39:06.616629 6175 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 10:39:06.616625 6175 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0202 10:39:06.616653 6175 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0202 10:39:06.616667 6175 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0202 10:39:06.616721 6175 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 10:39:06.616767 6175 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0202 10:39:06.616783 6175 factory.go:656] Stopping watch factory\\\\nI0202 10:39:06.616784 6175 handler.go:208] Removed 
*v1.NetworkPolicy event handler 4\\\\nI0202 10:39:06.616804 6175 ovnkube.go:599] Stopped ovnkube\\\\nI0202 10:39:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e219
55686524799f8dd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vm8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:07Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.495480 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.495540 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.495555 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.495608 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.495627 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:07Z","lastTransitionTime":"2026-02-02T10:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.513826 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.598112 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.598408 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.598416 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.598430 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.598440 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:07Z","lastTransitionTime":"2026-02-02T10:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.610581 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 22:45:51.402124187 +0000 UTC Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.676117 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.676117 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.676277 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:39:07 crc kubenswrapper[4901]: E0202 10:39:07.676350 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:39:07 crc kubenswrapper[4901]: E0202 10:39:07.676431 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:39:07 crc kubenswrapper[4901]: E0202 10:39:07.676604 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.701302 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.701351 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.701363 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.701380 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.701391 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:07Z","lastTransitionTime":"2026-02-02T10:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.804380 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.804426 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.804434 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.804451 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.804460 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:07Z","lastTransitionTime":"2026-02-02T10:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.907279 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.907324 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.907337 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.907352 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.907362 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:07Z","lastTransitionTime":"2026-02-02T10:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.983355 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vm8h5_a3390481-846a-4742-9eae-0796b667897f/ovnkube-controller/0.log" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.985531 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" event={"ID":"a3390481-846a-4742-9eae-0796b667897f","Type":"ContainerStarted","Data":"d673fed3e933fa3cd039b3d8aff6adf81bd46d268ecac3d02e64c9bf7a8af687"} Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.986158 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:39:07 crc kubenswrapper[4901]: I0202 10:39:07.998587 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5q92h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29de6223de90bbfe98956d36ce20b9b57b319a5a825118f8eeabf13148e4c9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q2pl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5q92h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:07Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.010239 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.010292 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.010303 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.010322 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.010334 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:08Z","lastTransitionTime":"2026-02-02T10:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.012271 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:08Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.025262 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:08Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.042652 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43352b63ae7151aacd58a3134e927aa7494db450772763086a8b783025a3e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:08Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.060906 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b226219c2a55a743db37b0dd98c110a54a0d2a1b2602aa122c7d9418198fb9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:08Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.072526 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"756c113d-5d5e-424e-bdf5-494b7774def6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8607482cdf03bfd2b59dcac8300e55292f3bb5b3661f11eddc30a5face19194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b772f8291c7ba36ce56d88dfbde16bf9344276419544da4c966fb2bbee6e04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f29d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:08Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.083667 4901 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-5xj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e34f1db7-7f2a-4252-af0b-49fa172495f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eefab588023c2d4a4d15786ba41c1b66a121eeef18edda4bdcc02d162975b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:08Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.095485 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a1308eb-aead-41f7-b2b1-c92d80970304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0202 10:38:47.369602 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:38:47.370901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-430650264/tls.crt::/tmp/serving-cert-430650264/tls.key\\\\\\\"\\\\nI0202 10:38:53.040764 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:38:53.043210 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:38:53.043233 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:38:53.043255 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:38:53.043260 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:38:53.047991 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0202 10:38:53.048006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 10:38:53.048019 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:38:53.048038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:38:53.048042 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:38:53.048046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 10:38:53.050356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:08Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.108774 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edf90eee-a04d-48ad-957f-8710016e2388\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3416be5e186cc0595234f36c8a59e1c2edadb59a8abe4121b274ea7dea70815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac57745af3759e45361b7c59e94701e4b74dcbfa259eb2e8690578d5b2fa601\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6c9912bb4f435743eaadda76c0ef0fbc6492b740836407b274a89801ec5082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:08Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.112724 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.112774 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.112784 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.112802 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.112818 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:08Z","lastTransitionTime":"2026-02-02T10:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.121777 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mcs8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8281e736-98f0-4282-b362-b55fd3d2810f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cdfc1733f1758eb8435b80dd92fe579c318124c1a99c1611ef2d66300eca26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pwxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mcs8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:08Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.138979 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a172847249ecfe8e0ac94127f58173e760dfe5210cc51a12ca593b2882b46e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-flw48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:08Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.159823 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3390481-846a-4742-9eae-0796b667897f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d673fed3e933fa3cd039b3d8aff6adf81bd46d268ecac3d02e64c9bf7a8af687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f9e19e65c6e9ad94f61cc5ea201053315506c0ce2bea6653fb00f9fcbbdbc8d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:39:06Z\\\",\\\"message\\\":\\\"or *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0202 10:39:06.616188 6175 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0202 10:39:06.616517 6175 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0202 10:39:06.616590 6175 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0202 10:39:06.616629 6175 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 10:39:06.616625 6175 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0202 10:39:06.616653 6175 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0202 10:39:06.616667 6175 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0202 10:39:06.616721 6175 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 10:39:06.616767 6175 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0202 10:39:06.616783 6175 factory.go:656] Stopping watch factory\\\\nI0202 10:39:06.616784 6175 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0202 10:39:06.616804 6175 ovnkube.go:599] Stopped ovnkube\\\\nI0202 
10:39:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\
\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vm8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:08Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.173580 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb836b18e6cb07844e8dfb32c28798089fc4fbc5b7f89ac3c1d595c8dd6de6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa3b8a5db87f56250b50a5c703beff1e6b11e10f1e11271ea4c7363b262d52d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:08Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.187234 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:08Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.215341 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.215382 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.215392 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.215410 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.215420 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:08Z","lastTransitionTime":"2026-02-02T10:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.318287 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.318329 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.318339 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.318356 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.318366 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:08Z","lastTransitionTime":"2026-02-02T10:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.421988 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.422048 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.422066 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.422093 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.422112 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:08Z","lastTransitionTime":"2026-02-02T10:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.524767 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.524801 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.524810 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.524823 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.524832 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:08Z","lastTransitionTime":"2026-02-02T10:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.610954 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 04:00:56.576837055 +0000 UTC Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.627042 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.627111 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.627128 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.627157 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.627177 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:08Z","lastTransitionTime":"2026-02-02T10:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.730546 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.730674 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.730702 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.730742 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.730764 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:08Z","lastTransitionTime":"2026-02-02T10:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.833500 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.833610 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.833633 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.833661 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.833679 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:08Z","lastTransitionTime":"2026-02-02T10:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.937102 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.937155 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.937165 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.937186 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.937203 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:08Z","lastTransitionTime":"2026-02-02T10:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.992248 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vm8h5_a3390481-846a-4742-9eae-0796b667897f/ovnkube-controller/1.log" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.992987 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vm8h5_a3390481-846a-4742-9eae-0796b667897f/ovnkube-controller/0.log" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.997651 4901 generic.go:334] "Generic (PLEG): container finished" podID="a3390481-846a-4742-9eae-0796b667897f" containerID="d673fed3e933fa3cd039b3d8aff6adf81bd46d268ecac3d02e64c9bf7a8af687" exitCode=1 Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.997711 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" event={"ID":"a3390481-846a-4742-9eae-0796b667897f","Type":"ContainerDied","Data":"d673fed3e933fa3cd039b3d8aff6adf81bd46d268ecac3d02e64c9bf7a8af687"} Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.997809 4901 scope.go:117] "RemoveContainer" containerID="3f9e19e65c6e9ad94f61cc5ea201053315506c0ce2bea6653fb00f9fcbbdbc8d" Feb 02 10:39:08 crc kubenswrapper[4901]: I0202 10:39:08.998886 4901 scope.go:117] "RemoveContainer" containerID="d673fed3e933fa3cd039b3d8aff6adf81bd46d268ecac3d02e64c9bf7a8af687" Feb 02 10:39:08 crc kubenswrapper[4901]: E0202 10:39:08.999222 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-vm8h5_openshift-ovn-kubernetes(a3390481-846a-4742-9eae-0796b667897f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" podUID="a3390481-846a-4742-9eae-0796b667897f" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.019684 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb836b18e6cb07844e8dfb32c28798089fc4fbc5b7f89ac3c1d595c8dd6de6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa3b8a5db87f56250b50a5c703beff1e6b11e10f1e11271ea4c7363b262d52d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.039691 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.039754 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.039763 4901 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.039779 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.039788 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:09Z","lastTransitionTime":"2026-02-02T10:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.041185 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.055374 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8l22"] Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.056704 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8l22" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.058457 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.059894 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.063345 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43352b63ae7151aacd58a3134e927aa7494db450772763086a8b783025a3e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.064095 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-s8l22\" (UID: \"d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8l22" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.064180 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e-ovn-control-plane-metrics-cert\") pod 
\"ovnkube-control-plane-749d76644c-s8l22\" (UID: \"d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8l22" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.064310 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhwqj\" (UniqueName: \"kubernetes.io/projected/d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e-kube-api-access-jhwqj\") pod \"ovnkube-control-plane-749d76644c-s8l22\" (UID: \"d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8l22" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.065130 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-s8l22\" (UID: \"d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8l22" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.077985 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b226219c2a55a743db37b0dd98c110a54a0d2a1b2602aa122c7d9418198fb9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.094651 4901 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-f29d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"756c113d-5d5e-424e-bdf5-494b7774def6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8607482cdf03bfd2b59dcac8300e55292f3bb5b3661f11eddc30a5face19194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b772f8291c7ba36ce56d88dfbde16bf9344276419544da4c966fb2bbee6e04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f29d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:09Z is after 2025-08-24T17:21:41Z" Feb 02 
10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.109818 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e34f1db7-7f2a-4252-af0b-49fa172495f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eefab588023c2d4a4d15786ba41c1b66a121eeef18edda4bdcc02d162975b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.122762 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5q92h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29de6223de90bbfe98956d36ce20b9b57b319a5a825118f8eeabf13148e4c9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q2pl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5q92h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.142818 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.142910 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.142923 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.142943 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.142958 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:09Z","lastTransitionTime":"2026-02-02T10:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.145397 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.165765 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-s8l22\" (UID: \"d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8l22" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.165846 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-s8l22\" (UID: \"d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8l22" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.165878 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhwqj\" (UniqueName: \"kubernetes.io/projected/d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e-kube-api-access-jhwqj\") pod \"ovnkube-control-plane-749d76644c-s8l22\" (UID: \"d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8l22" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.165937 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-s8l22\" (UID: \"d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8l22" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.165735 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.166704 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-s8l22\" (UID: \"d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8l22" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.170198 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-s8l22\" (UID: \"d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8l22" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.175302 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-s8l22\" (UID: \"d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8l22" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.182177 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mcs8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8281e736-98f0-4282-b362-b55fd3d2810f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cdfc1733f1758eb8435b80dd92fe579c318124c1a99c1611ef2d66300eca26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pwxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mcs8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.189984 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhwqj\" (UniqueName: \"kubernetes.io/projected/d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e-kube-api-access-jhwqj\") pod \"ovnkube-control-plane-749d76644c-s8l22\" (UID: \"d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8l22" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.201379 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a1308eb-aead-41f7-b2b1-c92d80970304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0202 10:38:47.369602 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:38:47.370901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-430650264/tls.crt::/tmp/serving-cert-430650264/tls.key\\\\\\\"\\\\nI0202 10:38:53.040764 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:38:53.043210 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:38:53.043233 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:38:53.043255 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:38:53.043260 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:38:53.047991 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0202 10:38:53.048006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 10:38:53.048019 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:38:53.048038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:38:53.048042 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:38:53.048046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 10:38:53.050356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.215000 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edf90eee-a04d-48ad-957f-8710016e2388\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3416be5e186cc0595234f36c8a59e1c2edadb59a8abe4121b274ea7dea70815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac57745af3759e45361b7c59e94701e4b74dcbfa259eb2e8690578d5b2fa601\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6c9912bb4f435743eaadda76c0ef0fbc6492b740836407b274a89801ec5082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.231972 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a172847249ecfe8e0ac94127f58173e760dfe5210cc51a12ca593b2882b46e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7
434b12d3c5f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-b
inary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termin
ated\\\":{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-flw48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.246371 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.246454 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.246474 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.246518 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.246544 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:09Z","lastTransitionTime":"2026-02-02T10:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.271905 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3390481-846a-4742-9eae-0796b667897f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d673fed3e933fa3cd039b3d8aff6adf81bd46d268ecac3d02e64c9bf7a8af687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f9e19e65c6e9ad94f61cc5ea201053315506c0ce2bea6653fb00f9fcbbdbc8d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:39:06Z\\\",\\\"message\\\":\\\"or *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0202 10:39:06.616188 6175 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0202 10:39:06.616517 6175 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0202 10:39:06.616590 6175 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0202 10:39:06.616629 6175 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 10:39:06.616625 6175 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0202 10:39:06.616653 6175 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0202 10:39:06.616667 6175 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0202 10:39:06.616721 6175 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 10:39:06.616767 6175 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0202 10:39:06.616783 6175 factory.go:656] Stopping watch factory\\\\nI0202 10:39:06.616784 6175 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0202 10:39:06.616804 6175 ovnkube.go:599] Stopped ovnkube\\\\nI0202 
10:39:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d673fed3e933fa3cd039b3d8aff6adf81bd46d268ecac3d02e64c9bf7a8af687\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:39:08Z\\\",\\\"message\\\":\\\"flector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 10:39:08.234141 6320 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0202 10:39:08.234210 6320 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0202 10:39:08.234242 6320 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 10:39:08.234245 6320 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0202 10:39:08.234261 6320 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 10:39:08.234265 6320 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0202 10:39:08.234277 6320 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0202 10:39:08.234287 6320 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0202 10:39:08.234295 6320 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0202 10:39:08.234298 6320 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0202 10:39:08.234309 6320 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0202 10:39:08.234312 6320 handler.go:208] Removed *v1.Node event handler 2\\\\nI0202 10:39:08.234325 6320 handler.go:208] Removed *v1.Node event handler 7\\\\nI0202 10:39:08.234353 6320 factory.go:656] Stopping watch factory\\\\nI0202 10:39:08.234357 6320 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vm8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.291266 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3390481-846a-4742-9eae-0796b667897f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d673fed3e933fa3cd039b3d8aff6adf81bd46d26
8ecac3d02e64c9bf7a8af687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f9e19e65c6e9ad94f61cc5ea201053315506c0ce2bea6653fb00f9fcbbdbc8d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:39:06Z\\\",\\\"message\\\":\\\"or *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0202 10:39:06.616188 6175 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0202 10:39:06.616517 6175 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0202 10:39:06.616590 6175 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0202 10:39:06.616629 6175 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 10:39:06.616625 6175 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0202 10:39:06.616653 6175 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0202 10:39:06.616667 6175 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0202 10:39:06.616721 6175 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 10:39:06.616767 6175 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0202 10:39:06.616783 6175 factory.go:656] Stopping watch factory\\\\nI0202 10:39:06.616784 6175 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0202 10:39:06.616804 6175 ovnkube.go:599] Stopped ovnkube\\\\nI0202 10:39:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d673fed3e933fa3cd039b3d8aff6adf81bd46d268ecac3d02e64c9bf7a8af687\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:39:08Z\\\",\\\"message\\\":\\\"flector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 10:39:08.234141 6320 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0202 10:39:08.234210 6320 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0202 10:39:08.234242 6320 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 10:39:08.234245 6320 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0202 10:39:08.234261 6320 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 10:39:08.234265 6320 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0202 10:39:08.234277 6320 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0202 10:39:08.234287 6320 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0202 10:39:08.234295 6320 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0202 10:39:08.234298 6320 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0202 10:39:08.234309 6320 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0202 10:39:08.234312 6320 handler.go:208] Removed *v1.Node event handler 2\\\\nI0202 10:39:08.234325 6320 handler.go:208] Removed *v1.Node 
event handler 7\\\\nI0202 10:39:08.234353 6320 factory.go:656] Stopping watch factory\\\\nI0202 10:39:08.234357 6320 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1
d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vm8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.306968 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a172847249ecfe8e0ac94127f58173e760dfe5210cc51a12ca593b2882b46e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-flw48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.319552 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb836b18e6cb07844e8dfb32c28798089fc4fbc5b7f89ac3c1d595c8dd6de6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa3b8a5db87f56250b50a5c703beff1e6b11e10f1e11271ea4c7363b262d52d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.332926 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.348855 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.349237 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.349309 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.349333 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.349369 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.349393 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:09Z","lastTransitionTime":"2026-02-02T10:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.365530 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43352b63ae7151aacd58a3134e927aa7494db450772763086a8b783025a3e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.373285 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8l22" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.382144 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b226219c2a55a743db37b0dd98c110a54a0d2a1b2602aa122c7d9418198fb9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.396399 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"756c113d-5d5e-424e-bdf5-494b7774def6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8607482cdf03bfd2b59dcac8300e55292f3bb5b3661f11eddc30a5face19194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b772f8291c7ba36ce56d88dfbde16bf9344276419544da4c966fb2bbee6e04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f29d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.409317 4901 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-5xj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e34f1db7-7f2a-4252-af0b-49fa172495f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eefab588023c2d4a4d15786ba41c1b66a121eeef18edda4bdcc02d162975b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.426807 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5q92h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29de6223de90bbfe98956d36ce20b9b57b319a5a825118f8eeabf13148e4c9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q2pl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5q92h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.442064 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.451403 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.451454 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.451466 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.451489 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.451505 4901 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:09Z","lastTransitionTime":"2026-02-02T10:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.455694 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8l22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhwqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhwqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:39:09Z
\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s8l22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.468733 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mcs8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8281e736-98f0-4282-b362-b55fd3d2810f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cdfc1733f1758eb8435b80dd92fe579c318124c1a99c1611ef2d66300eca26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pwxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mcs8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.468877 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:39:09 crc 
kubenswrapper[4901]: I0202 10:39:09.468953 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.468985 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:39:09 crc kubenswrapper[4901]: E0202 10:39:09.469047 4901 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:39:09 crc kubenswrapper[4901]: E0202 10:39:09.469080 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:39:25.469061767 +0000 UTC m=+52.487401863 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:39:09 crc kubenswrapper[4901]: E0202 10:39:09.469104 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:39:25.469094808 +0000 UTC m=+52.487434904 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:39:09 crc kubenswrapper[4901]: E0202 10:39:09.469195 4901 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:39:09 crc kubenswrapper[4901]: E0202 10:39:09.469226 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:39:25.469219591 +0000 UTC m=+52.487559687 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.483366 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a1308eb-aead-41f7-b2b1-c92d80970304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ku
be-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0202 10:38:47.369602 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:38:47.370901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-430650264/tls.crt::/tmp/serving-cert-430650264/tls.key\\\\\\\"\\\\nI0202 10:38:53.040764 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:38:53.043210 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:38:53.043233 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:38:53.043255 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:38:53.043260 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:38:53.047991 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0202 10:38:53.048006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 10:38:53.048019 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:38:53.048038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:38:53.048042 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:38:53.048046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 10:38:53.050356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.495782 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edf90eee-a04d-48ad-957f-8710016e2388\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3416be5e186cc0595234f36c8a59e1c2edadb59a8abe4121b274ea7dea70815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac57745af3759e45361b7c59e94701e4b74dcbfa259eb2e8690578d5b2fa601\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6c9912bb4f435743eaadda76c0ef0fbc6492b740836407b274a89801ec5082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.555431 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.555505 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.555515 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.555531 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.555606 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:09Z","lastTransitionTime":"2026-02-02T10:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.569511 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.569594 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:39:09 crc kubenswrapper[4901]: E0202 10:39:09.569715 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 10:39:09 crc kubenswrapper[4901]: E0202 10:39:09.569757 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:39:09 crc kubenswrapper[4901]: E0202 10:39:09.569769 4901 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:39:09 crc kubenswrapper[4901]: E0202 10:39:09.569819 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-02 10:39:25.569802966 +0000 UTC m=+52.588143062 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:39:09 crc kubenswrapper[4901]: E0202 10:39:09.569724 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 10:39:09 crc kubenswrapper[4901]: E0202 10:39:09.569869 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:39:09 crc kubenswrapper[4901]: E0202 10:39:09.569883 4901 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:39:09 crc kubenswrapper[4901]: E0202 10:39:09.569939 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-02 10:39:25.569921409 +0000 UTC m=+52.588261575 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.611495 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 14:43:19.119038992 +0000 UTC Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.658555 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.658635 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.658648 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.658670 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.658686 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:09Z","lastTransitionTime":"2026-02-02T10:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.676021 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.676068 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.676043 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:39:09 crc kubenswrapper[4901]: E0202 10:39:09.676181 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:39:09 crc kubenswrapper[4901]: E0202 10:39:09.676310 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:39:09 crc kubenswrapper[4901]: E0202 10:39:09.676372 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.762128 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.762374 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.762388 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.762406 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.762419 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:09Z","lastTransitionTime":"2026-02-02T10:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.816875 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.816929 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.816938 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.816958 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.817001 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:09Z","lastTransitionTime":"2026-02-02T10:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:09 crc kubenswrapper[4901]: E0202 10:39:09.829547 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f285d71e-6d99-440a-8549-56e6a3710e3e\\\",\\\"systemUUID\\\":\\\"9f2b45ef-ead6-4cce-86c8-26b6d26ee095\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.833324 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.833374 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.833386 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.833405 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.833419 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:09Z","lastTransitionTime":"2026-02-02T10:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:09 crc kubenswrapper[4901]: E0202 10:39:09.853745 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f285d71e-6d99-440a-8549-56e6a3710e3e\\\",\\\"systemUUID\\\":\\\"9f2b45ef-ead6-4cce-86c8-26b6d26ee095\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.857514 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.857592 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.857605 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.857621 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.857633 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:09Z","lastTransitionTime":"2026-02-02T10:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:09 crc kubenswrapper[4901]: E0202 10:39:09.877891 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f285d71e-6d99-440a-8549-56e6a3710e3e\\\",\\\"systemUUID\\\":\\\"9f2b45ef-ead6-4cce-86c8-26b6d26ee095\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.882746 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.882809 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.882833 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.882868 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.882894 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:09Z","lastTransitionTime":"2026-02-02T10:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:09 crc kubenswrapper[4901]: E0202 10:39:09.914909 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f285d71e-6d99-440a-8549-56e6a3710e3e\\\",\\\"systemUUID\\\":\\\"9f2b45ef-ead6-4cce-86c8-26b6d26ee095\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.920254 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.920314 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.920325 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.920341 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.920691 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:09Z","lastTransitionTime":"2026-02-02T10:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:09 crc kubenswrapper[4901]: E0202 10:39:09.934409 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f285d71e-6d99-440a-8549-56e6a3710e3e\\\",\\\"systemUUID\\\":\\\"9f2b45ef-ead6-4cce-86c8-26b6d26ee095\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:09 crc kubenswrapper[4901]: E0202 10:39:09.934532 4901 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.936461 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.936516 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.936529 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.936548 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:09 crc kubenswrapper[4901]: I0202 10:39:09.936581 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:09Z","lastTransitionTime":"2026-02-02T10:39:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.003595 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vm8h5_a3390481-846a-4742-9eae-0796b667897f/ovnkube-controller/1.log" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.007396 4901 scope.go:117] "RemoveContainer" containerID="d673fed3e933fa3cd039b3d8aff6adf81bd46d268ecac3d02e64c9bf7a8af687" Feb 02 10:39:10 crc kubenswrapper[4901]: E0202 10:39:10.007553 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-vm8h5_openshift-ovn-kubernetes(a3390481-846a-4742-9eae-0796b667897f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" podUID="a3390481-846a-4742-9eae-0796b667897f" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.009595 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8l22" event={"ID":"d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e","Type":"ContainerStarted","Data":"96ba40e4419775afd21720866484ec29442dc564c1901d4ebc697b23c283c0d7"} Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.009628 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8l22" event={"ID":"d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e","Type":"ContainerStarted","Data":"3dac970814cc06ebc0edf155553dce7d9579fd22f74f5174407d29850a07a4c4"} Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.009638 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8l22" event={"ID":"d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e","Type":"ContainerStarted","Data":"ba1838e0998225a3210bcf1da5772b07df7ac14ad2caeae7213e589c3a28f291"} Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.025077 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.038773 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.038819 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.038829 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.038847 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.038859 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:10Z","lastTransitionTime":"2026-02-02T10:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.041424 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43352b63ae7151aacd58a3134e927aa7494db450772763086a8b783025a3e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.056709 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b226219c2a55a743db37b0dd98c110a54a0d2a1b2602aa122c7d9418198fb9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.070773 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"756c113d-5d5e-424e-bdf5-494b7774def6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8607482cdf03bfd2b59dcac8300e55292f3bb5b3661f11eddc30a5face19194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b772f8291c7ba36ce56d88dfbde16bf9344276419544da4c966fb2bbee6e04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f29d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.087669 4901 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-5xj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e34f1db7-7f2a-4252-af0b-49fa172495f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eefab588023c2d4a4d15786ba41c1b66a121eeef18edda4bdcc02d162975b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.104387 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5q92h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29de6223de90bbfe98956d36ce20b9b57b319a5a825118f8eeabf13148e4c9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q2pl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5q92h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.118794 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.132043 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8l22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhwqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhwqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:39:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s8l22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.141080 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:10 crc 
kubenswrapper[4901]: I0202 10:39:10.141138 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.141150 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.141169 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.141183 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:10Z","lastTransitionTime":"2026-02-02T10:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.143831 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mcs8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8281e736-98f0-4282-b362-b55fd3d2810f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cdfc1733f1758eb8435b80dd92fe579c318124c1a99c1611ef2d66300eca26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pwxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mcs8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.158846 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a1308eb-aead-41f7-b2b1-c92d80970304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0202 10:38:47.369602 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:38:47.370901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-430650264/tls.crt::/tmp/serving-cert-430650264/tls.key\\\\\\\"\\\\nI0202 10:38:53.040764 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:38:53.043210 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:38:53.043233 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:38:53.043255 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:38:53.043260 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:38:53.047991 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0202 10:38:53.048006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 10:38:53.048019 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:38:53.048038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:38:53.048042 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:38:53.048046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 10:38:53.050356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.176036 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edf90eee-a04d-48ad-957f-8710016e2388\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3416be5e186cc0595234f36c8a59e1c2edadb59a8abe4121b274ea7dea70815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac57745af3759e45361b7c59e94701e4b74dcbfa259eb2e8690578d5b2fa601\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6c9912bb4f435743eaadda76c0ef0fbc6492b740836407b274a89801ec5082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.176213 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-fmjwg"] Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.177032 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:39:10 crc kubenswrapper[4901]: E0202 10:39:10.177135 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.201390 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3390481-846a-4742-9eae-0796b667897f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257
453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d673fed3e933fa3cd039b3d8aff6adf81bd46d268ecac3d02e64c9bf7a8af687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d673fed3e933fa3cd039b3d8aff6adf81bd46d268ecac3d02e64c9bf7a8af687\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:39:08Z\\\",\\\"message\\\":\\\"flector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 10:39:08.234141 6320 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0202 10:39:08.234210 6320 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0202 10:39:08.234242 6320 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 10:39:08.234245 6320 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0202 10:39:08.234261 6320 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 10:39:08.234265 6320 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0202 10:39:08.234277 6320 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0202 10:39:08.234287 6320 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0202 10:39:08.234295 6320 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0202 10:39:08.234298 6320 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0202 10:39:08.234309 6320 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0202 10:39:08.234312 6320 handler.go:208] Removed *v1.Node event handler 2\\\\nI0202 10:39:08.234325 6320 handler.go:208] Removed *v1.Node event handler 7\\\\nI0202 10:39:08.234353 6320 factory.go:656] Stopping watch factory\\\\nI0202 10:39:08.234357 6320 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=ovnkube-controller pod=ovnkube-node-vm8h5_openshift-ovn-kubernetes(a3390481-846a-4742-9eae-0796b667897f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vm8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.218349 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a172847249ecfe8e0ac94127f58173e760dfe5210cc51a12ca593b2882b46e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"
ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:00Z\\\",\\\"reason\\\":\\
\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-flw48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.232803 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb836b18e6cb07844e8dfb32c28798089fc4fbc5b7f89ac3c1d595c8dd6de6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa3b8a5db87f56250b50a5c703beff1e6b11e10f1e11271ea4c7363b262d52d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b1
54edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.244969 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.245022 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.245037 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.245061 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.245078 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:10Z","lastTransitionTime":"2026-02-02T10:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.248843 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.266794 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a172847249ecfe8e0ac94127f58173e760dfe5210cc51a12ca593b2882b46e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-flw48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.275832 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtz59\" (UniqueName: \"kubernetes.io/projected/b96d903e-a64c-4321-8963-482d4b579e30-kube-api-access-qtz59\") pod \"network-metrics-daemon-fmjwg\" (UID: \"b96d903e-a64c-4321-8963-482d4b579e30\") " 
pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.275934 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b96d903e-a64c-4321-8963-482d4b579e30-metrics-certs\") pod \"network-metrics-daemon-fmjwg\" (UID: \"b96d903e-a64c-4321-8963-482d4b579e30\") " pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.298617 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3390481-846a-4742-9eae-0796b667897f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d673fed3e933fa3cd039b3d8aff6adf81bd46d268ecac3d02e64c9bf7a8af687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d673fed3e933fa3cd039b3d8aff6adf81bd46d268ecac3d02e64c9bf7a8af687\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:39:08Z\\\",\\\"message\\\":\\\"flector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 10:39:08.234141 6320 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0202 10:39:08.234210 6320 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0202 10:39:08.234242 6320 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 10:39:08.234245 6320 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0202 10:39:08.234261 6320 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 10:39:08.234265 6320 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0202 10:39:08.234277 6320 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0202 10:39:08.234287 6320 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0202 10:39:08.234295 6320 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0202 10:39:08.234298 6320 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0202 10:39:08.234309 6320 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0202 10:39:08.234312 6320 handler.go:208] Removed *v1.Node event handler 2\\\\nI0202 10:39:08.234325 6320 handler.go:208] Removed *v1.Node event handler 7\\\\nI0202 10:39:08.234353 6320 factory.go:656] Stopping watch 
factory\\\\nI0202 10:39:08.234357 6320 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-vm8h5_openshift-ovn-kubernetes(a3390481-846a-4742-9eae-0796b667897f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vm8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.320450 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb836b18e6cb07844e8dfb32c28798089fc4fbc5b7f89ac3c1d595c8dd6de6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa3b8a5db87f56250b50a5c703beff1e6b11e10f1e11271ea4c7363b262d52d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.334278 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.348370 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.348416 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.348428 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.348447 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.348461 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:10Z","lastTransitionTime":"2026-02-02T10:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.350423 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b226219c2a55a743db37b0dd98c110a54a0d2a1b2602aa122c7d9418198fb9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.365030 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"756c113d-5d5e-424e-bdf5-494b7774def6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8607482cdf03bfd2b59dcac8300e55292f3bb5b3661f11eddc30a5face19194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b772f8291c7ba36ce56d88dfbde16bf9344276419544da4c966fb2bbee6e04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f29d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.375029 4901 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-5xj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e34f1db7-7f2a-4252-af0b-49fa172495f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eefab588023c2d4a4d15786ba41c1b66a121eeef18edda4bdcc02d162975b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.376458 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtz59\" (UniqueName: \"kubernetes.io/projected/b96d903e-a64c-4321-8963-482d4b579e30-kube-api-access-qtz59\") pod \"network-metrics-daemon-fmjwg\" (UID: \"b96d903e-a64c-4321-8963-482d4b579e30\") " pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.376520 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b96d903e-a64c-4321-8963-482d4b579e30-metrics-certs\") pod \"network-metrics-daemon-fmjwg\" (UID: \"b96d903e-a64c-4321-8963-482d4b579e30\") " pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:39:10 crc kubenswrapper[4901]: E0202 10:39:10.376760 4901 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not 
registered Feb 02 10:39:10 crc kubenswrapper[4901]: E0202 10:39:10.376839 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b96d903e-a64c-4321-8963-482d4b579e30-metrics-certs podName:b96d903e-a64c-4321-8963-482d4b579e30 nodeName:}" failed. No retries permitted until 2026-02-02 10:39:10.876816881 +0000 UTC m=+37.895156977 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b96d903e-a64c-4321-8963-482d4b579e30-metrics-certs") pod "network-metrics-daemon-fmjwg" (UID: "b96d903e-a64c-4321-8963-482d4b579e30") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.390901 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5q92h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29de6223de90bbfe98956d36ce20b9b57b319a5a825118f8eeabf13148e4c9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q2pl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5q92h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.396818 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtz59\" (UniqueName: \"kubernetes.io/projected/b96d903e-a64c-4321-8963-482d4b579e30-kube-api-access-qtz59\") pod \"network-metrics-daemon-fmjwg\" (UID: \"b96d903e-a64c-4321-8963-482d4b579e30\") " pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.409163 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.422774 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.434202 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43352b63ae7151aacd58a3134e927aa7494db450772763086a8b783025a3e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.445913 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8l22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dac970814cc06ebc0edf155553dce7d9579fd22f74f5174407d29850a07a4c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhwqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ba40e4419775afd21720866484ec29442dc564c1901d4ebc697b23c283c0d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhwqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:39:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s8l22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:10Z is after 2025-08-24T17:21:41Z" Feb 02 
10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.450949 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.451001 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.451014 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.451060 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.451076 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:10Z","lastTransitionTime":"2026-02-02T10:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.458230 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fmjwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b96d903e-a64c-4321-8963-482d4b579e30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtz59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtz59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:39:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fmjwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.472399 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a1308eb-aead-41f7-b2b1-c92d80970304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0202 10:38:47.369602 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:38:47.370901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-430650264/tls.crt::/tmp/serving-cert-430650264/tls.key\\\\\\\"\\\\nI0202 10:38:53.040764 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:38:53.043210 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:38:53.043233 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:38:53.043255 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:38:53.043260 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:38:53.047991 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0202 10:38:53.048006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 10:38:53.048019 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:38:53.048038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:38:53.048042 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:38:53.048046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 10:38:53.050356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.484591 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edf90eee-a04d-48ad-957f-8710016e2388\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3416be5e186cc0595234f36c8a59e1c2edadb59a8abe4121b274ea7dea70815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac57745af3759e45361b7c59e94701e4b74dcbfa259eb2e8690578d5b2fa601\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6c9912bb4f435743eaadda76c0ef0fbc6492b740836407b274a89801ec5082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.495682 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mcs8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8281e736-98f0-4282-b362-b55fd3d2810f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cdfc1733f1758eb8435b80dd92fe579c318124c1a99c1611ef2d66300eca26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pwxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mcs8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.554075 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.554147 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.554165 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.554192 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.554212 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:10Z","lastTransitionTime":"2026-02-02T10:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.612619 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 12:25:39.035516385 +0000 UTC Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.657584 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.657671 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.657697 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.657730 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.657749 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:10Z","lastTransitionTime":"2026-02-02T10:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.761030 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.761096 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.761114 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.761140 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.761161 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:10Z","lastTransitionTime":"2026-02-02T10:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.864086 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.864169 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.864197 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.864230 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.864258 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:10Z","lastTransitionTime":"2026-02-02T10:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.883072 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b96d903e-a64c-4321-8963-482d4b579e30-metrics-certs\") pod \"network-metrics-daemon-fmjwg\" (UID: \"b96d903e-a64c-4321-8963-482d4b579e30\") " pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:39:10 crc kubenswrapper[4901]: E0202 10:39:10.883298 4901 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:39:10 crc kubenswrapper[4901]: E0202 10:39:10.883388 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b96d903e-a64c-4321-8963-482d4b579e30-metrics-certs podName:b96d903e-a64c-4321-8963-482d4b579e30 nodeName:}" failed. No retries permitted until 2026-02-02 10:39:11.883365591 +0000 UTC m=+38.901705697 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b96d903e-a64c-4321-8963-482d4b579e30-metrics-certs") pod "network-metrics-daemon-fmjwg" (UID: "b96d903e-a64c-4321-8963-482d4b579e30") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.968284 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.968347 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.968357 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.968378 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:10 crc kubenswrapper[4901]: I0202 10:39:10.968389 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:10Z","lastTransitionTime":"2026-02-02T10:39:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.072816 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.073243 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.073256 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.073270 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.073280 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:11Z","lastTransitionTime":"2026-02-02T10:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.176544 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.176620 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.176633 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.176652 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.176667 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:11Z","lastTransitionTime":"2026-02-02T10:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.279468 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.279526 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.279536 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.279575 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.279587 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:11Z","lastTransitionTime":"2026-02-02T10:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.382989 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.383057 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.383081 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.383117 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.383140 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:11Z","lastTransitionTime":"2026-02-02T10:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.485814 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.485881 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.485901 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.485927 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.485944 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:11Z","lastTransitionTime":"2026-02-02T10:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.589259 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.589310 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.589319 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.589335 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.589344 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:11Z","lastTransitionTime":"2026-02-02T10:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.613150 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 16:36:55.069779253 +0000 UTC Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.676624 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:39:11 crc kubenswrapper[4901]: E0202 10:39:11.676782 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.676851 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.676851 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:39:11 crc kubenswrapper[4901]: E0202 10:39:11.677071 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.676902 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:39:11 crc kubenswrapper[4901]: E0202 10:39:11.677229 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:39:11 crc kubenswrapper[4901]: E0202 10:39:11.677292 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.692828 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.692897 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.692918 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.692944 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.692961 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:11Z","lastTransitionTime":"2026-02-02T10:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.796079 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.796153 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.796172 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.796207 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.796226 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:11Z","lastTransitionTime":"2026-02-02T10:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.895496 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b96d903e-a64c-4321-8963-482d4b579e30-metrics-certs\") pod \"network-metrics-daemon-fmjwg\" (UID: \"b96d903e-a64c-4321-8963-482d4b579e30\") " pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:39:11 crc kubenswrapper[4901]: E0202 10:39:11.895761 4901 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:39:11 crc kubenswrapper[4901]: E0202 10:39:11.895860 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b96d903e-a64c-4321-8963-482d4b579e30-metrics-certs podName:b96d903e-a64c-4321-8963-482d4b579e30 nodeName:}" failed. No retries permitted until 2026-02-02 10:39:13.895832348 +0000 UTC m=+40.914172454 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b96d903e-a64c-4321-8963-482d4b579e30-metrics-certs") pod "network-metrics-daemon-fmjwg" (UID: "b96d903e-a64c-4321-8963-482d4b579e30") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.900237 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.900307 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.900327 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.900355 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:11 crc kubenswrapper[4901]: I0202 10:39:11.900374 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:11Z","lastTransitionTime":"2026-02-02T10:39:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.004544 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.004654 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.004682 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.004732 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.004752 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:12Z","lastTransitionTime":"2026-02-02T10:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.109350 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.109452 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.109478 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.109512 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.109537 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:12Z","lastTransitionTime":"2026-02-02T10:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.212595 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.212901 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.212994 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.213099 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.213198 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:12Z","lastTransitionTime":"2026-02-02T10:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.317616 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.317707 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.317728 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.317760 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.317783 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:12Z","lastTransitionTime":"2026-02-02T10:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.421846 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.422371 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.422472 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.422617 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.422734 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:12Z","lastTransitionTime":"2026-02-02T10:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.526951 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.527035 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.527061 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.527099 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.527306 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:12Z","lastTransitionTime":"2026-02-02T10:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.613444 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 15:26:00.358194866 +0000 UTC Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.630954 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.631013 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.631025 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.631046 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.631060 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:12Z","lastTransitionTime":"2026-02-02T10:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.734845 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.734943 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.734962 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.734995 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.735017 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:12Z","lastTransitionTime":"2026-02-02T10:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.838712 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.838756 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.838764 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.838781 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.838791 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:12Z","lastTransitionTime":"2026-02-02T10:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.942251 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.942693 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.942805 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.942910 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:12 crc kubenswrapper[4901]: I0202 10:39:12.943008 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:12Z","lastTransitionTime":"2026-02-02T10:39:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.046332 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.046409 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.046423 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.046445 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.046458 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:13Z","lastTransitionTime":"2026-02-02T10:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
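[Annotation] The flood of NodeNotReady conditions above all shares one root cause: the container runtime reports NetworkReady=false because kubelet finds no CNI configuration file in /etc/kubernetes/cni/net.d/. A minimal standalone sketch of that presence check follows; the directory path is taken from the log line itself, while the accepted extension list mirrors what CNI config loaders typically scan for and is an assumption here, not something this log states.

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// confDir is the directory kubelet complains about in the log above.
const confDir = "/etc/kubernetes/cni/net.d"

func main() {
	entries, err := os.ReadDir(confDir)
	if err != nil {
		fmt.Printf("cannot read %s: %v\n", confDir, err)
		os.Exit(1)
	}
	var found []string
	for _, e := range entries {
		if e.IsDir() {
			continue
		}
		switch filepath.Ext(e.Name()) {
		// Extensions assumed from common CNI loader behavior.
		case ".conf", ".conflist", ".json":
			found = append(found, e.Name())
		}
	}
	if len(found) == 0 {
		// This is the condition behind "NetworkPluginNotReady" above.
		fmt.Println("no CNI configuration file found; has the network provider started?")
		return
	}
	fmt.Println("CNI configs:", found)
}

Until a config file appears in that directory (normally written once the cluster network provider comes up), the node's Ready condition stays False and every pod sandbox creation keeps failing, which is exactly the loop the entries above record.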
Has your network provider started?"} Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.150028 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.150131 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.150149 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.150170 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.150183 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:13Z","lastTransitionTime":"2026-02-02T10:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.253124 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.253217 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.253244 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.253286 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.253317 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:13Z","lastTransitionTime":"2026-02-02T10:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.357483 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.357538 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.357551 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.357606 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.357624 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:13Z","lastTransitionTime":"2026-02-02T10:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.461313 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.461394 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.461413 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.461460 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.461481 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:13Z","lastTransitionTime":"2026-02-02T10:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.564778 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.564858 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.564880 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.564905 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.564923 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:13Z","lastTransitionTime":"2026-02-02T10:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
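[Annotation] The certificate_manager entries interleaved above report a rotation deadline (2025-11-19, and later 2025-12-12) that is well before the certificate's 2026-02-24 expiry and already in the past at the node's current time. That is expected behavior: client-go's certificate manager schedules rotation at a randomized point late in the certificate's validity window, and a deadline in the past simply means "rotate now". A sketch of that jittered-deadline computation, assuming the commonly cited 70-90% band (the band, the notBefore value, and the one-year lifetime are assumptions here, not values printed in this log):

package main

import (
	"fmt"
	"math/rand"
	"time"
)

// rotationDeadline picks a random point in the 70%-90% band of the
// certificate's validity, mirroring the jitter client-go applies
// (band boundaries are an assumption in this sketch).
func rotationDeadline(notBefore, notAfter time.Time) time.Time {
	total := notAfter.Sub(notBefore)
	jittered := time.Duration(float64(total) * (0.7 + 0.2*rand.Float64()))
	return notBefore.Add(jittered)
}

func main() {
	// Expiry approximated from the certificate_manager line above.
	notAfter, _ := time.Parse(time.RFC3339, "2026-02-24T05:53:03Z")
	notBefore := notAfter.Add(-365 * 24 * time.Hour) // assumed 1y lifetime
	deadline := rotationDeadline(notBefore, notAfter)
	fmt.Println("rotation deadline:", deadline)
	if time.Now().After(deadline) {
		fmt.Println("deadline already passed: rotate immediately")
	}
}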
Has your network provider started?"} Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.613661 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 02:28:29.578597438 +0000 UTC Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.668473 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.668535 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.668557 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.668626 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.668655 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:13Z","lastTransitionTime":"2026-02-02T10:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.675763 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:39:13 crc kubenswrapper[4901]: E0202 10:39:13.675927 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.676091 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:39:13 crc kubenswrapper[4901]: E0202 10:39:13.676309 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.676454 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:39:13 crc kubenswrapper[4901]: E0202 10:39:13.676771 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.676853 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:39:13 crc kubenswrapper[4901]: E0202 10:39:13.677044 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.709630 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb836b18e6cb07844e8dfb32c28798089fc4fbc5b7f89ac3c1d595c8dd6de6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa3b8a5db87f56250b50a5c703beff1e6b11e10f1e11271ea4c7363b262d52d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:13Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.728784 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:13Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.748923 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
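[Annotation] Every "Failed to update status for pod" entry in this stretch ends the same way: the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 presents a certificate that expired 2025-08-24, while the node clock reads 2026-02-02, so the TLS handshake fails and the status patch is rejected. One quick way to confirm the validity window from the node is to dial the endpoint and inspect the peer certificate; the address comes from the log, and InsecureSkipVerify is used here only to read the certificate, not to trust it:

package main

import (
	"crypto/tls"
	"fmt"
	"time"
)

func main() {
	addr := "127.0.0.1:9743" // webhook endpoint named in the log
	conn, err := tls.Dial("tcp", addr, &tls.Config{
		InsecureSkipVerify: true, // inspect only; do not trust
	})
	if err != nil {
		fmt.Println("dial failed:", err)
		return
	}
	defer conn.Close()

	certs := conn.ConnectionState().PeerCertificates
	if len(certs) == 0 {
		fmt.Println("no peer certificate presented")
		return
	}
	cert := certs[0]
	fmt.Println("subject:  ", cert.Subject)
	fmt.Println("notBefore:", cert.NotBefore)
	fmt.Println("notAfter: ", cert.NotAfter)
	if now := time.Now(); now.After(cert.NotAfter) {
		// Matches the x509 error repeated throughout the log.
		fmt.Printf("expired: current time %s is after %s\n",
			now.Format(time.RFC3339), cert.NotAfter.Format(time.RFC3339))
	}
}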
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:13Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.767752 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43352b63ae7151aacd58a3134e927aa7494db450772763086a8b783025a3e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:13Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.771620 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.771667 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.771681 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.771700 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.771711 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:13Z","lastTransitionTime":"2026-02-02T10:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.791437 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b226219c2a55a743db37b0dd98c110a54a0d2a1b2602aa122c7d9418198fb9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:13Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.808093 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"756c113d-5d5e-424e-bdf5-494b7774def6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8607482cdf03bfd2b59dcac8300e55292f3bb5b3661f11eddc30a5face19194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b772f8291c7ba36ce56d88dfbde16bf9344276419544da4c966fb2bbee6e04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f29d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:13Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.819946 4901 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-5xj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e34f1db7-7f2a-4252-af0b-49fa172495f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eefab588023c2d4a4d15786ba41c1b66a121eeef18edda4bdcc02d162975b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:13Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.837920 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5q92h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29de6223de90bbfe98956d36ce20b9b57b319a5a825118f8eeabf13148e4c9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q2pl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5q92h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:13Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.852437 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:13Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.871336 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8l22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dac970814cc06ebc0edf155553dce7d9579fd22f74f5174407d29850a07a4c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhwqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ba40e4419775afd21720866484ec29442dc564c1901d4ebc697b23c283c0d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhwqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:39:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s8l22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:13Z is after 2025-08-24T17:21:41Z" Feb 02 
10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.874233 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.874300 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.874321 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.874357 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.874379 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:13Z","lastTransitionTime":"2026-02-02T10:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.888962 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fmjwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b96d903e-a64c-4321-8963-482d4b579e30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtz59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtz59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:39:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fmjwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:13Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.904779 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mcs8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8281e736-98f0-4282-b362-b55fd3d2810f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cdfc1733f1758eb8435b80dd92fe579c318124c1a99c1611ef2d66300eca26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pwxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mcs8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:13Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.921289 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b96d903e-a64c-4321-8963-482d4b579e30-metrics-certs\") pod \"network-metrics-daemon-fmjwg\" (UID: \"b96d903e-a64c-4321-8963-482d4b579e30\") " pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:39:13 crc kubenswrapper[4901]: E0202 10:39:13.921532 4901 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:39:13 crc kubenswrapper[4901]: E0202 10:39:13.921636 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b96d903e-a64c-4321-8963-482d4b579e30-metrics-certs podName:b96d903e-a64c-4321-8963-482d4b579e30 nodeName:}" failed. No retries permitted until 2026-02-02 10:39:17.92161367 +0000 UTC m=+44.939953776 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b96d903e-a64c-4321-8963-482d4b579e30-metrics-certs") pod "network-metrics-daemon-fmjwg" (UID: "b96d903e-a64c-4321-8963-482d4b579e30") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.952608 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a1308eb-aead-41f7-b2b1-c92d80970304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0202 10:38:47.369602 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:38:47.370901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-430650264/tls.crt::/tmp/serving-cert-430650264/tls.key\\\\\\\"\\\\nI0202 10:38:53.040764 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:38:53.043210 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:38:53.043233 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:38:53.043255 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:38:53.043260 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:38:53.047991 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0202 10:38:53.048006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 10:38:53.048019 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:38:53.048038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:38:53.048042 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:38:53.048046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 10:38:53.050356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:13Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.968212 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edf90eee-a04d-48ad-957f-8710016e2388\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3416be5e186cc0595234f36c8a59e1c2edadb59a8abe4121b274ea7dea70815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac57745af3759e45361b7c59e94701e4b74dcbfa259eb2e8690578d5b2fa601\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6c9912bb4f435743eaadda76c0ef0fbc6492b740836407b274a89801ec5082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:13Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.979732 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.979774 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.979788 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.979809 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.979821 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:13Z","lastTransitionTime":"2026-02-02T10:39:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:13 crc kubenswrapper[4901]: I0202 10:39:13.992293 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3390481-846a-4742-9eae-0796b667897f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d673fed3e933fa3cd039b3d8aff6adf81bd46d268ecac3d02e64c9bf7a8af687\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d673fed3e933fa3cd039b3d8aff6adf81bd46d268ecac3d02e64c9bf7a8af687\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:39:08Z\\\",\\\"message\\\":\\\"flector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 10:39:08.234141 6320 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0202 10:39:08.234210 6320 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0202 10:39:08.234242 6320 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 10:39:08.234245 6320 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0202 10:39:08.234261 6320 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 10:39:08.234265 6320 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0202 10:39:08.234277 6320 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0202 10:39:08.234287 6320 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0202 10:39:08.234295 6320 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0202 10:39:08.234298 6320 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0202 10:39:08.234309 6320 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0202 10:39:08.234312 6320 handler.go:208] Removed *v1.Node event handler 2\\\\nI0202 10:39:08.234325 6320 handler.go:208] Removed *v1.Node event handler 7\\\\nI0202 10:39:08.234353 6320 factory.go:656] Stopping watch factory\\\\nI0202 10:39:08.234357 6320 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vm8h5_openshift-ovn-kubernetes(a3390481-846a-4742-9eae-0796b667897f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vm8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:13Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.008777 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a172847249ecfe8e0ac94127f58173e760dfe5210cc51a12ca593b2882b46e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt
\\\":\\\"2026-02-02T10:39:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-flw48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:14Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.084011 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.084095 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.084109 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.084154 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.084170 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:14Z","lastTransitionTime":"2026-02-02T10:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.187492 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.187590 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.187605 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.187627 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.187639 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:14Z","lastTransitionTime":"2026-02-02T10:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.291035 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.291088 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.291100 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.291120 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.291133 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:14Z","lastTransitionTime":"2026-02-02T10:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.393247 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.393301 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.393309 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.393323 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.393332 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:14Z","lastTransitionTime":"2026-02-02T10:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.496479 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.496545 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.496641 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.496680 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.496707 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:14Z","lastTransitionTime":"2026-02-02T10:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.599775 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.599843 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.599860 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.599887 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.599905 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:14Z","lastTransitionTime":"2026-02-02T10:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.614383 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 17:21:09.170240918 +0000 UTC Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.702594 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.702687 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.702716 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.702758 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.702788 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:14Z","lastTransitionTime":"2026-02-02T10:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.811331 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.811416 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.811436 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.811467 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.811488 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:14Z","lastTransitionTime":"2026-02-02T10:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.915371 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.915438 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.915456 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.915484 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:14 crc kubenswrapper[4901]: I0202 10:39:14.915505 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:14Z","lastTransitionTime":"2026-02-02T10:39:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:15 crc kubenswrapper[4901]: I0202 10:39:15.019482 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:15 crc kubenswrapper[4901]: I0202 10:39:15.019529 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:15 crc kubenswrapper[4901]: I0202 10:39:15.019541 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:15 crc kubenswrapper[4901]: I0202 10:39:15.019586 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:15 crc kubenswrapper[4901]: I0202 10:39:15.019600 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:15Z","lastTransitionTime":"2026-02-02T10:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:15 crc kubenswrapper[4901]: I0202 10:39:15.122533 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:15 crc kubenswrapper[4901]: I0202 10:39:15.122582 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:15 crc kubenswrapper[4901]: I0202 10:39:15.122591 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:15 crc kubenswrapper[4901]: I0202 10:39:15.122604 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:15 crc kubenswrapper[4901]: I0202 10:39:15.122613 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:15Z","lastTransitionTime":"2026-02-02T10:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:15 crc kubenswrapper[4901]: I0202 10:39:15.225785 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:15 crc kubenswrapper[4901]: I0202 10:39:15.225853 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:15 crc kubenswrapper[4901]: I0202 10:39:15.225870 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:15 crc kubenswrapper[4901]: I0202 10:39:15.225893 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:15 crc kubenswrapper[4901]: I0202 10:39:15.225911 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:15Z","lastTransitionTime":"2026-02-02T10:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:15 crc kubenswrapper[4901]: I0202 10:39:15.330121 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:15 crc kubenswrapper[4901]: I0202 10:39:15.330189 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:15 crc kubenswrapper[4901]: I0202 10:39:15.330199 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:15 crc kubenswrapper[4901]: I0202 10:39:15.330217 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:15 crc kubenswrapper[4901]: I0202 10:39:15.330228 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:15Z","lastTransitionTime":"2026-02-02T10:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:15 crc kubenswrapper[4901]: I0202 10:39:15.433810 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:15 crc kubenswrapper[4901]: I0202 10:39:15.433923 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:15 crc kubenswrapper[4901]: I0202 10:39:15.433949 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:15 crc kubenswrapper[4901]: I0202 10:39:15.433983 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:15 crc kubenswrapper[4901]: I0202 10:39:15.434012 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:15Z","lastTransitionTime":"2026-02-02T10:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Feb 02 10:39:15 crc kubenswrapper[4901]: I0202 10:39:15.615132 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 05:00:00.1598489 +0000 UTC
Feb 02 10:39:15 crc kubenswrapper[4901]: I0202 10:39:15.676913 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 02 10:39:15 crc kubenswrapper[4901]: I0202 10:39:15.677062 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 02 10:39:15 crc kubenswrapper[4901]: E0202 10:39:15.677113 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:39:15 crc kubenswrapper[4901]: I0202 10:39:15.676927 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:39:15 crc kubenswrapper[4901]: I0202 10:39:15.677169 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:39:15 crc kubenswrapper[4901]: E0202 10:39:15.677319 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:39:15 crc kubenswrapper[4901]: E0202 10:39:15.677547 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:39:15 crc kubenswrapper[4901]: E0202 10:39:15.677704 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:39:15 crc kubenswrapper[4901]: I0202 10:39:15.747003 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:15 crc kubenswrapper[4901]: I0202 10:39:15.747090 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:15 crc kubenswrapper[4901]: I0202 10:39:15.747114 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:15 crc kubenswrapper[4901]: I0202 10:39:15.747146 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:15 crc kubenswrapper[4901]: I0202 10:39:15.747172 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:15Z","lastTransitionTime":"2026-02-02T10:39:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Feb 02 10:39:16 crc kubenswrapper[4901]: I0202 10:39:16.615514 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 01:28:21.907413418 +0000 UTC
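Note the rotation deadlines: the certificate is the same on every certificate_manager.go line (expires 2026-02-24 05:53:03 UTC), yet the logged deadline moves (2025-12-03, then 2026-01-09). The certificate manager re-draws a jittered deadline on each evaluation, so this is expected rather than a sign of clock trouble. A sketch of that behavior, assuming rotation is scheduled at a random point around 70-90% of the certificate's lifetime (the exact fractions and the one-year lifetime are assumptions):

package main

import (
	"fmt"
	"math/rand"
	"time"
)

// jitteredDeadline picks a rotation time at a random point between 70% and 90%
// of the certificate's validity window (an approximation of the certificate
// manager's jitter; the fractions are assumptions, not read from this log).
func jitteredDeadline(notBefore, notAfter time.Time) time.Time {
	lifetime := notAfter.Sub(notBefore)
	frac := 0.7 + 0.2*rand.Float64()
	return notBefore.Add(time.Duration(frac * float64(lifetime)))
}

func main() {
	notAfter := time.Date(2026, 2, 24, 5, 53, 3, 0, time.UTC) // expiry from the log
	notBefore := notAfter.Add(-365 * 24 * time.Hour)          // assumed one-year lifetime
	for i := 0; i < 3; i++ {
		// Each evaluation draws a fresh deadline, which is why consecutive
		// certificate_manager.go entries report different dates.
		fmt.Println("rotation deadline:", jitteredDeadline(notBefore, notAfter))
	}
}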
Feb 02 10:39:17 crc kubenswrapper[4901]: I0202 10:39:17.616802 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 07:53:20.170483724 +0000 UTC
Feb 02 10:39:17 crc kubenswrapper[4901]: I0202 10:39:17.675997 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 02 10:39:17 crc kubenswrapper[4901]: I0202 10:39:17.676141 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg"
Feb 02 10:39:17 crc kubenswrapper[4901]: E0202 10:39:17.676476 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:39:17 crc kubenswrapper[4901]: E0202 10:39:17.676708 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:39:17 crc kubenswrapper[4901]: I0202 10:39:17.676846 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:39:17 crc kubenswrapper[4901]: E0202 10:39:17.677090 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:39:17 crc kubenswrapper[4901]: I0202 10:39:17.676878 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:39:17 crc kubenswrapper[4901]: E0202 10:39:17.677301 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:39:17 crc kubenswrapper[4901]: I0202 10:39:17.725916 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:17 crc kubenswrapper[4901]: I0202 10:39:17.725967 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:17 crc kubenswrapper[4901]: I0202 10:39:17.725981 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:17 crc kubenswrapper[4901]: I0202 10:39:17.726003 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:17 crc kubenswrapper[4901]: I0202 10:39:17.726020 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:17Z","lastTransitionTime":"2026-02-02T10:39:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Feb 02 10:39:17 crc kubenswrapper[4901]: I0202 10:39:17.974458 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b96d903e-a64c-4321-8963-482d4b579e30-metrics-certs\") pod \"network-metrics-daemon-fmjwg\" (UID: \"b96d903e-a64c-4321-8963-482d4b579e30\") " pod="openshift-multus/network-metrics-daemon-fmjwg"
Feb 02 10:39:17 crc kubenswrapper[4901]: E0202 10:39:17.974683 4901 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 02 10:39:17 crc kubenswrapper[4901]: E0202 10:39:17.974739 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b96d903e-a64c-4321-8963-482d4b579e30-metrics-certs podName:b96d903e-a64c-4321-8963-482d4b579e30 nodeName:}" failed. No retries permitted until 2026-02-02 10:39:25.97472234 +0000 UTC m=+52.993062446 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b96d903e-a64c-4321-8963-482d4b579e30-metrics-certs") pod "network-metrics-daemon-fmjwg" (UID: "b96d903e-a64c-4321-8963-482d4b579e30") : object "openshift-multus"/"metrics-daemon-secret" not registered
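The "durationBeforeRetry 8s" above is the volume manager's per-volume exponential backoff at work: each failed MountVolume.SetUp doubles the wait before the next attempt. A sketch with commonly cited defaults (500 ms initial delay, doubling, capped at roughly two minutes; the constants are assumptions, not read from this log):

package main

import (
	"fmt"
	"time"
)

func main() {
	delay := 500 * time.Millisecond      // assumed initial backoff
	maxDelay := 2*time.Minute + 2*time.Second // assumed cap
	for attempt := 1; attempt <= 10; attempt++ {
		// The fifth consecutive failure waits 8s, matching the entry above.
		fmt.Printf("attempt %d failed, retry in %v\n", attempt, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}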
Feb 02 10:39:18 crc kubenswrapper[4901]: I0202 10:39:18.618088 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 03:48:54.493677842 +0000 UTC
Feb 02 10:39:19 crc kubenswrapper[4901]: I0202 10:39:19.619478 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 21:12:00.919559709 +0000 UTC
Feb 02 10:39:19 crc kubenswrapper[4901]: I0202 10:39:19.676445 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 02 10:39:19 crc kubenswrapper[4901]: E0202 10:39:19.676651 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 02 10:39:19 crc kubenswrapper[4901]: I0202 10:39:19.676678 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 02 10:39:19 crc kubenswrapper[4901]: I0202 10:39:19.676753 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg"
Feb 02 10:39:19 crc kubenswrapper[4901]: E0202 10:39:19.676841 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 02 10:39:19 crc kubenswrapper[4901]: I0202 10:39:19.676903 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:39:19 crc kubenswrapper[4901]: E0202 10:39:19.677018 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:39:19 crc kubenswrapper[4901]: E0202 10:39:19.677119 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:39:19 crc kubenswrapper[4901]: I0202 10:39:19.726317 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:19 crc kubenswrapper[4901]: I0202 10:39:19.726379 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:19 crc kubenswrapper[4901]: I0202 10:39:19.726389 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:19 crc kubenswrapper[4901]: I0202 10:39:19.726410 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:19 crc kubenswrapper[4901]: I0202 10:39:19.726422 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:19Z","lastTransitionTime":"2026-02-02T10:39:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:19 crc kubenswrapper[4901]: I0202 10:39:19.829065 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:19 crc kubenswrapper[4901]: I0202 10:39:19.829148 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:19 crc kubenswrapper[4901]: I0202 10:39:19.829169 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:19 crc kubenswrapper[4901]: I0202 10:39:19.829201 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:19 crc kubenswrapper[4901]: I0202 10:39:19.829223 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:19Z","lastTransitionTime":"2026-02-02T10:39:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:19 crc kubenswrapper[4901]: I0202 10:39:19.932242 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:19 crc kubenswrapper[4901]: I0202 10:39:19.932330 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:19 crc kubenswrapper[4901]: I0202 10:39:19.932351 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:19 crc kubenswrapper[4901]: I0202 10:39:19.932381 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:19 crc kubenswrapper[4901]: I0202 10:39:19.932404 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:19Z","lastTransitionTime":"2026-02-02T10:39:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.014344 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.014397 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.014414 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.014433 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.014445 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:20Z","lastTransitionTime":"2026-02-02T10:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:20 crc kubenswrapper[4901]: E0202 10:39:20.027359 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f285d71e-6d99-440a-8549-56e6a3710e3e\\\",\\\"systemUUID\\\":\\\"9f2b45ef-ead6-4cce-86c8-26b6d26ee095\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:20Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.032342 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.032379 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.032391 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.032407 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.032417 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:20Z","lastTransitionTime":"2026-02-02T10:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:20 crc kubenswrapper[4901]: E0202 10:39:20.047332 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f285d71e-6d99-440a-8549-56e6a3710e3e\\\",\\\"systemUUID\\\":\\\"9f2b45ef-ead6-4cce-86c8-26b6d26ee095\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:20Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.053471 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.053518 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.053534 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.053553 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.053582 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:20Z","lastTransitionTime":"2026-02-02T10:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:20 crc kubenswrapper[4901]: E0202 10:39:20.071355 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f285d71e-6d99-440a-8549-56e6a3710e3e\\\",\\\"systemUUID\\\":\\\"9f2b45ef-ead6-4cce-86c8-26b6d26ee095\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:20Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.074818 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.074846 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.074858 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.074871 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.074881 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:20Z","lastTransitionTime":"2026-02-02T10:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:20 crc kubenswrapper[4901]: E0202 10:39:20.093008 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f285d71e-6d99-440a-8549-56e6a3710e3e\\\",\\\"systemUUID\\\":\\\"9f2b45ef-ead6-4cce-86c8-26b6d26ee095\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:20Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.097011 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.097073 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.097099 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.097126 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.097146 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:20Z","lastTransitionTime":"2026-02-02T10:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:20 crc kubenswrapper[4901]: E0202 10:39:20.114171 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f285d71e-6d99-440a-8549-56e6a3710e3e\\\",\\\"systemUUID\\\":\\\"9f2b45ef-ead6-4cce-86c8-26b6d26ee095\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:20Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:20 crc kubenswrapper[4901]: E0202 10:39:20.114388 4901 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.116730 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.116773 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.116784 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.116798 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.116809 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:20Z","lastTransitionTime":"2026-02-02T10:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.219189 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.219244 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.219260 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.219284 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.219302 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:20Z","lastTransitionTime":"2026-02-02T10:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.322633 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.322672 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.322681 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.322694 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.322704 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:20Z","lastTransitionTime":"2026-02-02T10:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.425210 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.425260 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.425274 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.425293 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.425307 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:20Z","lastTransitionTime":"2026-02-02T10:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.528492 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.528740 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.528769 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.528798 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.528823 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:20Z","lastTransitionTime":"2026-02-02T10:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.619685 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 19:47:56.759778653 +0000 UTC Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.632295 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.632338 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.632349 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.632365 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.632381 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:20Z","lastTransitionTime":"2026-02-02T10:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.677691 4901 scope.go:117] "RemoveContainer" containerID="d673fed3e933fa3cd039b3d8aff6adf81bd46d268ecac3d02e64c9bf7a8af687" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.737889 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.737964 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.737984 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.738011 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.738030 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:20Z","lastTransitionTime":"2026-02-02T10:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.841135 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.841213 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.841232 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.841260 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.841284 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:20Z","lastTransitionTime":"2026-02-02T10:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.955658 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.955723 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.955734 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.955752 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:20 crc kubenswrapper[4901]: I0202 10:39:20.955769 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:20Z","lastTransitionTime":"2026-02-02T10:39:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.057371 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.057419 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.057434 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.057466 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.057486 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:21Z","lastTransitionTime":"2026-02-02T10:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.057465 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vm8h5_a3390481-846a-4742-9eae-0796b667897f/ovnkube-controller/1.log" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.060140 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" event={"ID":"a3390481-846a-4742-9eae-0796b667897f","Type":"ContainerStarted","Data":"67d8349fde38f0e6a381c1f42da08869dfdcec8bc8a35b7f44b49a2a80aa1a2c"} Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.060610 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.080655 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a1308eb-aead-41f7-b2b1-c92d80970304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f36c1205
6892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0202 10:38:47.369602 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:38:47.370901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-430650264/tls.crt::/tmp/serving-cert-430650264/tls.key\\\\\\\"\\\\nI0202 10:38:53.040764 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:38:53.043210 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:38:53.043233 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:38:53.043255 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:38:53.043260 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:38:53.047991 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0202 10:38:53.048006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 10:38:53.048019 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:38:53.048038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:38:53.048042 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:38:53.048046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 10:38:53.050356 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.098473 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edf90eee-a04d-48ad-957f-8710016e2388\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3416be5e186cc0595234f36c8a59e1c2edadb59a8abe4121b274ea7dea70815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac57745af3759e45361b7c59e94701e4b74dcbfa259eb2e8690578d5b2fa601\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6c9912bb4f435743eaadda76c0ef0fbc6492b740836407b274a89801ec5082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.110861 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mcs8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8281e736-98f0-4282-b362-b55fd3d2810f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cdfc1733f1758eb8435b80dd92fe579c318124c1a99c1611ef2d66300eca26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pwxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mcs8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.132971 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a172847249ecfe8e0ac94127f58173e760dfe5210cc51a12ca593b2882b46e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-flw48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.161473 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.161518 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.161533 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.161553 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.161790 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:21Z","lastTransitionTime":"2026-02-02T10:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.171359 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3390481-846a-4742-9eae-0796b667897f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67d8349fde38f0e6a381c1f42da08869dfdcec8b
c8a35b7f44b49a2a80aa1a2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d673fed3e933fa3cd039b3d8aff6adf81bd46d268ecac3d02e64c9bf7a8af687\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:39:08Z\\\",\\\"message\\\":\\\"flector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 10:39:08.234141 6320 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0202 10:39:08.234210 6320 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0202 10:39:08.234242 6320 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 10:39:08.234245 6320 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0202 10:39:08.234261 6320 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 10:39:08.234265 6320 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0202 10:39:08.234277 6320 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0202 10:39:08.234287 6320 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0202 10:39:08.234295 6320 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0202 10:39:08.234298 6320 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0202 10:39:08.234309 6320 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0202 10:39:08.234312 6320 handler.go:208] Removed *v1.Node event handler 2\\\\nI0202 10:39:08.234325 6320 handler.go:208] Removed *v1.Node event handler 7\\\\nI0202 10:39:08.234353 6320 factory.go:656] Stopping watch factory\\\\nI0202 10:39:08.234357 6320 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vm8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.189435 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb836b18e6cb07844e8dfb32c28798089fc4fbc5b7f89ac3c1d595c8dd6de6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa3b8a5db87f56250b50a5c703beff1e6b11e10f1e11271ea4c7363b262d52d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.208249 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.219296 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xj56" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e34f1db7-7f2a-4252-af0b-49fa172495f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eefab588023c2d4a4d15786ba41c1b66a121eeef18edda4bdcc02d162975b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.235923 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5q92h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29de6223de90bbfe98956d36ce20b9b57b319a5a825118f8eeabf13148e4c9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q2pl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5q92h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.256047 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.264179 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.264232 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.264247 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.264268 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.264282 4901 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:21Z","lastTransitionTime":"2026-02-02T10:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.275061 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.293382 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43352b63ae7151aacd58a3134e927aa7494db450772763086a8b783025a3e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.307553 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b226219c2a55a743db37b0dd98c110a54a0d2a1b2602aa122c7d9418198fb9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.319772 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"756c113d-5d5e-424e-bdf5-494b7774def6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8607482cdf03bfd2b59dcac8300e55292f3bb5b3661f11eddc30a5face19194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b772f8291c7ba36ce56d88dfbde16bf9344276419544da4c966fb2bbee6e04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f29d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.330993 4901 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8l22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dac970814cc06ebc0edf155553dce7d9579fd22f74f5174407d29850a07a4c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhwqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ba40e4419775afd21720866484ec29442dc564c1901d4ebc697b23c283c0d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhwqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:39:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s8l22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.342740 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fmjwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b96d903e-a64c-4321-8963-482d4b579e30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtz59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtz59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:39:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fmjwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.366424 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
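Every "Failed to update status for pod" entry above fails for the same reason: the pod.network-node-identity.openshift.io admission webhook at https://127.0.0.1:9743 presents a serving certificate that expired at 2025-08-24T17:21:41Z, while the node clock reads 2026-02-02T10:39:21Z, so the API server rejects each status patch with the x509 error. This pattern is consistent with a host whose clock jumped far past the certificates' validity window (for example, a CRC VM resumed months after its certificates were minted). A minimal Python sketch, assuming nothing beyond the quoted log text, that pulls the two timestamps out of such a line and quantifies the skew:

    import re
    from datetime import datetime

    # Tail of one of the recurring kubelet errors above (truncated here).
    line = ('tls: failed to verify certificate: x509: certificate has expired '
            'or is not yet valid: current time 2026-02-02T10:39:21Z is after '
            '2025-08-24T17:21:41Z')

    m = re.search(r'current time (\S+) is after (\S+)', line)
    now, not_after = (datetime.fromisoformat(t.replace('Z', '+00:00'))
                      for t in m.groups())
    print(f'webhook serving cert expired {(now - not_after).days} days ago')
    # -> webhook serving cert expired 161 days ago

The second failure visible here is independent of the webhook: the node is repeatedly set NotReady (reason KubeletNotReady) because no CNI configuration file exists in /etc/kubernetes/cni/net.d/, and the ovnkube-controller container that would provide one keeps exiting with code 1 (restartCount 2 in the status above) and, further below, lands in CrashLoopBackOff with "back-off 20s", i.e. the second step of the kubelet's doubling restart back-off. A sketch of that schedule, with the 10s base and 5m cap assumed from upstream kubelet defaults rather than taken from this log:

    # Hypothetical helper: kubelet-style crash-loop back-off (base delay
    # doubling per failed restart, capped). Constants are assumed defaults,
    # not read from the log.
    def backoff_seconds(failed_restarts: int, base: int = 10, cap: int = 300) -> int:
        return min(base * 2 ** failed_restarts, cap)

    print([backoff_seconds(n) for n in range(6)])  # [10, 20, 40, 80, 160, 300]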
Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.366485 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.366497 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.366513 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.366523 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:21Z","lastTransitionTime":"2026-02-02T10:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.469142 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.469190 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.469201 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.469216 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.469225 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:21Z","lastTransitionTime":"2026-02-02T10:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.571676 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.571727 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.571738 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.571760 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.571773 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:21Z","lastTransitionTime":"2026-02-02T10:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.620649 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 20:18:16.482202421 +0000 UTC Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.674445 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.674533 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.674555 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.674610 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.674634 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:21Z","lastTransitionTime":"2026-02-02T10:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.675726 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.675808 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:39:21 crc kubenswrapper[4901]: E0202 10:39:21.675865 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.675881 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.675948 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:39:21 crc kubenswrapper[4901]: E0202 10:39:21.676060 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:39:21 crc kubenswrapper[4901]: E0202 10:39:21.676204 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:39:21 crc kubenswrapper[4901]: E0202 10:39:21.676313 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.777154 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.777207 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.777219 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.777236 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.777248 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:21Z","lastTransitionTime":"2026-02-02T10:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.879752 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.879792 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.879802 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.879836 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.879847 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:21Z","lastTransitionTime":"2026-02-02T10:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.982328 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.982393 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.982413 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.982439 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:21 crc kubenswrapper[4901]: I0202 10:39:21.982458 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:21Z","lastTransitionTime":"2026-02-02T10:39:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.069260 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vm8h5_a3390481-846a-4742-9eae-0796b667897f/ovnkube-controller/2.log" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.070464 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vm8h5_a3390481-846a-4742-9eae-0796b667897f/ovnkube-controller/1.log" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.075830 4901 generic.go:334] "Generic (PLEG): container finished" podID="a3390481-846a-4742-9eae-0796b667897f" containerID="67d8349fde38f0e6a381c1f42da08869dfdcec8bc8a35b7f44b49a2a80aa1a2c" exitCode=1 Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.075934 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" event={"ID":"a3390481-846a-4742-9eae-0796b667897f","Type":"ContainerDied","Data":"67d8349fde38f0e6a381c1f42da08869dfdcec8bc8a35b7f44b49a2a80aa1a2c"} Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.076030 4901 scope.go:117] "RemoveContainer" containerID="d673fed3e933fa3cd039b3d8aff6adf81bd46d268ecac3d02e64c9bf7a8af687" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.077221 4901 scope.go:117] "RemoveContainer" containerID="67d8349fde38f0e6a381c1f42da08869dfdcec8bc8a35b7f44b49a2a80aa1a2c" Feb 02 10:39:22 crc kubenswrapper[4901]: E0202 10:39:22.077643 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-vm8h5_openshift-ovn-kubernetes(a3390481-846a-4742-9eae-0796b667897f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" podUID="a3390481-846a-4742-9eae-0796b667897f" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.087473 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.087510 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.087521 4901 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.087539 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.087551 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:22Z","lastTransitionTime":"2026-02-02T10:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.100950 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb836b18e6cb07844e8dfb32c28798089fc4fbc5b7f89ac3c1d595c8dd6de6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa3b8a5db87f56250b50a5c703beff1e6b11e10f1e11271ea4c7363b262d52d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.119883 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.131444 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.151525 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.156682 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b226219c2a55a743db37b0dd98c110a54a0d2a1b2602aa122c7d9418198fb9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.173688 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"756c113d-5d5e-424e-bdf5-494b7774def6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8607482cdf03bfd2b59dcac8300e55292f3bb5b3661f11eddc30a5face19194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b772f8291c7ba36ce56d88dfbde16bf9344276419544da4c966fb2bbee6e04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-f29d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.189386 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e34f1db7-7f2a-4252-af0b-49fa172495f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eefab588023c2d4a4d15786ba41c1b66a121eeef18edda4bdcc02d162975b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.191125 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.191177 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.191192 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.191213 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.191226 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:22Z","lastTransitionTime":"2026-02-02T10:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.210041 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5q92h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29de6223de90bbfe98956d36ce20b9b57b319a5a825118f8eeabf13148e4c9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"moun
tPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q2pl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5q92h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.227900 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.245771 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.261253 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43352b63ae7151aacd58a3134e927aa7494db450772763086a8b783025a3e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.276391 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8l22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dac970814cc06ebc0edf155553dce7d9579fd22f74f5174407d29850a07a4c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhwqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ba40e4419775afd21720866484ec29442dc564c1901d4ebc697b23c283c0d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhwqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:39:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s8l22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:22Z is after 2025-08-24T17:21:41Z" Feb 02 
10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.291724 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fmjwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b96d903e-a64c-4321-8963-482d4b579e30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtz59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtz59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:39:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fmjwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.293458 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.293599 4901 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.293697 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.293786 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.293918 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:22Z","lastTransitionTime":"2026-02-02T10:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.311472 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a1308eb-aead-41f7-b2b1-c92d80970304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0202 10:38:47.369602 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:38:47.370901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-430650264/tls.crt::/tmp/serving-cert-430650264/tls.key\\\\\\\"\\\\nI0202 10:38:53.040764 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:38:53.043210 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:38:53.043233 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:38:53.043255 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:38:53.043260 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:38:53.047991 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0202 10:38:53.048006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 10:38:53.048019 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:38:53.048038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:38:53.048042 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:38:53.048046 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 10:38:53.050356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.328403 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edf90eee-a04d-48ad-957f-8710016e2388\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3416be5e186cc0595234f36c8a59e1c2edadb59a8abe4121b274ea7dea70815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac57745af3759e45361b7c59e94701e4b74dcbfa259eb2e8690578d5b2fa601\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6c9912bb4f435743eaadda76c0ef0fbc6492b740836407b274a89801ec5082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.341112 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mcs8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8281e736-98f0-4282-b362-b55fd3d2810f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cdfc1733f1758eb8435b80dd92fe579c318124c1a99c1611ef2d66300eca26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pwxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mcs8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.360686 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a172847249ecfe8e0ac94127f58173e760dfe5210cc51a12ca593b2882b46e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-flw48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.384866 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3390481-846a-4742-9eae-0796b667897f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/
etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67d8349fde38f0e6a381c1f42da08869dfdcec8bc8a35b7f44b49a2a80aa1a2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d673fed3e933fa3cd039b3d8aff6adf81bd46d268ecac3d02e64c9bf7a8af687\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:39:08Z\\\",\\\"message\\\":\\\"flector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 10:39:08.234141 6320 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0202 10:39:08.234210 6320 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0202 10:39:08.234242 6320 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 10:39:08.234245 6320 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0202 10:39:08.234261 6320 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 10:39:08.234265 6320 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0202 10:39:08.234277 6320 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0202 10:39:08.234287 6320 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0202 10:39:08.234295 6320 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0202 10:39:08.234298 6320 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0202 10:39:08.234309 6320 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0202 10:39:08.234312 6320 handler.go:208] Removed *v1.Node event handler 2\\\\nI0202 10:39:08.234325 6320 handler.go:208] Removed *v1.Node event handler 7\\\\nI0202 10:39:08.234353 6320 factory.go:656] Stopping watch factory\\\\nI0202 10:39:08.234357 6320 handler.go:208] Removed *v1.NetworkPolicy 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d8349fde38f0e6a381c1f42da08869dfdcec8bc8a35b7f44b49a2a80aa1a2c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:39:21Z\\\",\\\"message\\\":\\\"9103\\\\\\\"\\\\nF0202 10:39:21.720419 6520 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:21Z is after 2025-08-24T17:21:41Z]\\\\nI0202 10:39:21.713537 6520 services_controller.go:434] Service openshift-network-diagnostics/network-check-target retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{network-check-target openshift-network-diagnostics 3e2ce0c7-84ea-44e4-bf4a-d2f8388134f5 2812 0 2025-02-23 05:21:38 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[] map[] [{operator.openshift.io/v1 Network cluster 8d01ddba-7e05-4639-926a-4485de3b6327 0xc0075872d7 0xc0075872d8}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:,Protocol:TCP,Port:80,TargetPort:{0 8080 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: 
network-check-target,},Clust\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vm8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.396906 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.396943 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.396955 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.396974 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.396987 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:22Z","lastTransitionTime":"2026-02-02T10:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.409652 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a172847249ecfe8e0ac94127f58173e760dfe5210cc51a12ca593b2882b46e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-flw48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.436239 4901 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3390481-846a-4742-9eae-0796b667897f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-con
troller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67d8349fde38f0e6a381c1f42da08869dfdcec8bc8a35b7f44b49a2a80aa1a2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d673fed3e933fa3cd039b3d8aff6adf81bd46d268ecac3d02e64c9bf7a8af687\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:39:08Z\\\",\\\"message\\\":\\\"flector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 10:39:08.234141 6320 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0202 10:39:08.234210 6320 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0202 10:39:08.234242 6320 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 10:39:08.234245 6320 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0202 10:39:08.234261 6320 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 10:39:08.234265 6320 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0202 10:39:08.234277 6320 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0202 10:39:08.234287 6320 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0202 10:39:08.234295 6320 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0202 10:39:08.234298 6320 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0202 10:39:08.234309 6320 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0202 10:39:08.234312 6320 handler.go:208] Removed *v1.Node event handler 2\\\\nI0202 10:39:08.234325 6320 handler.go:208] Removed *v1.Node event handler 7\\\\nI0202 10:39:08.234353 6320 factory.go:656] Stopping watch factory\\\\nI0202 10:39:08.234357 6320 handler.go:208] Removed *v1.NetworkPolicy ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d8349fde38f0e6a381c1f42da08869dfdcec8bc8a35b7f44b49a2a80aa1a2c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:39:21Z\\\",\\\"message\\\":\\\"9103\\\\\\\"\\\\nF0202 10:39:21.720419 6520 ovnkube.go:137] failed to run ovnkube: [failed to start 
network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:21Z is after 2025-08-24T17:21:41Z]\\\\nI0202 10:39:21.713537 6520 services_controller.go:434] Service openshift-network-diagnostics/network-check-target retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{network-check-target openshift-network-diagnostics 3e2ce0c7-84ea-44e4-bf4a-d2f8388134f5 2812 0 2025-02-23 05:21:38 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[] map[] [{operator.openshift.io/v1 Network cluster 8d01ddba-7e05-4639-926a-4485de3b6327 0xc0075872d7 0xc0075872d8}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:,Protocol:TCP,Port:80,TargetPort:{0 8080 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: network-check-target,},Clust\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vm8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.456093 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb836b18e6cb07844e8dfb32c28798089fc4fbc5b7f89ac3c1d595c8dd6de6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa3b8a5db87f56250b50a5c703beff1e6b11e10f1e11271ea4c7363b262d52d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.472278 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.492986 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43352b63ae7151aacd58a3134e927aa7494db450772763086a8b783025a3e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.499296 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.499337 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.499351 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.499374 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.499390 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:22Z","lastTransitionTime":"2026-02-02T10:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.514750 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b226219c2a55a743db37b0dd98c110a54a0d2a1b2602aa122c7d9418198fb9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.528531 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"756c113d-5d5e-424e-bdf5-494b7774def6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8607482cdf03bfd2b59dcac8300e55292f3bb5b3661f11eddc30a5face19194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b772f8291c7ba36ce56d88dfbde16bf9344276419544da4c966fb2bbee6e04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f29d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.542155 4901 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-5xj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e34f1db7-7f2a-4252-af0b-49fa172495f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eefab588023c2d4a4d15786ba41c1b66a121eeef18edda4bdcc02d162975b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.556829 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5q92h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29de6223de90bbfe98956d36ce20b9b57b319a5a825118f8eeabf13148e4c9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q2pl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5q92h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.572538 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.590441 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.603066 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.603125 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.603145 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.603170 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.603187 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:22Z","lastTransitionTime":"2026-02-02T10:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.604638 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8l22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dac970814cc06ebc0edf155553dce7d9579fd22f74f5174407d29850a07a4c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhwqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ba40e4419775afd21720866484ec29442dc564c1901d4ebc697b23c283c0d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhwqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:39:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s8l22\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.620546 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fmjwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b96d903e-a64c-4321-8963-482d4b579e30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtz59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtz59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:39:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fmjwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:22 crc 
kubenswrapper[4901]: I0202 10:39:22.620806 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 10:15:58.235722015 +0000 UTC Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.634391 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mcs8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8281e736-98f0-4282-b362-b55fd3d2810f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cdfc1733f1758eb8435b80dd92fe579c318124c1a99c1611ef2d66300eca26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pwxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mcs8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.649956 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a1308eb-aead-41f7-b2b1-c92d80970304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0202 10:38:47.369602 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:38:47.370901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-430650264/tls.crt::/tmp/serving-cert-430650264/tls.key\\\\\\\"\\\\nI0202 10:38:53.040764 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:38:53.043210 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:38:53.043233 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:38:53.043255 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:38:53.043260 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:38:53.047991 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0202 10:38:53.048006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 10:38:53.048019 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:38:53.048038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:38:53.048042 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:38:53.048046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 10:38:53.050356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.667658 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edf90eee-a04d-48ad-957f-8710016e2388\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3416be5e186cc0595234f36c8a59e1c2edadb59a8abe4121b274ea7dea70815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac57745af3759e45361b7c59e94701e4b74dcbfa259eb2e8690578d5b2fa601\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6c9912bb4f435743eaadda76c0ef0fbc6492b740836407b274a89801ec5082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.686904 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7569b03b-2368-42f8-9da7-bbe72f5a21c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2828a61512e86689adb0a4e4f64eec20ab53ca87cdb2cf6c17e39a93879d9890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1b664a2819886c06964f67db41f98a7dac531113f4263eaa8162c09a7d65f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c40acb59180d4010eb77503b43609af90e6bde8e3348a6475bf5133ceff68cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaea00fb2c2d213c6ecf68803b14451f10a2d8cc7c345ef9286face6e17a967c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aaea00fb2c2d213c6ecf68803b14451f10a2d8cc7c345ef9286face6e17a967c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.706268 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.706315 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.706326 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.706345 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 
10:39:22.706360 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:22Z","lastTransitionTime":"2026-02-02T10:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.808822 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.808900 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.808918 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.808933 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.808942 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:22Z","lastTransitionTime":"2026-02-02T10:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.910779 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.910850 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.910862 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.910876 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:22 crc kubenswrapper[4901]: I0202 10:39:22.910886 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:22Z","lastTransitionTime":"2026-02-02T10:39:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.013919 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.013991 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.014002 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.014017 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.014029 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:23Z","lastTransitionTime":"2026-02-02T10:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.083078 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vm8h5_a3390481-846a-4742-9eae-0796b667897f/ovnkube-controller/2.log" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.089329 4901 scope.go:117] "RemoveContainer" containerID="67d8349fde38f0e6a381c1f42da08869dfdcec8bc8a35b7f44b49a2a80aa1a2c" Feb 02 10:39:23 crc kubenswrapper[4901]: E0202 10:39:23.089649 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-vm8h5_openshift-ovn-kubernetes(a3390481-846a-4742-9eae-0796b667897f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" podUID="a3390481-846a-4742-9eae-0796b667897f" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.116335 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a172847249ecfe8e0ac94127f58173e760dfe5210cc51a12ca593b2882b46e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-flw48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.116805 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.116853 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:23 crc 
kubenswrapper[4901]: I0202 10:39:23.118119 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.118209 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.118230 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:23Z","lastTransitionTime":"2026-02-02T10:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.146943 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3390481-846a-4742-9eae-0796b667897f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67d8349fde38f0e6a381c1f42da08869dfdcec8b
c8a35b7f44b49a2a80aa1a2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d8349fde38f0e6a381c1f42da08869dfdcec8bc8a35b7f44b49a2a80aa1a2c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:39:21Z\\\",\\\"message\\\":\\\"9103\\\\\\\"\\\\nF0202 10:39:21.720419 6520 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:21Z is after 2025-08-24T17:21:41Z]\\\\nI0202 10:39:21.713537 6520 services_controller.go:434] Service openshift-network-diagnostics/network-check-target retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{network-check-target openshift-network-diagnostics 3e2ce0c7-84ea-44e4-bf4a-d2f8388134f5 2812 0 2025-02-23 05:21:38 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[] map[] [{operator.openshift.io/v1 Network cluster 8d01ddba-7e05-4639-926a-4485de3b6327 0xc0075872d7 0xc0075872d8}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:,Protocol:TCP,Port:80,TargetPort:{0 8080 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: network-check-target,},Clust\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vm8h5_openshift-ovn-kubernetes(a3390481-846a-4742-9eae-0796b667897f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vm8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.165763 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb836b18e6cb07844e8dfb32c28798089fc4fbc5b7f89ac3c1d595c8dd6de6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa3b8a5db87f56250b50a5c703beff1e6b11e10f1e11271ea4c7363b262d52d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.183266 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.200467 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43352b63ae7151aacd58a3134e927aa7494db450772763086a8b783025a3e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.223773 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b226219c2a55a743db37b0dd98c110a54a0d2a1b2602aa122c7d9418198fb9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.229634 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.229662 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.229672 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.229690 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.229702 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:23Z","lastTransitionTime":"2026-02-02T10:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.242305 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"756c113d-5d5e-424e-bdf5-494b7774def6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8607482cdf03bfd2b59dcac8300e55292f3bb5b3661f11eddc30a5face19194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b772f8291c7ba36ce56d88dfbde16bf9344276419544da4c966fb2bbee6e04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f29d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.254967 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e34f1db7-7f2a-4252-af0b-49fa172495f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eefab588023c2d4a4d15786ba41c1b66a121eeef18edda4bdcc02d162975b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.269663 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5q92h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29de6223de90bbfe98956d36ce20b9b57b319a5a825118f8eeabf13148e4c9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q2pl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5q92h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.281995 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.295203 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.309542 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8l22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dac970814cc06ebc0edf155553dce7d9579fd22f74f5174407d29850a07a4c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhwqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ba40e4419775afd21720866484ec29442dc564c1901d4ebc697b23c283c0d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhwqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:39:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s8l22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:23Z is after 2025-08-24T17:21:41Z" Feb 02 
10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.323453 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fmjwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b96d903e-a64c-4321-8963-482d4b579e30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtz59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtz59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:39:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fmjwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.333163 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.333350 4901 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.333455 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.333558 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.333677 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:23Z","lastTransitionTime":"2026-02-02T10:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.336205 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mcs8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8281e736-98f0-4282-b362-b55fd3d2810f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cdfc1733f1758eb8435b80dd92fe579c318124c1a99c1611ef2d66300eca26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pwxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mcs8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-02-02T10:39:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.353906 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a1308eb-aead-41f7-b2b1-c92d80970304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\
\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0202 10:38:47.369602 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:38:47.370901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-430650264/tls.crt::/tmp/serving-cert-430650264/tls.key\\\\\\\"\\\\nI0202 10:38:53.040764 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:38:53.043210 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:38:53.043233 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:38:53.043255 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:38:53.043260 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:38:53.047991 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0202 10:38:53.048006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 10:38:53.048019 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:38:53.048038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:38:53.048042 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:38:53.048046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 10:38:53.050356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.368067 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edf90eee-a04d-48ad-957f-8710016e2388\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3416be5e186cc0595234f36c8a59e1c2edadb59a8abe4121b274ea7dea70815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac57745af3759e45361b7c59e94701e4b74dcbfa259eb2e8690578d5b2fa601\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6c9912bb4f435743eaadda76c0ef0fbc6492b740836407b274a89801ec5082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.383908 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7569b03b-2368-42f8-9da7-bbe72f5a21c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2828a61512e86689adb0a4e4f64eec20ab53ca87cdb2cf6c17e39a93879d9890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1b664a2819886c06964f67db41f98a7dac531113f4263eaa8162c09a7d65f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c40acb59180d4010eb77503b43609af90e6bde8e3348a6475bf5133ceff68cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaea00fb2c2d213c6ecf68803b14451f10a2d8cc7c345ef9286face6e17a967c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aaea00fb2c2d213c6ecf68803b14451f10a2d8cc7c345ef9286face6e17a967c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.437248 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.437299 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.437310 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.437328 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 
10:39:23.437339 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:23Z","lastTransitionTime":"2026-02-02T10:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.540904 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.540980 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.540998 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.541019 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.541034 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:23Z","lastTransitionTime":"2026-02-02T10:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.621881 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 05:10:19.945113488 +0000 UTC Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.644699 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.644802 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.644827 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.644867 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.644896 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:23Z","lastTransitionTime":"2026-02-02T10:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.676388 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.676496 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.676716 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:39:23 crc kubenswrapper[4901]: E0202 10:39:23.676696 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:39:23 crc kubenswrapper[4901]: E0202 10:39:23.676903 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.676963 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:39:23 crc kubenswrapper[4901]: E0202 10:39:23.677038 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:39:23 crc kubenswrapper[4901]: E0202 10:39:23.677252 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.691520 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb836b18e6cb07844e8dfb32c28798089fc4fbc5b7f89ac3c1d595c8dd6de6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa3b8a5db87f56250b50a5c703beff1e6b11e10f1e11271ea4c7363b262d52d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.704055 4901 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.719737 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.730908 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43352b63ae7151aacd58a3134e927aa7494db450772763086a8b783025a3e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.748155 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.748198 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.748212 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.748231 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.748245 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:23Z","lastTransitionTime":"2026-02-02T10:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.750284 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b226219c2a55a743db37b0dd98c110a54a0d2a1b2602aa122c7d9418198fb9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.766762 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"756c113d-5d5e-424e-bdf5-494b7774def6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8607482cdf03bfd2b59dcac8300e55292f3bb5b3661f11eddc30a5face19194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b772f8291c7ba36ce56d88dfbde16bf9344276419544da4c966fb2bbee6e04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f29d8\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.778462 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e34f1db7-7f2a-4252-af0b-49fa172495f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eefab588023c2d4a4d15786ba41c1b66a121eeef18edda4bdcc02d162975b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.797211 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5q92h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29de6223de90bbfe98956d36ce20b9b57b319a5a825118f8eeabf13148e4c9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q2pl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5q92h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.816488 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.830819 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8l22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dac970814cc06ebc0edf155553dce7d9579fd22f74f5174407d29850a07a4c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhwqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ba40e4419775afd21720866484ec29442dc564c1901d4ebc697b23c283c0d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhwqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:39:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s8l22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:23Z is after 2025-08-24T17:21:41Z" Feb 02 
10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.844754 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fmjwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b96d903e-a64c-4321-8963-482d4b579e30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtz59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtz59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:39:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fmjwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.850506 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.850744 4901 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.850884 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.851083 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.851223 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:23Z","lastTransitionTime":"2026-02-02T10:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.861751 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7569b03b-2368-42f8-9da7-bbe72f5a21c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2828a61512e86689adb0a4e4f64eec20ab53ca87cdb2cf6c17e39a93879d9890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1b664a2819886c06964f67db41f98a7dac531113f4263eaa8162c09a7d65f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c40acb59180d4010eb77503b43609af90e6bde8e3348a6475bf5133ceff68cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaea00fb2c2d213c6ecf68803b14451f10a2d8cc7c345ef9286face6e17a967c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aaea00fb2c2d213c6ecf68803b14451f10a2d8cc7c345ef9286face6e17a967c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.875513 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mcs8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8281e736-98f0-4282-b362-b55fd3d2810f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cdfc1733f1758eb8435b80dd92fe579c318124c1a99c1611ef2d66300eca26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pwxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mcs8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.895057 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a1308eb-aead-41f7-b2b1-c92d80970304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0202 10:38:47.369602 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:38:47.370901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-430650264/tls.crt::/tmp/serving-cert-430650264/tls.key\\\\\\\"\\\\nI0202 10:38:53.040764 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:38:53.043210 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:38:53.043233 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:38:53.043255 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:38:53.043260 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:38:53.047991 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0202 10:38:53.048006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 10:38:53.048019 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:38:53.048038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:38:53.048042 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:38:53.048046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 10:38:53.050356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.911925 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edf90eee-a04d-48ad-957f-8710016e2388\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3416be5e186cc0595234f36c8a59e1c2edadb59a8abe4121b274ea7dea70815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac57745af3759e45361b7c59e94701e4b74dcbfa259eb2e8690578d5b2fa601\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6c9912bb4f435743eaadda76c0ef0fbc6492b740836407b274a89801ec5082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.938245 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3390481-846a-4742-9eae-0796b667897f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67d8349fde38f0e6a381c1f42da08869dfdcec8b
c8a35b7f44b49a2a80aa1a2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d8349fde38f0e6a381c1f42da08869dfdcec8bc8a35b7f44b49a2a80aa1a2c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:39:21Z\\\",\\\"message\\\":\\\"9103\\\\\\\"\\\\nF0202 10:39:21.720419 6520 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:21Z is after 2025-08-24T17:21:41Z]\\\\nI0202 10:39:21.713537 6520 services_controller.go:434] Service openshift-network-diagnostics/network-check-target retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{network-check-target openshift-network-diagnostics 3e2ce0c7-84ea-44e4-bf4a-d2f8388134f5 2812 0 2025-02-23 05:21:38 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[] map[] [{operator.openshift.io/v1 Network cluster 8d01ddba-7e05-4639-926a-4485de3b6327 0xc0075872d7 0xc0075872d8}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:,Protocol:TCP,Port:80,TargetPort:{0 8080 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: network-check-target,},Clust\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vm8h5_openshift-ovn-kubernetes(a3390481-846a-4742-9eae-0796b667897f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vm8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.954463 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.954508 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.954520 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.954541 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.954555 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:23Z","lastTransitionTime":"2026-02-02T10:39:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:23 crc kubenswrapper[4901]: I0202 10:39:23.961443 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a172847249ecfe8e0ac94127f58173e760dfe5210cc51a12ca593b2882b46e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-flw48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.057797 4901 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.057864 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.057883 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.057914 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.057932 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:24Z","lastTransitionTime":"2026-02-02T10:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.161684 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.161834 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.161855 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.161884 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.161906 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:24Z","lastTransitionTime":"2026-02-02T10:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.264992 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.265028 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.265040 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.265059 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.265070 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:24Z","lastTransitionTime":"2026-02-02T10:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.368617 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.368680 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.368694 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.368713 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.368729 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:24Z","lastTransitionTime":"2026-02-02T10:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.471172 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.471234 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.471244 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.471266 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.471279 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:24Z","lastTransitionTime":"2026-02-02T10:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.575696 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.575771 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.575792 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.575821 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.575843 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:24Z","lastTransitionTime":"2026-02-02T10:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.622539 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 02:50:57.741247883 +0000 UTC Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.679757 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.679853 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.679874 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.679904 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.679926 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:24Z","lastTransitionTime":"2026-02-02T10:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.783215 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.783296 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.783314 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.783340 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.783358 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:24Z","lastTransitionTime":"2026-02-02T10:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.886822 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.886888 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.886905 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.886959 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.886978 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:24Z","lastTransitionTime":"2026-02-02T10:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.990340 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.990405 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.990427 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.990455 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:24 crc kubenswrapper[4901]: I0202 10:39:24.990488 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:24Z","lastTransitionTime":"2026-02-02T10:39:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.093548 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.093652 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.093666 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.093684 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.093725 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:25Z","lastTransitionTime":"2026-02-02T10:39:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.197098 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.197173 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.197193 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.197221 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.197240 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:25Z","lastTransitionTime":"2026-02-02T10:39:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.300999 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.301062 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.301079 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.301104 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.301123 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:25Z","lastTransitionTime":"2026-02-02T10:39:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.404929 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.405098 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.405189 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.405248 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.405273 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:25Z","lastTransitionTime":"2026-02-02T10:39:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.508906 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.508958 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.508975 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.509001 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.509020 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:25Z","lastTransitionTime":"2026-02-02T10:39:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.570113 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:39:25 crc kubenswrapper[4901]: E0202 10:39:25.570319 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:39:57.570286857 +0000 UTC m=+84.588626963 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.570383 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.570452 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.570529 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.570593 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:39:25 crc kubenswrapper[4901]: E0202 10:39:25.570635 4901 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:39:25 crc kubenswrapper[4901]: E0202 10:39:25.570731 4901 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:39:25 crc kubenswrapper[4901]: E0202 10:39:25.570750 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 10:39:25 crc kubenswrapper[4901]: E0202 10:39:25.570877 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:39:25 crc kubenswrapper[4901]: E0202 10:39:25.570893 4901 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 
10:39:25 crc kubenswrapper[4901]: E0202 10:39:25.570759 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:39:57.570724557 +0000 UTC m=+84.589064683 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:39:25 crc kubenswrapper[4901]: E0202 10:39:25.570948 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:39:57.570927912 +0000 UTC m=+84.589268018 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:39:25 crc kubenswrapper[4901]: E0202 10:39:25.570758 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 10:39:25 crc kubenswrapper[4901]: E0202 10:39:25.571016 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:39:25 crc kubenswrapper[4901]: E0202 10:39:25.571047 4901 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:39:25 crc kubenswrapper[4901]: E0202 10:39:25.570964 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-02 10:39:57.570956743 +0000 UTC m=+84.589296849 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:39:25 crc kubenswrapper[4901]: E0202 10:39:25.571184 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-02 10:39:57.571142157 +0000 UTC m=+84.589482313 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.612579 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.612630 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.612642 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.612658 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.612669 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:25Z","lastTransitionTime":"2026-02-02T10:39:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.623110 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 01:48:20.688682205 +0000 UTC Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.676892 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.676953 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.676959 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.676910 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:39:25 crc kubenswrapper[4901]: E0202 10:39:25.677086 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:39:25 crc kubenswrapper[4901]: E0202 10:39:25.677322 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:39:25 crc kubenswrapper[4901]: E0202 10:39:25.677544 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:39:25 crc kubenswrapper[4901]: E0202 10:39:25.677684 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.715379 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.715426 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.715450 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.715466 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.715478 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:25Z","lastTransitionTime":"2026-02-02T10:39:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.819755 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.819800 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.819809 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.819825 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.819845 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:25Z","lastTransitionTime":"2026-02-02T10:39:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.925602 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.925656 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.925683 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.925706 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.925723 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:25Z","lastTransitionTime":"2026-02-02T10:39:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:25 crc kubenswrapper[4901]: I0202 10:39:25.975530 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b96d903e-a64c-4321-8963-482d4b579e30-metrics-certs\") pod \"network-metrics-daemon-fmjwg\" (UID: \"b96d903e-a64c-4321-8963-482d4b579e30\") " pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:39:25 crc kubenswrapper[4901]: E0202 10:39:25.978901 4901 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:39:25 crc kubenswrapper[4901]: E0202 10:39:25.979078 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b96d903e-a64c-4321-8963-482d4b579e30-metrics-certs podName:b96d903e-a64c-4321-8963-482d4b579e30 nodeName:}" failed. No retries permitted until 2026-02-02 10:39:41.979045679 +0000 UTC m=+68.997385835 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b96d903e-a64c-4321-8963-482d4b579e30-metrics-certs") pod "network-metrics-daemon-fmjwg" (UID: "b96d903e-a64c-4321-8963-482d4b579e30") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.028425 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.028498 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.028516 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.028542 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.028607 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:26Z","lastTransitionTime":"2026-02-02T10:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.131814 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.131890 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.131906 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.131931 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.131950 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:26Z","lastTransitionTime":"2026-02-02T10:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.235168 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.235242 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.235262 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.235292 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.235312 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:26Z","lastTransitionTime":"2026-02-02T10:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.337981 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.338047 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.338071 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.338098 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.338120 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:26Z","lastTransitionTime":"2026-02-02T10:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.443632 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.443702 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.443713 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.443737 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.443749 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:26Z","lastTransitionTime":"2026-02-02T10:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.547529 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.547621 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.547639 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.547664 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.547681 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:26Z","lastTransitionTime":"2026-02-02T10:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.624256 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 05:16:14.571112212 +0000 UTC Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.651011 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.651078 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.651095 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.651122 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.651140 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:26Z","lastTransitionTime":"2026-02-02T10:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.754312 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.754361 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.754373 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.754391 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.754404 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:26Z","lastTransitionTime":"2026-02-02T10:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.857903 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.857957 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.857975 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.858000 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.858017 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:26Z","lastTransitionTime":"2026-02-02T10:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.961616 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.961690 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.961715 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.961744 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:26 crc kubenswrapper[4901]: I0202 10:39:26.961783 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:26Z","lastTransitionTime":"2026-02-02T10:39:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.066679 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.066737 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.066754 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.066781 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.066798 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:27Z","lastTransitionTime":"2026-02-02T10:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.169798 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.169867 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.169880 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.169897 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.169910 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:27Z","lastTransitionTime":"2026-02-02T10:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.272634 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.272723 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.272747 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.272788 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.272812 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:27Z","lastTransitionTime":"2026-02-02T10:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.379352 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.379398 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.379409 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.379424 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.379436 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:27Z","lastTransitionTime":"2026-02-02T10:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.482195 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.482255 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.482279 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.482299 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.482312 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:27Z","lastTransitionTime":"2026-02-02T10:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.584442 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.584496 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.584510 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.584529 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.584543 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:27Z","lastTransitionTime":"2026-02-02T10:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.625301 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 06:40:23.711720218 +0000 UTC Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.675879 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.675957 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:39:27 crc kubenswrapper[4901]: E0202 10:39:27.676023 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.675878 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.675905 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:39:27 crc kubenswrapper[4901]: E0202 10:39:27.676110 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:39:27 crc kubenswrapper[4901]: E0202 10:39:27.676252 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:39:27 crc kubenswrapper[4901]: E0202 10:39:27.676334 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.686490 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.686553 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.686598 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.686621 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.686637 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:27Z","lastTransitionTime":"2026-02-02T10:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.790166 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.790231 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.790248 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.790313 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.790331 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:27Z","lastTransitionTime":"2026-02-02T10:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.893132 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.893243 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.893258 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.893299 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.894173 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:27Z","lastTransitionTime":"2026-02-02T10:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.996558 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.996612 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.996622 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.996637 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:27 crc kubenswrapper[4901]: I0202 10:39:27.996646 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:27Z","lastTransitionTime":"2026-02-02T10:39:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.099096 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.099145 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.099157 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.099174 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.099186 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:28Z","lastTransitionTime":"2026-02-02T10:39:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.202001 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.202074 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.202092 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.202124 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.202144 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:28Z","lastTransitionTime":"2026-02-02T10:39:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.304002 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.304069 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.304080 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.304096 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.304109 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:28Z","lastTransitionTime":"2026-02-02T10:39:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.408340 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.408399 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.408410 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.408428 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.408438 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:28Z","lastTransitionTime":"2026-02-02T10:39:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.513076 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.513367 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.514154 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.514376 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.514684 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:28Z","lastTransitionTime":"2026-02-02T10:39:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.618070 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.618132 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.618155 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.618178 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.618195 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:28Z","lastTransitionTime":"2026-02-02T10:39:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.626471 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 12:36:05.925380752 +0000 UTC Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.720817 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.720857 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.720870 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.720887 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.720897 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:28Z","lastTransitionTime":"2026-02-02T10:39:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.823180 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.823224 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.823233 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.823249 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.823259 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:28Z","lastTransitionTime":"2026-02-02T10:39:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.926124 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.926193 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.926214 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.926317 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:28 crc kubenswrapper[4901]: I0202 10:39:28.926338 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:28Z","lastTransitionTime":"2026-02-02T10:39:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.029280 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.029339 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.029356 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.029380 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.029401 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:29Z","lastTransitionTime":"2026-02-02T10:39:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.131490 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.131528 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.131536 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.131549 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.131572 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:29Z","lastTransitionTime":"2026-02-02T10:39:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.235235 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.235266 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.235274 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.235287 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.235295 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:29Z","lastTransitionTime":"2026-02-02T10:39:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.337857 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.337947 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.337966 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.337991 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.338010 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:29Z","lastTransitionTime":"2026-02-02T10:39:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.441347 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.441415 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.441437 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.441472 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.441497 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:29Z","lastTransitionTime":"2026-02-02T10:39:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.544785 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.544884 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.544914 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.544944 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.544963 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:29Z","lastTransitionTime":"2026-02-02T10:39:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.626897 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 13:56:26.5472784 +0000 UTC Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.649108 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.649524 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.649733 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.649924 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.650062 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:29Z","lastTransitionTime":"2026-02-02T10:39:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.676539 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.676659 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.676686 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.676707 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:39:29 crc kubenswrapper[4901]: E0202 10:39:29.676789 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:39:29 crc kubenswrapper[4901]: E0202 10:39:29.676982 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:39:29 crc kubenswrapper[4901]: E0202 10:39:29.677123 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:39:29 crc kubenswrapper[4901]: E0202 10:39:29.677260 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.755257 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.755299 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.755310 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.755326 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.755338 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:29Z","lastTransitionTime":"2026-02-02T10:39:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.860348 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.860435 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.860458 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.860489 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.860509 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:29Z","lastTransitionTime":"2026-02-02T10:39:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.963852 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.963928 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.963944 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.963974 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:29 crc kubenswrapper[4901]: I0202 10:39:29.963994 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:29Z","lastTransitionTime":"2026-02-02T10:39:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.067295 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.067353 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.067368 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.067388 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.067405 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:30Z","lastTransitionTime":"2026-02-02T10:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.169932 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.170003 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.170020 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.170050 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.170068 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:30Z","lastTransitionTime":"2026-02-02T10:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.273817 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.273891 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.273911 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.273939 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.273958 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:30Z","lastTransitionTime":"2026-02-02T10:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.377690 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.377765 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.377783 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.377810 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.377830 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:30Z","lastTransitionTime":"2026-02-02T10:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.481001 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.481075 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.481097 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.481127 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.481147 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:30Z","lastTransitionTime":"2026-02-02T10:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:30 crc kubenswrapper[4901]: E0202 10:39:30.495325 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f285d71e-6d99-440a-8549-56e6a3710e3e\\\",\\\"systemUUID\\\":\\\"9f2b45ef-ead6-4cce-86c8-26b6d26ee095\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:30Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.501316 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.501368 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.501386 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.501409 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.501425 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:30Z","lastTransitionTime":"2026-02-02T10:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:30 crc kubenswrapper[4901]: E0202 10:39:30.516049 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f285d71e-6d99-440a-8549-56e6a3710e3e\\\",\\\"systemUUID\\\":\\\"9f2b45ef-ead6-4cce-86c8-26b6d26ee095\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:30Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.521952 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.522009 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.522025 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.522051 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.522069 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:30Z","lastTransitionTime":"2026-02-02T10:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:30 crc kubenswrapper[4901]: E0202 10:39:30.542995 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f285d71e-6d99-440a-8549-56e6a3710e3e\\\",\\\"systemUUID\\\":\\\"9f2b45ef-ead6-4cce-86c8-26b6d26ee095\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:30Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.549714 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.549777 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.549792 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.549813 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.549826 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:30Z","lastTransitionTime":"2026-02-02T10:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:30 crc kubenswrapper[4901]: E0202 10:39:30.604541 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f285d71e-6d99-440a-8549-56e6a3710e3e\\\",\\\"systemUUID\\\":\\\"9f2b45ef-ead6-4cce-86c8-26b6d26ee095\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:30Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.609825 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.609876 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.609887 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.609904 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.609915 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:30Z","lastTransitionTime":"2026-02-02T10:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:30 crc kubenswrapper[4901]: E0202 10:39:30.622866 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f285d71e-6d99-440a-8549-56e6a3710e3e\\\",\\\"systemUUID\\\":\\\"9f2b45ef-ead6-4cce-86c8-26b6d26ee095\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:30Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:30 crc kubenswrapper[4901]: E0202 10:39:30.622994 4901 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.624984 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.625017 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.625026 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.625043 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.625054 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:30Z","lastTransitionTime":"2026-02-02T10:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.627359 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 10:53:18.587950467 +0000 UTC Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.728217 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.728274 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.728286 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.728302 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.728313 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:30Z","lastTransitionTime":"2026-02-02T10:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.832409 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.832494 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.832514 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.832544 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.833010 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:30Z","lastTransitionTime":"2026-02-02T10:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.935854 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.935973 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.935992 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.936024 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:30 crc kubenswrapper[4901]: I0202 10:39:30.936044 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:30Z","lastTransitionTime":"2026-02-02T10:39:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.039252 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.039333 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.039418 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.039507 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.039537 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:31Z","lastTransitionTime":"2026-02-02T10:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.142606 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.142690 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.142712 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.142741 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.142761 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:31Z","lastTransitionTime":"2026-02-02T10:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.247458 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.247544 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.247592 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.247630 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.247651 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:31Z","lastTransitionTime":"2026-02-02T10:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.351150 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.351231 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.351253 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.351284 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.351305 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:31Z","lastTransitionTime":"2026-02-02T10:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.455305 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.455391 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.455415 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.455447 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.455469 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:31Z","lastTransitionTime":"2026-02-02T10:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.559687 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.559947 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.559969 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.560003 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.560026 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:31Z","lastTransitionTime":"2026-02-02T10:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.628007 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 03:50:10.06990245 +0000 UTC Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.663447 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.663497 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.663509 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.663527 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.663540 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:31Z","lastTransitionTime":"2026-02-02T10:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.676061 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.676092 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.676273 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.676457 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:39:31 crc kubenswrapper[4901]: E0202 10:39:31.676445 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:39:31 crc kubenswrapper[4901]: E0202 10:39:31.676669 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:39:31 crc kubenswrapper[4901]: E0202 10:39:31.676751 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:39:31 crc kubenswrapper[4901]: E0202 10:39:31.677004 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.767774 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.767854 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.767869 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.767898 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.767916 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:31Z","lastTransitionTime":"2026-02-02T10:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.871521 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.871701 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.871731 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.871778 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.871805 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:31Z","lastTransitionTime":"2026-02-02T10:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.975250 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.975314 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.975328 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.975351 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:31 crc kubenswrapper[4901]: I0202 10:39:31.975365 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:31Z","lastTransitionTime":"2026-02-02T10:39:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.078014 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.078088 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.078107 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.078133 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.078151 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:32Z","lastTransitionTime":"2026-02-02T10:39:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.180408 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.180471 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.180489 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.180514 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.180532 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:32Z","lastTransitionTime":"2026-02-02T10:39:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.284555 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.284655 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.284672 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.284699 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.284718 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:32Z","lastTransitionTime":"2026-02-02T10:39:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.387954 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.388001 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.388015 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.388033 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.388045 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:32Z","lastTransitionTime":"2026-02-02T10:39:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.491121 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.491169 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.491179 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.491194 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.491204 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:32Z","lastTransitionTime":"2026-02-02T10:39:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.594697 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.594762 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.594781 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.594815 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.594833 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:32Z","lastTransitionTime":"2026-02-02T10:39:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.628461 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 18:50:41.770814367 +0000 UTC Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.697376 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.697417 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.697426 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.697441 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.697452 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:32Z","lastTransitionTime":"2026-02-02T10:39:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.800704 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.800755 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.800768 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.800794 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.800807 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:32Z","lastTransitionTime":"2026-02-02T10:39:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.904511 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.904651 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.904692 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.904727 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:32 crc kubenswrapper[4901]: I0202 10:39:32.904757 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:32Z","lastTransitionTime":"2026-02-02T10:39:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.007761 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.007823 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.007846 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.007873 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.007900 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:33Z","lastTransitionTime":"2026-02-02T10:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.111750 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.111814 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.111830 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.111854 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.111872 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:33Z","lastTransitionTime":"2026-02-02T10:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.215267 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.215335 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.215354 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.215379 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.215397 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:33Z","lastTransitionTime":"2026-02-02T10:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.318725 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.318764 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.318775 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.318789 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.318850 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:33Z","lastTransitionTime":"2026-02-02T10:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.422759 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.422809 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.422822 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.422839 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.422852 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:33Z","lastTransitionTime":"2026-02-02T10:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.526038 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.526215 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.526236 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.526262 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.526280 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:33Z","lastTransitionTime":"2026-02-02T10:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.629403 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 10:16:41.52960591 +0000 UTC Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.630226 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.630301 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.630323 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.630353 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.630376 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:33Z","lastTransitionTime":"2026-02-02T10:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.676264 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.676357 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.676404 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.676536 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:39:33 crc kubenswrapper[4901]: E0202 10:39:33.676519 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:39:33 crc kubenswrapper[4901]: E0202 10:39:33.676720 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:39:33 crc kubenswrapper[4901]: E0202 10:39:33.676782 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:39:33 crc kubenswrapper[4901]: E0202 10:39:33.676928 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.693924 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mcs8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8281e736-98f0-4282-b362-b55fd3d2810f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cdfc1733f1758eb8435b80dd92fe579c318124c1a99c1611ef2d66300eca26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pwxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mcs8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.720822 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a1308eb-aead-41f7-b2b1-c92d80970304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0202 10:38:47.369602 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:38:47.370901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-430650264/tls.crt::/tmp/serving-cert-430650264/tls.key\\\\\\\"\\\\nI0202 10:38:53.040764 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:38:53.043210 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:38:53.043233 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:38:53.043255 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:38:53.043260 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:38:53.047991 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0202 10:38:53.048006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 10:38:53.048019 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:38:53.048038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:38:53.048042 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:38:53.048046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 10:38:53.050356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.734255 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.734313 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.734330 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.734352 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.734368 4901 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:33Z","lastTransitionTime":"2026-02-02T10:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.741422 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edf90eee-a04d-48ad-957f-8710016e2388\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3416be5e186cc0595234f36c8a59e1c2edadb59a8abe4121b274ea7dea70815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac57745af3759e45361b7c59e94701e4b74dcbfa259eb2e8690578d5b2fa601\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6c9912bb4f435743eaadda76c0ef0fbc6492b740836407b274a89801ec5082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.759593 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7569b03b-2368-42f8-9da7-bbe72f5a21c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2828a61512e86689adb0a4e4f64eec20ab53ca87cdb2cf6c17e39a93879d9890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1b664a2819886c06964f67db41f98a7dac531113f4263eaa8162c09a7d65f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c40acb59180d4010eb77503b43609af90e6bde8e3348a6475bf5133ceff68cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaea00fb2c2d213c6ecf68803b14451f10a2d8cc7c345ef9286face6e17a967c\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aaea00fb2c2d213c6ecf68803b14451f10a2d8cc7c345ef9286face6e17a967c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.779435 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a172847249ecfe8e0ac94127f58173e760dfe5210cc51a12ca593b2882b46e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f
8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\"
,\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-flw48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.808144 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3390481-846a-4742-9eae-0796b667897f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67d8349fde38f0e6a381c1f42da08869dfdcec8b
c8a35b7f44b49a2a80aa1a2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d8349fde38f0e6a381c1f42da08869dfdcec8bc8a35b7f44b49a2a80aa1a2c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:39:21Z\\\",\\\"message\\\":\\\"9103\\\\\\\"\\\\nF0202 10:39:21.720419 6520 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:21Z is after 2025-08-24T17:21:41Z]\\\\nI0202 10:39:21.713537 6520 services_controller.go:434] Service openshift-network-diagnostics/network-check-target retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{network-check-target openshift-network-diagnostics 3e2ce0c7-84ea-44e4-bf4a-d2f8388134f5 2812 0 2025-02-23 05:21:38 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[] map[] [{operator.openshift.io/v1 Network cluster 8d01ddba-7e05-4639-926a-4485de3b6327 0xc0075872d7 0xc0075872d8}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:,Protocol:TCP,Port:80,TargetPort:{0 8080 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: network-check-target,},Clust\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vm8h5_openshift-ovn-kubernetes(a3390481-846a-4742-9eae-0796b667897f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vm8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.826099 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb836b18e6cb07844e8dfb32c28798089fc4fbc5b7f89ac3c1d595c8dd6de6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa3b8a5db87f56250b50a5c703beff1e6b11e10f1e11271ea4c7363b262d52d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.836327 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.836357 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.836365 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.836379 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.836388 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:33Z","lastTransitionTime":"2026-02-02T10:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.839606 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.854082 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43352b63ae7151aacd58a3134e927aa7494db450772763086a8b783025a3e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.868544 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b226219c2a55a743db37b0dd98c110a54a0d2a1b2602aa122c7d9418198fb9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.882417 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"756c113d-5d5e-424e-bdf5-494b7774def6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8607482cdf03bfd2b59dcac8300e55292f3bb5b3661f11eddc30a5face19194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b772f8291c7ba36ce56d88dfbde16bf9344276419544da4c966fb2bbee6e04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f29d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.897650 4901 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-5xj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e34f1db7-7f2a-4252-af0b-49fa172495f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eefab588023c2d4a4d15786ba41c1b66a121eeef18edda4bdcc02d162975b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.910868 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5q92h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29de6223de90bbfe98956d36ce20b9b57b319a5a825118f8eeabf13148e4c9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q2pl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5q92h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.924759 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.937630 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.947668 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.947728 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.947744 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.947766 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.947785 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:33Z","lastTransitionTime":"2026-02-02T10:39:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.952764 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8l22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dac970814cc06ebc0edf155553dce7d9579fd22f74f5174407d29850a07a4c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhwqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ba40e4419775afd21720866484ec29442dc564c1901d4ebc697b23c283c0d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhwqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:39:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s8l22\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:33 crc kubenswrapper[4901]: I0202 10:39:33.967893 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fmjwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b96d903e-a64c-4321-8963-482d4b579e30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtz59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtz59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:39:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fmjwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:34 crc 
kubenswrapper[4901]: I0202 10:39:34.050018 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.050097 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.050123 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.050153 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.050179 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:34Z","lastTransitionTime":"2026-02-02T10:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.153904 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.154392 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.154570 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.154799 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.154966 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:34Z","lastTransitionTime":"2026-02-02T10:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.258639 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.258704 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.258724 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.258747 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.258766 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:34Z","lastTransitionTime":"2026-02-02T10:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.362226 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.362285 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.362305 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.362328 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.362347 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:34Z","lastTransitionTime":"2026-02-02T10:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.465408 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.465527 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.465547 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.465600 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.465619 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:34Z","lastTransitionTime":"2026-02-02T10:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.569275 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.569349 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.569369 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.569399 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.569424 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:34Z","lastTransitionTime":"2026-02-02T10:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.630536 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 04:31:56.606485415 +0000 UTC Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.673018 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.673093 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.673109 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.673140 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.673157 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:34Z","lastTransitionTime":"2026-02-02T10:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.776071 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.776158 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.776177 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.776212 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.776233 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:34Z","lastTransitionTime":"2026-02-02T10:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.880752 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.880804 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.880818 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.880840 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.880858 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:34Z","lastTransitionTime":"2026-02-02T10:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.983646 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.983680 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.983688 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.983701 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:34 crc kubenswrapper[4901]: I0202 10:39:34.983710 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:34Z","lastTransitionTime":"2026-02-02T10:39:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.086252 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.086300 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.086310 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.086326 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.086335 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:35Z","lastTransitionTime":"2026-02-02T10:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.189080 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.189145 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.189163 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.189205 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.189216 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:35Z","lastTransitionTime":"2026-02-02T10:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.292685 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.292732 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.292754 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.292791 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.292809 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:35Z","lastTransitionTime":"2026-02-02T10:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.395878 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.395961 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.395986 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.396015 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.396034 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:35Z","lastTransitionTime":"2026-02-02T10:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.502324 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.502413 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.502480 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.502520 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.502719 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:35Z","lastTransitionTime":"2026-02-02T10:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.606749 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.606806 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.606906 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.606929 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.606942 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:35Z","lastTransitionTime":"2026-02-02T10:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.630966 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 18:24:27.284981733 +0000 UTC Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.676214 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:39:35 crc kubenswrapper[4901]: E0202 10:39:35.676418 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.676517 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:39:35 crc kubenswrapper[4901]: E0202 10:39:35.676617 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.676959 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:39:35 crc kubenswrapper[4901]: E0202 10:39:35.677022 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.677454 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:39:35 crc kubenswrapper[4901]: E0202 10:39:35.677503 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.710001 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.710047 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.710062 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.710083 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.710095 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:35Z","lastTransitionTime":"2026-02-02T10:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.812800 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.812844 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.812853 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.812867 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.812876 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:35Z","lastTransitionTime":"2026-02-02T10:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.915884 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.915915 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.915925 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.915937 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:35 crc kubenswrapper[4901]: I0202 10:39:35.915945 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:35Z","lastTransitionTime":"2026-02-02T10:39:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.019158 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.019202 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.019210 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.019224 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.019235 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:36Z","lastTransitionTime":"2026-02-02T10:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.121618 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.121671 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.121680 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.121694 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.121703 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:36Z","lastTransitionTime":"2026-02-02T10:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.225097 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.225156 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.225174 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.225203 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.225223 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:36Z","lastTransitionTime":"2026-02-02T10:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.329062 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.329138 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.329159 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.329191 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.329217 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:36Z","lastTransitionTime":"2026-02-02T10:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.433101 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.433154 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.433173 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.433199 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.433217 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:36Z","lastTransitionTime":"2026-02-02T10:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.537257 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.537330 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.537348 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.537380 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.537400 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:36Z","lastTransitionTime":"2026-02-02T10:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.631240 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 02:17:44.60657061 +0000 UTC Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.639995 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.640023 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.640033 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.640047 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.640057 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:36Z","lastTransitionTime":"2026-02-02T10:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.742297 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.742329 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.742337 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.742349 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.742357 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:36Z","lastTransitionTime":"2026-02-02T10:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.844397 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.844426 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.844433 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.844446 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.844456 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:36Z","lastTransitionTime":"2026-02-02T10:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.947385 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.947433 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.947446 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.947461 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:36 crc kubenswrapper[4901]: I0202 10:39:36.947471 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:36Z","lastTransitionTime":"2026-02-02T10:39:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.050493 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.050532 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.050540 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.050553 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.050589 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:37Z","lastTransitionTime":"2026-02-02T10:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.153743 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.153789 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.153801 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.153815 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.153826 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:37Z","lastTransitionTime":"2026-02-02T10:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.256383 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.256431 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.256442 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.256458 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.256470 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:37Z","lastTransitionTime":"2026-02-02T10:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.374151 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.374205 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.374223 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.374246 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.374266 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:37Z","lastTransitionTime":"2026-02-02T10:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.476748 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.476811 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.476834 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.476912 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.476931 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:37Z","lastTransitionTime":"2026-02-02T10:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.580313 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.580356 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.580366 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.580384 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.580395 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:37Z","lastTransitionTime":"2026-02-02T10:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.631535 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 03:28:35.420132497 +0000 UTC Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.676217 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:39:37 crc kubenswrapper[4901]: E0202 10:39:37.676391 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.677399 4901 scope.go:117] "RemoveContainer" containerID="67d8349fde38f0e6a381c1f42da08869dfdcec8bc8a35b7f44b49a2a80aa1a2c" Feb 02 10:39:37 crc kubenswrapper[4901]: E0202 10:39:37.677709 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-vm8h5_openshift-ovn-kubernetes(a3390481-846a-4742-9eae-0796b667897f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" podUID="a3390481-846a-4742-9eae-0796b667897f" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.677876 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:39:37 crc kubenswrapper[4901]: E0202 10:39:37.677947 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.678100 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:39:37 crc kubenswrapper[4901]: E0202 10:39:37.678184 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.678983 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:39:37 crc kubenswrapper[4901]: E0202 10:39:37.679054 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.683135 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.683210 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.683227 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.683299 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.683317 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:37Z","lastTransitionTime":"2026-02-02T10:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.787242 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.787296 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.787318 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.787341 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.787358 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:37Z","lastTransitionTime":"2026-02-02T10:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.889734 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.889771 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.889798 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.889812 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.889822 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:37Z","lastTransitionTime":"2026-02-02T10:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.992862 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.992921 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.992937 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.992959 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:37 crc kubenswrapper[4901]: I0202 10:39:37.992975 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:37Z","lastTransitionTime":"2026-02-02T10:39:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.106745 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.106822 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.106882 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.106912 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.106938 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:38Z","lastTransitionTime":"2026-02-02T10:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.209899 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.209944 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.209954 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.209969 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.209981 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:38Z","lastTransitionTime":"2026-02-02T10:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.312731 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.312774 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.312784 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.312800 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.312811 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:38Z","lastTransitionTime":"2026-02-02T10:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.415523 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.415568 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.415593 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.415610 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.415622 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:38Z","lastTransitionTime":"2026-02-02T10:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.518637 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.518693 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.518706 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.518723 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.518733 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:38Z","lastTransitionTime":"2026-02-02T10:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.622198 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.622237 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.622248 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.622263 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.622273 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:38Z","lastTransitionTime":"2026-02-02T10:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.632634 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 18:20:42.326555297 +0000 UTC Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.724499 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.724538 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.724547 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.724564 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.724573 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:38Z","lastTransitionTime":"2026-02-02T10:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.826789 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.826830 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.826841 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.826857 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.826873 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:38Z","lastTransitionTime":"2026-02-02T10:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.929287 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.929338 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.929347 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.929361 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:38 crc kubenswrapper[4901]: I0202 10:39:38.929370 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:38Z","lastTransitionTime":"2026-02-02T10:39:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.031969 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.032016 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.032027 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.032044 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.032055 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:39Z","lastTransitionTime":"2026-02-02T10:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.135198 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.135229 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.135237 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.135250 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.135257 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:39Z","lastTransitionTime":"2026-02-02T10:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.237489 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.237523 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.237531 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.237547 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.237560 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:39Z","lastTransitionTime":"2026-02-02T10:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.340751 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.340801 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.340815 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.340834 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.340846 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:39Z","lastTransitionTime":"2026-02-02T10:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.443744 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.443808 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.443828 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.443848 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.443862 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:39Z","lastTransitionTime":"2026-02-02T10:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.546343 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.546384 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.546393 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.546406 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.546418 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:39Z","lastTransitionTime":"2026-02-02T10:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.632774 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 12:25:19.373841353 +0000 UTC Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.648724 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.648776 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.648791 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.648813 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.648824 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:39Z","lastTransitionTime":"2026-02-02T10:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.675735 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.675794 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.675816 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.675906 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:39:39 crc kubenswrapper[4901]: E0202 10:39:39.675906 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:39:39 crc kubenswrapper[4901]: E0202 10:39:39.675996 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:39:39 crc kubenswrapper[4901]: E0202 10:39:39.676109 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:39:39 crc kubenswrapper[4901]: E0202 10:39:39.676203 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.750732 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.750788 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.750797 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.750810 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.750819 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:39Z","lastTransitionTime":"2026-02-02T10:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
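All four pods are skipped for the same root cause the Ready condition reports: nothing in /etc/kubernetes/cni/net.d/. The readiness test behind that message boils down to whether libcni can load at least one network configuration from the directory. A simplified stand-in for that check (the real one also parses and validates the file contents):

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// hasCNIConfig reports whether dir contains at least one CNI network
// configuration file, approximating the check behind the kubelet's
// "no CNI configuration file" message (libcni considers .conf,
// .conflist and .json files and then validates their contents).
func hasCNIConfig(dir string) (bool, error) {
	entries, err := os.ReadDir(dir)
	if err != nil {
		return false, err
	}
	for _, e := range entries {
		if e.IsDir() {
			continue
		}
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			return true, nil
		}
	}
	return false, nil
}

func main() {
	ok, err := hasCNIConfig("/etc/kubernetes/cni/net.d")
	if err != nil {
		fmt.Println("cannot read CNI config dir:", err)
		return
	}
	if !ok {
		fmt.Println("no CNI configuration file found; network plugin stays NotReady")
		return
	}
	fmt.Println("CNI configuration present")
}
```

On OpenShift the configuration is normally dropped into that directory by the cluster network operator's pods (Multus/OVN-Kubernetes), so this loop is expected during startup and is only a problem if it never converges.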
Has your network provider started?"} Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.853473 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.853512 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.853521 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.853536 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.853545 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:39Z","lastTransitionTime":"2026-02-02T10:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.955750 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.955807 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.955819 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.955834 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:39 crc kubenswrapper[4901]: I0202 10:39:39.955847 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:39Z","lastTransitionTime":"2026-02-02T10:39:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.057908 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.057943 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.057951 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.057967 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.057977 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:40Z","lastTransitionTime":"2026-02-02T10:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.159973 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.160027 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.160042 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.160060 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.160071 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:40Z","lastTransitionTime":"2026-02-02T10:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.262185 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.262245 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.262257 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.262273 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.262283 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:40Z","lastTransitionTime":"2026-02-02T10:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.365013 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.365048 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.365057 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.365070 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.365079 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:40Z","lastTransitionTime":"2026-02-02T10:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.468516 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.468561 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.468582 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.468598 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.468608 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:40Z","lastTransitionTime":"2026-02-02T10:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.570746 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.570787 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.570799 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.570815 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.570826 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:40Z","lastTransitionTime":"2026-02-02T10:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
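Nothing changes across these iterations except the klog timestamps, so when scanning a flood like this it can help to parse the headers programmatically. Each kubenswrapper entry uses klog's layout Lmmdd hh:mm:ss.uuuuuu PID file:line] msg, where L is the severity (I, W, E or F). A small sketch, with the sample line copied from this log:

```go
package main

import (
	"fmt"
	"regexp"
)

// klogHeader matches klog's "Lmmdd hh:mm:ss.uuuuuu pid file:line] msg"
// header as emitted by kubenswrapper in the journal above.
var klogHeader = regexp.MustCompile(
	`^([IWEF])(\d{4}) (\d{2}:\d{2}:\d{2}\.\d+)\s+(\d+) ([\w.]+:\d+)\] (.*)$`)

func main() {
	line := `E0202 10:39:39.675906 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready"`
	m := klogHeader.FindStringSubmatch(line)
	if m == nil {
		fmt.Println("not a klog line")
		return
	}
	fmt.Printf("severity=%s date=%s time=%s pid=%s source=%s\n", m[1], m[2], m[3], m[4], m[5])
	fmt.Printf("message=%s\n", m[6])
}
```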
Has your network provider started?"} Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.633687 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 13:59:44.666735335 +0000 UTC Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.675482 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.675520 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.675528 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.675542 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.675552 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:40Z","lastTransitionTime":"2026-02-02T10:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.778159 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.778203 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.778212 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.778227 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.778236 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:40Z","lastTransitionTime":"2026-02-02T10:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
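Two details in this stretch are worth pulling apart. First, the certificate_manager lines at 10:39:39 and 10:39:40 report the same expiration (2026-02-24 05:53:03 UTC) but different rotation deadlines (2025-11-23, then 2025-11-10): client-go's certificate manager recomputes the deadline with fresh jitter each time, reportedly a uniform point at 70-90% of the certificate's lifetime, and both logged deadlines already lie in the past of the node clock, so rotation is due immediately. Second, the errors that follow show why the node status cannot be persisted: the network-node-identity webhook on 127.0.0.1:9743 presents a serving certificate that expired on 2025-08-24, while the node believes it is 2026-02-02. A sketch covering both computations (the certificate path is illustrative, and the 70-90% jitter band is an assumption about client-go internals, not something this log states):

```go
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"log"
	"math/rand"
	"os"
	"time"
)

func main() {
	// Illustrative path: point this at the serving cert presented on
	// 127.0.0.1:9743 (wherever network-node-identity keeps its TLS secret).
	pemBytes, err := os.ReadFile("/path/to/tls.crt")
	if err != nil {
		log.Fatal(err)
	}
	block, _ := pem.Decode(pemBytes)
	if block == nil {
		log.Fatal("no PEM block found")
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		log.Fatal(err)
	}

	now := time.Now()
	fmt.Println("NotBefore:", cert.NotBefore, "NotAfter:", cert.NotAfter)
	if now.After(cert.NotAfter) {
		// The condition behind "x509: certificate has expired or is not
		// yet valid: current time ... is after ..." seen below.
		fmt.Printf("expired: current time %s is after %s\n",
			now.Format(time.RFC3339), cert.NotAfter.Format(time.RFC3339))
	}

	// Rotation deadline in the style of client-go's certificate manager:
	// a uniformly jittered point at 70-90% of the certificate lifetime.
	// Treat the exact fractions as an assumption, not a contract.
	total := cert.NotAfter.Sub(cert.NotBefore)
	jittered := time.Duration(float64(total) * (0.7 + 0.2*rand.Float64()))
	fmt.Println("rotation deadline:", cert.NotBefore.Add(jittered))
}
```

Run twice, the sketch prints two different rotation deadlines for the same certificate, which is exactly the pattern the two certificate_manager lines show.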
Has your network provider started?"} Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.877614 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.877663 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.877673 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.877690 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.877700 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:40Z","lastTransitionTime":"2026-02-02T10:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:40 crc kubenswrapper[4901]: E0202 10:39:40.889239 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f285d71e-6d99-440a-8549-56e6a3710e3e\\\",\\\"systemUUID\\\":\\\"9f2b45ef-ead6-4cce-86c8-26b6d26ee095\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:40Z is after 
2025-08-24T17:21:41Z" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.892950 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.893019 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.893038 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.893064 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.893083 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:40Z","lastTransitionTime":"2026-02-02T10:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:40 crc kubenswrapper[4901]: E0202 10:39:40.905132 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f285d71e-6d99-440a-8549-56e6a3710e3e\\\",\\\"systemUUID\\\":\\\"9f2b45ef-ead6-4cce-86c8-26b6d26ee095\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:40Z is after 
2025-08-24T17:21:41Z" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.909648 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.909701 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.909713 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.909733 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.909744 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:40Z","lastTransitionTime":"2026-02-02T10:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:40 crc kubenswrapper[4901]: E0202 10:39:40.920673 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f285d71e-6d99-440a-8549-56e6a3710e3e\\\",\\\"systemUUID\\\":\\\"9f2b45ef-ead6-4cce-86c8-26b6d26ee095\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:40Z is after 
2025-08-24T17:21:41Z" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.926479 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.926523 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.926532 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.926549 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.926578 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:40Z","lastTransitionTime":"2026-02-02T10:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:40 crc kubenswrapper[4901]: E0202 10:39:40.937210 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f285d71e-6d99-440a-8549-56e6a3710e3e\\\",\\\"systemUUID\\\":\\\"9f2b45ef-ead6-4cce-86c8-26b6d26ee095\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:40Z is after 
2025-08-24T17:21:41Z" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.941463 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.941501 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.941514 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.941532 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.941546 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:40Z","lastTransitionTime":"2026-02-02T10:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:40 crc kubenswrapper[4901]: E0202 10:39:40.956484 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f285d71e-6d99-440a-8549-56e6a3710e3e\\\",\\\"systemUUID\\\":\\\"9f2b45ef-ead6-4cce-86c8-26b6d26ee095\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:40Z is after 
2025-08-24T17:21:41Z" Feb 02 10:39:40 crc kubenswrapper[4901]: E0202 10:39:40.956798 4901 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.958658 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.958809 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.958840 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.958888 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:40 crc kubenswrapper[4901]: I0202 10:39:40.958912 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:40Z","lastTransitionTime":"2026-02-02T10:39:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.061191 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.061274 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.061295 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.061326 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.061351 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:41Z","lastTransitionTime":"2026-02-02T10:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.163498 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.163532 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.163541 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.163556 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.163594 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:41Z","lastTransitionTime":"2026-02-02T10:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.265660 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.265777 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.265796 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.265820 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.265839 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:41Z","lastTransitionTime":"2026-02-02T10:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.368020 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.368066 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.368077 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.368093 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.368104 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:41Z","lastTransitionTime":"2026-02-02T10:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.470883 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.470944 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.470965 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.470994 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.471016 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:41Z","lastTransitionTime":"2026-02-02T10:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.573500 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.573553 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.573599 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.573624 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.573641 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:41Z","lastTransitionTime":"2026-02-02T10:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.634771 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 08:54:57.351043015 +0000 UTC Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.675521 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.675563 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.675589 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.675606 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.675618 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:41Z","lastTransitionTime":"2026-02-02T10:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.675640 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.675663 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.675677 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.675708 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:39:41 crc kubenswrapper[4901]: E0202 10:39:41.675738 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:39:41 crc kubenswrapper[4901]: E0202 10:39:41.675831 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:39:41 crc kubenswrapper[4901]: E0202 10:39:41.675942 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:39:41 crc kubenswrapper[4901]: E0202 10:39:41.676012 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.777870 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.777918 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.777928 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.777942 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.777954 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:41Z","lastTransitionTime":"2026-02-02T10:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.880757 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.880801 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.880811 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.880829 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.880840 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:41Z","lastTransitionTime":"2026-02-02T10:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.983425 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.983469 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.983478 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.983492 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.983502 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:41Z","lastTransitionTime":"2026-02-02T10:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:41 crc kubenswrapper[4901]: I0202 10:39:41.983921 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b96d903e-a64c-4321-8963-482d4b579e30-metrics-certs\") pod \"network-metrics-daemon-fmjwg\" (UID: \"b96d903e-a64c-4321-8963-482d4b579e30\") " pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:39:41 crc kubenswrapper[4901]: E0202 10:39:41.984038 4901 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:39:41 crc kubenswrapper[4901]: E0202 10:39:41.984086 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b96d903e-a64c-4321-8963-482d4b579e30-metrics-certs podName:b96d903e-a64c-4321-8963-482d4b579e30 nodeName:}" failed. No retries permitted until 2026-02-02 10:40:13.984073755 +0000 UTC m=+101.002413851 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b96d903e-a64c-4321-8963-482d4b579e30-metrics-certs") pod "network-metrics-daemon-fmjwg" (UID: "b96d903e-a64c-4321-8963-482d4b579e30") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.085675 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.085714 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.085723 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.085739 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.085748 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:42Z","lastTransitionTime":"2026-02-02T10:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.187327 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.187413 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.187425 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.187444 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.187457 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:42Z","lastTransitionTime":"2026-02-02T10:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.290378 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.290417 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.290425 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.290441 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.290450 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:42Z","lastTransitionTime":"2026-02-02T10:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.393762 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.393846 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.393863 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.393886 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.393904 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:42Z","lastTransitionTime":"2026-02-02T10:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.496751 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.496782 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.496792 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.496819 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.496830 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:42Z","lastTransitionTime":"2026-02-02T10:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.599464 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.599543 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.599555 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.599587 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.599598 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:42Z","lastTransitionTime":"2026-02-02T10:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.635820 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 07:11:48.36272221 +0000 UTC Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.701215 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.701246 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.701253 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.701266 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.701274 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:42Z","lastTransitionTime":"2026-02-02T10:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.803904 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.803938 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.803948 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.803963 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.803972 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:42Z","lastTransitionTime":"2026-02-02T10:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.906292 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.906339 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.906348 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.906363 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:39:42 crc kubenswrapper[4901]: I0202 10:39:42.906375 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:42Z","lastTransitionTime":"2026-02-02T10:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.008698 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.008735 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.008747 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.008761 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.008771 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:43Z","lastTransitionTime":"2026-02-02T10:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.110961 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.111001 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.111010 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.111027 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.111039 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:43Z","lastTransitionTime":"2026-02-02T10:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.213165 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.213253 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.213282 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.213320 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.213346 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:43Z","lastTransitionTime":"2026-02-02T10:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.316969 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.317030 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.317046 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.317072 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.317089 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:43Z","lastTransitionTime":"2026-02-02T10:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.420438 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.420480 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.420491 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.420509 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.420523 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:43Z","lastTransitionTime":"2026-02-02T10:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.523754 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.523827 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.523851 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.523883 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.523924 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:43Z","lastTransitionTime":"2026-02-02T10:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.625688 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.625757 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.625779 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.625808 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.625831 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:43Z","lastTransitionTime":"2026-02-02T10:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.636786 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 02:52:39.19549861 +0000 UTC
Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.676277 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.676385 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg"
Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.676414 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.676479 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 02 10:39:43 crc kubenswrapper[4901]: E0202 10:39:43.676626 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 02 10:39:43 crc kubenswrapper[4901]: E0202 10:39:43.676728 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30"
Feb 02 10:39:43 crc kubenswrapper[4901]: E0202 10:39:43.676801 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 02 10:39:43 crc kubenswrapper[4901]: E0202 10:39:43.676852 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.690695 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7569b03b-2368-42f8-9da7-bbe72f5a21c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2828a61512e86689adb0a4e4f64eec20ab53ca87cdb2cf6c17e39a93879d9890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1b664a2819886c06964f67db41f98a7dac531113f4263eaa8162c09a7d65f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c40acb59180d4010eb77503b43609af90e6bde8e3348a6475bf5133ceff68cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir
\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaea00fb2c2d213c6ecf68803b14451f10a2d8cc7c345ef9286face6e17a967c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aaea00fb2c2d213c6ecf68803b14451f10a2d8cc7c345ef9286face6e17a967c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.702605 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mcs8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8281e736-98f0-4282-b362-b55fd3d2810f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cdfc1733f1758eb8435b80dd92fe579c318124c1a99c1611ef2d66300eca26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceac
count\\\",\\\"name\\\":\\\"kube-api-access-9pwxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mcs8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.714380 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a1308eb-aead-41f7-b2b1-c92d80970304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f36c12056892626dc879f2c9
e6953fb49c19bf5b560a3b73ec62fa4040612d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0202 10:38:47.369602 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:38:47.370901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-430650264/tls.crt::/tmp/serving-cert-430650264/tls.key\\\\\\\"\\\\nI0202 10:38:53.040764 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:38:53.043210 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:38:53.043233 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:38:53.043255 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:38:53.043260 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:38:53.047991 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0202 10:38:53.048006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 10:38:53.048019 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:38:53.048038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:38:53.048042 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:38:53.048046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 10:38:53.050356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.724066 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edf90eee-a04d-48ad-957f-8710016e2388\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3416be5e186cc0595234f36c8a59e1c2edadb59a8abe4121b274ea7dea70815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac57745af3759e45361b7c59e94701e4b74dcbfa259eb2e8690578d5b2fa601\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6c9912bb4f435743eaadda76c0ef0fbc6492b740836407b274a89801ec5082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.729396 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.729458 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.729481 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.729507 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.729526 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:43Z","lastTransitionTime":"2026-02-02T10:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.741350 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3390481-846a-4742-9eae-0796b667897f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67d8349fde38f0e6a381c1f42da08869dfdcec8bc8a35b7f44b49a2a80aa1a2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d8349fde38f0e6a381c1f42da08869dfdcec8bc8a35b7f44b49a2a80aa1a2c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:39:21Z\\\",\\\"message\\\":\\\"9103\\\\\\\"\\\\nF0202 10:39:21.720419 6520 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:21Z is after 2025-08-24T17:21:41Z]\\\\nI0202 10:39:21.713537 6520 services_controller.go:434] Service openshift-network-diagnostics/network-check-target retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{network-check-target openshift-network-diagnostics 3e2ce0c7-84ea-44e4-bf4a-d2f8388134f5 2812 0 2025-02-23 05:21:38 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[] map[] [{operator.openshift.io/v1 Network cluster 8d01ddba-7e05-4639-926a-4485de3b6327 0xc0075872d7 0xc0075872d8}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:,Protocol:TCP,Port:80,TargetPort:{0 8080 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: network-check-target,},Clust\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed 
container=ovnkube-controller pod=ovnkube-node-vm8h5_openshift-ovn-kubernetes(a3390481-846a-4742-9eae-0796b667897f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vm8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.755803 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a172847249ecfe8e0ac94127f58173e760dfe5210cc51a12ca593b2882b46e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"
ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:00Z\\\",\\\"reason\\\":\\
\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-flw48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.769639 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb836b18e6cb07844e8dfb32c28798089fc4fbc5b7f89ac3c1d595c8dd6de6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa3b8a5db87f56250b50a5c703beff1e6b11e10f1e11271ea4c7363b262d52d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b1
54edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.782618 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.794360 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.806152 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43352b63ae7151aacd58a3134e927aa7494db450772763086a8b783025a3e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.817359 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b226219c2a55a743db37b0dd98c110a54a0d2a1b2602aa122c7d9418198fb9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.829517 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"756c113d-5d5e-424e-bdf5-494b7774def6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8607482cdf03bfd2b59dcac8300e55292f3bb5b3661f11eddc30a5face19194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b772f8291c7ba36ce56d88dfbde16bf9344276419544da4c966fb2bbee6e04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f29d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.831823 4901 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.831849 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.831857 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.831870 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.831879 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:43Z","lastTransitionTime":"2026-02-02T10:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.842608 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e34f1db7-7f2a-4252-af0b-49fa172495f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eefab588023c2d4a4d15786ba41c1b66a121eeef18edda4bdcc02d162975b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.854112 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5q92h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29de6223de90bbfe98956d36ce20b9b57b319a5a825118f8eeabf13148e4c9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q2pl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5q92h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.865097 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.874733 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8l22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dac970814cc06ebc0edf155553dce7d9579fd22f74f5174407d29850a07a4c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhwqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ba40e4419775afd21720866484ec29442dc564c1901d4ebc697b23c283c0d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhwqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:39:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s8l22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.882843 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fmjwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b96d903e-a64c-4321-8963-482d4b579e30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtz59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtz59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:39:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fmjwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.936324 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.936384 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.936396 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.936411 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:43 crc kubenswrapper[4901]: I0202 10:39:43.936420 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:43Z","lastTransitionTime":"2026-02-02T10:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.039476 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.039540 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.039553 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.039601 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.039613 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:44Z","lastTransitionTime":"2026-02-02T10:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.141839 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.141919 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.141931 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.141955 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.141968 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:44Z","lastTransitionTime":"2026-02-02T10:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:44 crc kubenswrapper[4901]: E0202 10:39:44.163537 4901 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19eb421a_49aa_4cde_ae5e_3aba70ee67f4.slice/crio-conmon-c29de6223de90bbfe98956d36ce20b9b57b319a5a825118f8eeabf13148e4c9f.scope\": RecentStats: unable to find data in memory cache]" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.175183 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5q92h_19eb421a-49aa-4cde-ae5e-3aba70ee67f4/kube-multus/0.log" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.175236 4901 generic.go:334] "Generic (PLEG): container finished" podID="19eb421a-49aa-4cde-ae5e-3aba70ee67f4" containerID="c29de6223de90bbfe98956d36ce20b9b57b319a5a825118f8eeabf13148e4c9f" exitCode=1 Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.175269 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5q92h" event={"ID":"19eb421a-49aa-4cde-ae5e-3aba70ee67f4","Type":"ContainerDied","Data":"c29de6223de90bbfe98956d36ce20b9b57b319a5a825118f8eeabf13148e4c9f"} Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.175627 4901 scope.go:117] "RemoveContainer" containerID="c29de6223de90bbfe98956d36ce20b9b57b319a5a825118f8eeabf13148e4c9f" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.210501 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb836b18e6cb07844e8dfb32c28798089fc4fbc5b7f89ac3c1d595c8dd6de6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa3b8a5db87f56250b50a5c703beff1e6b11e10f1e11271ea4c7363b262d52d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.224021 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.238389 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"756c113d-5d5e-424e-bdf5-494b7774def6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8607482cdf03bfd2b59dcac8300e55292f3bb5b3661f11eddc30a5face19194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b772f8291c7ba36ce56d88dfbde16bf9344276419544da4c966fb2bbee6e04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f29d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.246432 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.246465 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.246483 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.246497 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.246508 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:44Z","lastTransitionTime":"2026-02-02T10:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.248596 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e34f1db7-7f2a-4252-af0b-49fa172495f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eefab588023c2d4a4d15786ba41c1b66a121eeef18edda4bdcc02d162975b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.260809 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5q92h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c29de6223de90bbfe98956d36ce20b9b57b319a5a825118f8eeabf13148e4c9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c29de6223de90bbfe98956d36ce20b9b57b319a5a825118f8eeabf13148e4c9f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:39:44Z\\\",\\\"message\\\":\\\"2026-02-02T10:38:57+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ca2adcc6-a994-4d3c-9ca3-e7ba68683137\\\\n2026-02-02T10:38:57+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ca2adcc6-a994-4d3c-9ca3-e7ba68683137 to /host/opt/cni/bin/\\\\n2026-02-02T10:38:59Z [verbose] multus-daemon started\\\\n2026-02-02T10:38:59Z [verbose] Readiness Indicator file check\\\\n2026-02-02T10:39:44Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q2pl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5q92h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.272236 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.285662 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.295083 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43352b63ae7151aacd58a3134e927aa7494db450772763086a8b783025a3e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.306719 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b226219c2a55a743db37b0dd98c110a54a0d2a1b2602aa122c7d9418198fb9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.317886 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8l22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dac970814cc06ebc0edf155553dce7d9579fd22f74f5174407d29850a07a4c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhwqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ba40e4419775afd21720866484ec29442dc564c1901d4ebc697b23c283c0d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhwqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:39:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s8l22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:44Z is after 2025-08-24T17:21:41Z" Feb 02 
10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.330264 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fmjwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b96d903e-a64c-4321-8963-482d4b579e30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtz59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtz59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:39:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fmjwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.343354 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a1308eb-aead-41f7-b2b1-c92d80970304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0202 10:38:47.369602 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:38:47.370901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-430650264/tls.crt::/tmp/serving-cert-430650264/tls.key\\\\\\\"\\\\nI0202 10:38:53.040764 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:38:53.043210 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:38:53.043233 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:38:53.043255 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:38:53.043260 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:38:53.047991 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0202 10:38:53.048006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 10:38:53.048019 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:38:53.048038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:38:53.048042 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:38:53.048046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 10:38:53.050356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.350733 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.350773 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.350784 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.350818 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.350833 4901 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:44Z","lastTransitionTime":"2026-02-02T10:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.354516 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"edf90eee-a04d-48ad-957f-8710016e2388\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3416be5e186cc0595234f36c8a59e1c2edadb59a8abe4121b274ea7dea70815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac57745af3759e45361b7c59e94701e4b74dcbfa259eb2e8690578d5b2fa601\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6c9912bb4f435743eaadda76c0ef0fbc6492b740836407b274a89801ec5082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.365870 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7569b03b-2368-42f8-9da7-bbe72f5a21c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2828a61512e86689adb0a4e4f64eec20ab53ca87cdb2cf6c17e39a93879d9890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1b664a2819886c06964f67db41f98a7dac531113f4263eaa8162c09a7d65f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c40acb59180d4010eb77503b43609af90e6bde8e3348a6475bf5133ceff68cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaea00fb2c2d213c6ecf68803b14451f10a2d8cc7c345ef9286face6e17a967c\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aaea00fb2c2d213c6ecf68803b14451f10a2d8cc7c345ef9286face6e17a967c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.378111 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mcs8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8281e736-98f0-4282-b362-b55fd3d2810f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cdfc1733f1758eb8435b80dd92fe579c318124c1a99c1611ef2d66300eca26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pwxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}]
,\\\"startTime\\\":\\\"2026-02-02T10:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mcs8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.392084 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a172847249ecfe8e0ac94127f58173e760dfe5210cc51a12ca593b2882b46e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-relea
se\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\
\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-flw48\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.413916 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3390481-846a-4742-9eae-0796b667897f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a8ad
188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67d8349fde38f0e6a381c1f42da08869dfdcec8bc8a35b7f44b49a2a80aa1a2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d8349fde38f0e6a381c1f42da08869dfdcec8bc8a35b7f44b49a2a80aa1a2c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:39:21Z\\\",\\\"message\\\":\\\"9103\\\\\\\"\\\\nF0202 10:39:21.720419 6520 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:21Z is after 2025-08-24T17:21:41Z]\\\\nI0202 10:39:21.713537 6520 services_controller.go:434] Service openshift-network-diagnostics/network-check-target retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{network-check-target openshift-network-diagnostics 3e2ce0c7-84ea-44e4-bf4a-d2f8388134f5 2812 0 2025-02-23 05:21:38 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[] map[] [{operator.openshift.io/v1 Network cluster 8d01ddba-7e05-4639-926a-4485de3b6327 0xc0075872d7 0xc0075872d8}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:,Protocol:TCP,Port:80,TargetPort:{0 8080 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: 
network-check-target,},Clust\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-vm8h5_openshift-ovn-kubernetes(a3390481-846a-4742-9eae-0796b667897f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vm8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.453289 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.453346 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.453361 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.453382 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.453395 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:44Z","lastTransitionTime":"2026-02-02T10:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.556432 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.556494 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.556506 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.556527 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.556540 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:44Z","lastTransitionTime":"2026-02-02T10:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.637182 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 07:25:26.95321464 +0000 UTC Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.660050 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.660100 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.660109 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.660126 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.660136 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:44Z","lastTransitionTime":"2026-02-02T10:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.763934 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.763978 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.763988 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.764003 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.764012 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:44Z","lastTransitionTime":"2026-02-02T10:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.867002 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.867067 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.867077 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.867099 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.867112 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:44Z","lastTransitionTime":"2026-02-02T10:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.970354 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.970406 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.970416 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.970434 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:44 crc kubenswrapper[4901]: I0202 10:39:44.970445 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:44Z","lastTransitionTime":"2026-02-02T10:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.073922 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.073965 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.073977 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.073994 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.074007 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:45Z","lastTransitionTime":"2026-02-02T10:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.176677 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.176723 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.176756 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.176776 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.176787 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:45Z","lastTransitionTime":"2026-02-02T10:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.179642 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5q92h_19eb421a-49aa-4cde-ae5e-3aba70ee67f4/kube-multus/0.log" Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.179711 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5q92h" event={"ID":"19eb421a-49aa-4cde-ae5e-3aba70ee67f4","Type":"ContainerStarted","Data":"15d92feb87ef4644f20d56395e4ec742bb94c251371c8198e0d7257c3d21a68b"} Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.196646 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a172847249ecfe8e0ac94127f58173e760dfe5210cc51a12ca593b2882b46e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-
binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7a
a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for 
pod \"openshift-multus\"/\"multus-additional-cni-plugins-flw48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.218262 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3390481-846a-4742-9eae-0796b667897f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\
\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw
wwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67d8349fde38f0e6a381c1f42da08869dfdcec8bc8a35b7f44b49a2a80aa1a2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d8349fde38f0e6a381c1f42da08869dfdcec8bc8a35b7f44b49a2a80aa1a2c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:39:21Z\\\",\\\"message\\\":\\\"9103\\\\\\\"\\\\nF0202 10:39:21.720419 6520 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:21Z is after 2025-08-24T17:21:41Z]\\\\nI0202 10:39:21.713537 6520 services_controller.go:434] Service openshift-network-diagnostics/network-check-target retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{network-check-target openshift-network-diagnostics 3e2ce0c7-84ea-44e4-bf4a-d2f8388134f5 2812 0 2025-02-23 05:21:38 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[] map[] [{operator.openshift.io/v1 Network cluster 8d01ddba-7e05-4639-926a-4485de3b6327 0xc0075872d7 0xc0075872d8}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:,Protocol:TCP,Port:80,TargetPort:{0 8080 
},NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: network-check-target,},Clust\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-vm8h5_openshift-ovn-kubernetes(a3390481-846a-4742-9eae-0796b667897f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/service
account\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vm8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.231752 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb836b18e6cb07844e8dfb32c28798089fc4fbc5b7f89ac3c1d595c8dd6de6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa3b8a5db87f56250b50a5c703beff1e6b11e10f1e11271ea4c7363b262d52d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.248074 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.260230 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"756c113d-5d5e-424e-bdf5-494b7774def6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8607482cdf03bfd2b59dcac8300e55292f3bb5b3661f11eddc30a5face19194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b772f8291c7ba36ce56d88dfbde16bf9344276419544da4c966fb2bbee6e04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f29d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.270320 4901 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-5xj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e34f1db7-7f2a-4252-af0b-49fa172495f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eefab588023c2d4a4d15786ba41c1b66a121eeef18edda4bdcc02d162975b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.279903 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.279945 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.279957 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.279974 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.279988 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:45Z","lastTransitionTime":"2026-02-02T10:39:45Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.283058 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5q92h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15d92feb87ef4644f20d56395e4ec742bb94c251371c8198e0d7257c3d21a68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c29de6223de90bbfe98956d36ce20b9b57b319a5a825118f8eeabf13148e4c9f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:39:44Z\\\",\\\"message\\\":\\\"2026-02-02T10:38:57+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ca2adcc6-a994-4d3c-9ca3-e7ba68683137\\\\n2026-02-02T10:38:57+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ca2adcc6-a994-4d3c-9ca3-e7ba68683137 to /host/opt/cni/bin/\\\\n2026-02-02T10:38:59Z [verbose] multus-daemon started\\\\n2026-02-02T10:38:59Z [verbose] Readiness Indicator file check\\\\n2026-02-02T10:39:44Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q2pl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5q92h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.296926 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.312519 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.327169 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43352b63ae7151aacd58a3134e927aa7494db450772763086a8b783025a3e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.345415 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b226219c2a55a743db37b0dd98c110a54a0d2a1b2602aa122c7d9418198fb9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.360671 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8l22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dac970814cc06ebc0edf155553dce7d9579fd22f74f5174407d29850a07a4c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhwqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ba40e4419775afd21720866484ec29442dc564c1901d4ebc697b23c283c0d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhwqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:39:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s8l22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:45Z is after 2025-08-24T17:21:41Z" Feb 02 
10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.371674 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fmjwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b96d903e-a64c-4321-8963-482d4b579e30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtz59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtz59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:39:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fmjwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.383325 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.383391 4901 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.383408 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.383431 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.383448 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:45Z","lastTransitionTime":"2026-02-02T10:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.387326 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a1308eb-aead-41f7-b2b1-c92d80970304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0202 10:38:47.369602 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:38:47.370901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-430650264/tls.crt::/tmp/serving-cert-430650264/tls.key\\\\\\\"\\\\nI0202 10:38:53.040764 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:38:53.043210 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:38:53.043233 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:38:53.043255 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:38:53.043260 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:38:53.047991 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0202 10:38:53.048006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 10:38:53.048019 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:38:53.048038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:38:53.048042 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:38:53.048046 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 10:38:53.050356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.400857 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edf90eee-a04d-48ad-957f-8710016e2388\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3416be5e186cc0595234f36c8a59e1c2edadb59a8abe4121b274ea7dea70815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac57745af3759e45361b7c59e94701e4b74dcbfa259eb2e8690578d5b2fa601\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6c9912bb4f435743eaadda76c0ef0fbc6492b740836407b274a89801ec5082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.409860 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7569b03b-2368-42f8-9da7-bbe72f5a21c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2828a61512e86689adb0a4e4f64eec20ab53ca87cdb2cf6c17e39a93879d9890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1b664a2819886c06964f67db41f98a7dac531113f4263eaa8162c09a7d65f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c40acb59180d4010eb77503b43609af90e6bde8e3348a6475bf5133ceff68cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaea00fb2c2d213c6ecf68803b14451f10a2d8cc7c345ef9286face6e17a967c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aaea00fb2c2d213c6ecf68803b14451f10a2d8cc7c345ef9286face6e17a967c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.419893 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mcs8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8281e736-98f0-4282-b362-b55fd3d2810f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cdfc1733f1758eb8435b80dd92fe579c318124c1a99c1611ef2d66300eca26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pwxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mcs8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.486734 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.486789 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.486802 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.486825 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.486837 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:45Z","lastTransitionTime":"2026-02-02T10:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.589463 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.589523 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.589543 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.589597 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.589616 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:45Z","lastTransitionTime":"2026-02-02T10:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.637366 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 12:20:38.280685717 +0000 UTC
Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.676180 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.676225 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.676249 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg"
Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.676307 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 02 10:39:45 crc kubenswrapper[4901]: E0202 10:39:45.676490 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 02 10:39:45 crc kubenswrapper[4901]: E0202 10:39:45.676760 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30"
Feb 02 10:39:45 crc kubenswrapper[4901]: E0202 10:39:45.676873 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 02 10:39:45 crc kubenswrapper[4901]: E0202 10:39:45.676943 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.691915 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.692034 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.692057 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.692078 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.692095 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:45Z","lastTransitionTime":"2026-02-02T10:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.795216 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.795261 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.795275 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.795292 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.795303 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:45Z","lastTransitionTime":"2026-02-02T10:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.901896 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.901971 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.902008 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.902040 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:39:45 crc kubenswrapper[4901]: I0202 10:39:45.902061 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:45Z","lastTransitionTime":"2026-02-02T10:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.004892 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.004929 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.004939 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.004953 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.004964 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:46Z","lastTransitionTime":"2026-02-02T10:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.107201 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.107236 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.107246 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.107261 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.107271 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:46Z","lastTransitionTime":"2026-02-02T10:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.209622 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.209660 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.209670 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.209684 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.209695 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:46Z","lastTransitionTime":"2026-02-02T10:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.312526 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.312642 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.312694 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.312721 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.312735 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:46Z","lastTransitionTime":"2026-02-02T10:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.415816 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.415858 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.415870 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.415891 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.415903 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:46Z","lastTransitionTime":"2026-02-02T10:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.518488 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.518576 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.518588 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.518603 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.518615 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:46Z","lastTransitionTime":"2026-02-02T10:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.620610 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.620645 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.620658 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.620674 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.620685 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:46Z","lastTransitionTime":"2026-02-02T10:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.637592 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 07:26:25.531946798 +0000 UTC
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.723528 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.723588 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.723601 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.723619 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.723633 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:46Z","lastTransitionTime":"2026-02-02T10:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.825842 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.825920 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.825941 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.825970 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.826009 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:46Z","lastTransitionTime":"2026-02-02T10:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.929169 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.929245 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.929268 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.929297 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:39:46 crc kubenswrapper[4901]: I0202 10:39:46.929317 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:46Z","lastTransitionTime":"2026-02-02T10:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.031865 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.031899 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.031907 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.031922 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.031931 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:47Z","lastTransitionTime":"2026-02-02T10:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.137491 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.137557 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.137614 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.137639 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.137657 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:47Z","lastTransitionTime":"2026-02-02T10:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.240067 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.240120 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.240131 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.240148 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.240158 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:47Z","lastTransitionTime":"2026-02-02T10:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.343158 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.343201 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.343209 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.343225 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.343238 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:47Z","lastTransitionTime":"2026-02-02T10:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.446863 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.446932 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.446952 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.446984 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.447005 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:47Z","lastTransitionTime":"2026-02-02T10:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.550149 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.550207 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.550217 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.550232 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.550243 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:47Z","lastTransitionTime":"2026-02-02T10:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.638280 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 04:41:12.935978613 +0000 UTC Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.652224 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.652261 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.652272 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.652285 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.652293 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:47Z","lastTransitionTime":"2026-02-02T10:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.675994 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.676064 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:39:47 crc kubenswrapper[4901]: E0202 10:39:47.676143 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.676165 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.676222 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:39:47 crc kubenswrapper[4901]: E0202 10:39:47.676192 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:39:47 crc kubenswrapper[4901]: E0202 10:39:47.676345 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:39:47 crc kubenswrapper[4901]: E0202 10:39:47.676439 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.754910 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.754949 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.754957 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.754973 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.754982 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:47Z","lastTransitionTime":"2026-02-02T10:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.857537 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.857579 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.857587 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.857602 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.857610 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:47Z","lastTransitionTime":"2026-02-02T10:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.963402 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.963441 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.963449 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.963463 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:47 crc kubenswrapper[4901]: I0202 10:39:47.963476 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:47Z","lastTransitionTime":"2026-02-02T10:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.066231 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.066274 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.066289 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.066307 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.066319 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:48Z","lastTransitionTime":"2026-02-02T10:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.168727 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.168783 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.168799 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.168822 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.168838 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:48Z","lastTransitionTime":"2026-02-02T10:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.270683 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.270729 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.270740 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.270760 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.270770 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:48Z","lastTransitionTime":"2026-02-02T10:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.376905 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.377182 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.377253 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.377318 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.377409 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:48Z","lastTransitionTime":"2026-02-02T10:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.481146 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.481201 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.481218 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.481241 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.481258 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:48Z","lastTransitionTime":"2026-02-02T10:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.583524 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.583579 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.583592 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.583609 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.583623 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:48Z","lastTransitionTime":"2026-02-02T10:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.639420 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 14:48:51.897146075 +0000 UTC Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.686105 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.686156 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.686172 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.686190 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.686203 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:48Z","lastTransitionTime":"2026-02-02T10:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.789086 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.789147 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.789192 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.789216 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.789233 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:48Z","lastTransitionTime":"2026-02-02T10:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.891866 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.891933 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.891946 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.891964 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.891998 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:48Z","lastTransitionTime":"2026-02-02T10:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.993790 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.994169 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.994278 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.994388 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:48 crc kubenswrapper[4901]: I0202 10:39:48.994493 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:48Z","lastTransitionTime":"2026-02-02T10:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.097848 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.097904 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.097915 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.097929 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.097939 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:49Z","lastTransitionTime":"2026-02-02T10:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.200149 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.200189 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.200197 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.200212 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.200221 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:49Z","lastTransitionTime":"2026-02-02T10:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.302702 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.302742 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.302754 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.302771 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.302784 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:49Z","lastTransitionTime":"2026-02-02T10:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.405760 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.405828 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.405849 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.405876 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.405894 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:49Z","lastTransitionTime":"2026-02-02T10:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.508616 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.508684 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.508709 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.508742 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.508769 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:49Z","lastTransitionTime":"2026-02-02T10:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.611648 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.612065 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.612143 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.612218 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.612426 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:49Z","lastTransitionTime":"2026-02-02T10:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.640295 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 16:34:04.302500634 +0000 UTC Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.676062 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.676124 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:39:49 crc kubenswrapper[4901]: E0202 10:39:49.676229 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.676245 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:39:49 crc kubenswrapper[4901]: E0202 10:39:49.676303 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.676388 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:39:49 crc kubenswrapper[4901]: E0202 10:39:49.676463 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:39:49 crc kubenswrapper[4901]: E0202 10:39:49.676547 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.722513 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.722942 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.723026 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.723121 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.723179 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:49Z","lastTransitionTime":"2026-02-02T10:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.826020 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.826081 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.826098 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.826123 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.826140 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:49Z","lastTransitionTime":"2026-02-02T10:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.929722 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.929786 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.929796 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.929814 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:49 crc kubenswrapper[4901]: I0202 10:39:49.929829 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:49Z","lastTransitionTime":"2026-02-02T10:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.033087 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.033157 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.033179 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.033213 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.033236 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:50Z","lastTransitionTime":"2026-02-02T10:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.137247 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.137294 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.137305 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.137323 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.137337 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:50Z","lastTransitionTime":"2026-02-02T10:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.240050 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.240096 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.240112 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.240127 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.240138 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:50Z","lastTransitionTime":"2026-02-02T10:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.343865 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.343909 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.343926 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.343949 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.343966 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:50Z","lastTransitionTime":"2026-02-02T10:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.447847 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.447932 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.447946 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.447965 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.447981 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:50Z","lastTransitionTime":"2026-02-02T10:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.551516 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.551626 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.551645 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.551672 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.551693 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:50Z","lastTransitionTime":"2026-02-02T10:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.640681 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 04:05:07.151714494 +0000 UTC Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.654743 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.654790 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.654803 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.654820 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.654834 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:50Z","lastTransitionTime":"2026-02-02T10:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.757282 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.757344 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.757370 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.757384 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.757394 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:50Z","lastTransitionTime":"2026-02-02T10:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.860692 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.860769 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.860787 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.860817 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.860839 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:50Z","lastTransitionTime":"2026-02-02T10:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.964608 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.964659 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.964674 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.964698 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:50 crc kubenswrapper[4901]: I0202 10:39:50.964713 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:50Z","lastTransitionTime":"2026-02-02T10:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.074122 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.074176 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.074189 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.074211 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.074225 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:51Z","lastTransitionTime":"2026-02-02T10:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.177889 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.177965 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.177985 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.178015 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.178037 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:51Z","lastTransitionTime":"2026-02-02T10:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.206286 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.206371 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.206395 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.206424 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.206448 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:51Z","lastTransitionTime":"2026-02-02T10:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:51 crc kubenswrapper[4901]: E0202 10:39:51.227601 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f285d71e-6d99-440a-8549-56e6a3710e3e\\\",\\\"systemUUID\\\":\\\"9f2b45ef-ead6-4cce-86c8-26b6d26ee095\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:51Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.233470 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.233523 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.233538 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.233559 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.233932 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:51Z","lastTransitionTime":"2026-02-02T10:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:51 crc kubenswrapper[4901]: E0202 10:39:51.253447 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f285d71e-6d99-440a-8549-56e6a3710e3e\\\",\\\"systemUUID\\\":\\\"9f2b45ef-ead6-4cce-86c8-26b6d26ee095\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:51Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.263675 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.263747 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.263760 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.263781 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.263796 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:51Z","lastTransitionTime":"2026-02-02T10:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:51 crc kubenswrapper[4901]: E0202 10:39:51.283840 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f285d71e-6d99-440a-8549-56e6a3710e3e\\\",\\\"systemUUID\\\":\\\"9f2b45ef-ead6-4cce-86c8-26b6d26ee095\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:51Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.289226 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.289301 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.289327 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.289362 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.289383 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:51Z","lastTransitionTime":"2026-02-02T10:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:51 crc kubenswrapper[4901]: E0202 10:39:51.307771 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f285d71e-6d99-440a-8549-56e6a3710e3e\\\",\\\"systemUUID\\\":\\\"9f2b45ef-ead6-4cce-86c8-26b6d26ee095\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:51Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.312746 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.312801 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.312815 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.312837 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.312855 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:51Z","lastTransitionTime":"2026-02-02T10:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:51 crc kubenswrapper[4901]: E0202 10:39:51.333255 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f285d71e-6d99-440a-8549-56e6a3710e3e\\\",\\\"systemUUID\\\":\\\"9f2b45ef-ead6-4cce-86c8-26b6d26ee095\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:51Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:51 crc kubenswrapper[4901]: E0202 10:39:51.333491 4901 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.336487 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory"
Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.336619 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.336644 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.336676 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.336699 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:51Z","lastTransitionTime":"2026-02-02T10:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.439238 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.439321 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.439337 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.439361 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.439381 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:51Z","lastTransitionTime":"2026-02-02T10:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.542154 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.542207 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.542222 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.542248 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.542267 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:51Z","lastTransitionTime":"2026-02-02T10:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.641720 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 06:09:08.414549464 +0000 UTC
Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.645102 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.645163 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.645210 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.645234 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.645253 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:51Z","lastTransitionTime":"2026-02-02T10:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.676659 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.676665 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.676728 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 02 10:39:51 crc kubenswrapper[4901]: E0202 10:39:51.676871 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.677115 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg"
Feb 02 10:39:51 crc kubenswrapper[4901]: E0202 10:39:51.677213 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 02 10:39:51 crc kubenswrapper[4901]: E0202 10:39:51.677784 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30"
Feb 02 10:39:51 crc kubenswrapper[4901]: E0202 10:39:51.677927 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.679266 4901 scope.go:117] "RemoveContainer" containerID="67d8349fde38f0e6a381c1f42da08869dfdcec8bc8a35b7f44b49a2a80aa1a2c"
Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.696069 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"]
Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.748842 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.748908 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.748926 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.748953 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.748973 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:51Z","lastTransitionTime":"2026-02-02T10:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.852048 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.852178 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.852201 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.852224 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.852243 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:51Z","lastTransitionTime":"2026-02-02T10:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.955030 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.955110 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.955130 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.955157 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:39:51 crc kubenswrapper[4901]: I0202 10:39:51.955175 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:51Z","lastTransitionTime":"2026-02-02T10:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.059337 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.059385 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.059395 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.059413 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.059424 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:52Z","lastTransitionTime":"2026-02-02T10:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.161148 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.161184 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.161192 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.161205 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.161213 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:52Z","lastTransitionTime":"2026-02-02T10:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.211178 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vm8h5_a3390481-846a-4742-9eae-0796b667897f/ovnkube-controller/2.log"
Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.213927 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" event={"ID":"a3390481-846a-4742-9eae-0796b667897f","Type":"ContainerStarted","Data":"53303f4cfb222a1eb4b36c8d625a1470c2e40b5b064700fa8e0c7660666678c8"}
Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.214666 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5"
Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.225994 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mcs8s" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8281e736-98f0-4282-b362-b55fd3d2810f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cdfc1733f1758eb8435b80dd92fe579c318124c1a99c1611ef2d66300eca26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pwxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mcs8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:52Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.241928 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a1308eb-aead-41f7-b2b1-c92d80970304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0202 10:38:47.369602 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:38:47.370901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-430650264/tls.crt::/tmp/serving-cert-430650264/tls.key\\\\\\\"\\\\nI0202 10:38:53.040764 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:38:53.043210 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:38:53.043233 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:38:53.043255 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:38:53.043260 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:38:53.047991 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0202 10:38:53.048006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 10:38:53.048019 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:38:53.048038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:38:53.048042 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:38:53.048046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 10:38:53.050356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:52Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.254761 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edf90eee-a04d-48ad-957f-8710016e2388\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3416be5e186cc0595234f36c8a59e1c2edadb59a8abe4121b274ea7dea70815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac57745af3759e45361b7c59e94701e4b74dcbfa259eb2e8690578d5b2fa601\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6c9912bb4f435743eaadda76c0ef0fbc6492b740836407b274a89801ec5082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:52Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.263804 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.263850 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.263862 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.263880 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.263892 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:52Z","lastTransitionTime":"2026-02-02T10:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.267928 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7569b03b-2368-42f8-9da7-bbe72f5a21c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2828a61512e86689adb0a4e4f64eec20ab53ca87cdb2cf6c17e39a93879d9890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1b664a2819886c06964f67db41f98a7dac531113f4263eaa8162c09a7d65f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c40acb59180d4010eb77503b43609af90e6bde8e3348a6475bf5133ceff68cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaea00fb2c2d213c6ecf68803b14451f10a2d8cc7c345ef9286face6e17a967c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aaea00fb2c2d213c6ecf68803b14451f10a2d8cc7c345ef9286face6e17a967c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:52Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.286729 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a172847249ecfe8e0ac94127f58173e760dfe5210cc51a12ca593b2882b46e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\
\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:00Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-flw48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:52Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.305414 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3390481-846a-4742-9eae-0796b667897f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53303f4cfb222a1eb4b36c8d625a1470c2e40b5b
064700fa8e0c7660666678c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d8349fde38f0e6a381c1f42da08869dfdcec8bc8a35b7f44b49a2a80aa1a2c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:39:21Z\\\",\\\"message\\\":\\\"9103\\\\\\\"\\\\nF0202 10:39:21.720419 6520 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:21Z is after 2025-08-24T17:21:41Z]\\\\nI0202 10:39:21.713537 6520 services_controller.go:434] Service openshift-network-diagnostics/network-check-target retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{network-check-target openshift-network-diagnostics 3e2ce0c7-84ea-44e4-bf4a-d2f8388134f5 2812 0 2025-02-23 05:21:38 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[] map[] [{operator.openshift.io/v1 Network cluster 8d01ddba-7e05-4639-926a-4485de3b6327 0xc0075872d7 0xc0075872d8}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:,Protocol:TCP,Port:80,TargetPort:{0 8080 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: 
network-check-target,},Clust\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initConta
inerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vm8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:52Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.320250 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb836b18e6cb07844e8dfb32c28798089fc4fbc5b7f89ac3c1d595c8dd6de6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa3b8a5db87f56250b50a5c703beff1e6b11e10f1e11271ea4c7363b262d52d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:52Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.331295 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:52Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.342903 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43352b63ae7151aacd58a3134e927aa7494db450772763086a8b783025a3e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:52Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.353350 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b226219c2a55a743db37b0dd98c110a54a0d2a1b2602aa122c7d9418198fb9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:52Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.363250 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"756c113d-5d5e-424e-bdf5-494b7774def6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8607482cdf03bfd2b59dcac8300e55292f3bb5b3661f11eddc30a5face19194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b772f8291c7ba36ce56d88dfbde16bf9344276419544da4c966fb2bbee6e04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f29d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:52Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.365678 4901 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.365727 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.365741 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.365762 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.365775 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:52Z","lastTransitionTime":"2026-02-02T10:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.374926 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e34f1db7-7f2a-4252-af0b-49fa172495f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eefab588023c2d4a4d15786ba41c1b66a121eeef18edda4bdcc02d162975b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:52Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.388204 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5q92h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15d92feb87ef4644f20d56395e4ec742bb94c251371c8198e0d7257c3d21a68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c29de6223de90bbfe98956d36ce20b9b57b319a5a825118f8eeabf13148e4c9f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:39:44Z\\\",\\\"message\\\":\\\"2026-02-02T10:38:57+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ca2adcc6-a994-4d3c-9ca3-e7ba68683137\\\\n2026-02-02T10:38:57+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ca2adcc6-a994-4d3c-9ca3-e7ba68683137 to /host/opt/cni/bin/\\\\n2026-02-02T10:38:59Z [verbose] multus-daemon started\\\\n2026-02-02T10:38:59Z [verbose] Readiness Indicator file check\\\\n2026-02-02T10:39:44Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q2pl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5q92h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:52Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.396906 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c014cf8-918f-4192-a79d-388de98e9d1b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a7e50f1526f6acb88dc5fb3d0145fb2bc8a5a6d778aff7eb4d41c0cf920db9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dc67c5ad754da6f48d400f53cfd5e03ba021533ecca6d637253325024904e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dc67c5ad754da6f48d400f53cfd5e03ba021533ecca6d637253325024904e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:52Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.408928 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:52Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.421404 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:52Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.432801 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8l22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dac970814cc06ebc0edf155553dce7d9579fd22f74f5174407d29850a07a4c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhwqj\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ba40e4419775afd21720866484ec29442dc564c1901d4ebc697b23c283c0d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhwqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:39:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s8l22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:52Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.442479 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fmjwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b96d903e-a64c-4321-8963-482d4b579e30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtz59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtz59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:39:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fmjwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:52Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.468059 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.468092 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.468101 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.468117 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.468127 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:52Z","lastTransitionTime":"2026-02-02T10:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.570698 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.570738 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.570746 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.570768 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.570780 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:52Z","lastTransitionTime":"2026-02-02T10:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.642935 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 09:40:19.90095143 +0000 UTC Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.673851 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.673920 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.673942 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.673972 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.673994 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:52Z","lastTransitionTime":"2026-02-02T10:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.777285 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.777336 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.777353 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.777372 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.777386 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:52Z","lastTransitionTime":"2026-02-02T10:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.880649 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.880717 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.880738 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.880765 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.880785 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:52Z","lastTransitionTime":"2026-02-02T10:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.982875 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.982923 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.982933 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.982949 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:52 crc kubenswrapper[4901]: I0202 10:39:52.982958 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:52Z","lastTransitionTime":"2026-02-02T10:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.085492 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.085542 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.085553 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.085595 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.085607 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:53Z","lastTransitionTime":"2026-02-02T10:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.188377 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.188427 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.188437 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.188451 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.188461 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:53Z","lastTransitionTime":"2026-02-02T10:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.218195 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vm8h5_a3390481-846a-4742-9eae-0796b667897f/ovnkube-controller/3.log" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.218749 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vm8h5_a3390481-846a-4742-9eae-0796b667897f/ovnkube-controller/2.log" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.221312 4901 generic.go:334] "Generic (PLEG): container finished" podID="a3390481-846a-4742-9eae-0796b667897f" containerID="53303f4cfb222a1eb4b36c8d625a1470c2e40b5b064700fa8e0c7660666678c8" exitCode=1 Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.221354 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" event={"ID":"a3390481-846a-4742-9eae-0796b667897f","Type":"ContainerDied","Data":"53303f4cfb222a1eb4b36c8d625a1470c2e40b5b064700fa8e0c7660666678c8"} Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.221401 4901 scope.go:117] "RemoveContainer" containerID="67d8349fde38f0e6a381c1f42da08869dfdcec8bc8a35b7f44b49a2a80aa1a2c" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.222042 4901 scope.go:117] "RemoveContainer" containerID="53303f4cfb222a1eb4b36c8d625a1470c2e40b5b064700fa8e0c7660666678c8" Feb 02 10:39:53 crc kubenswrapper[4901]: E0202 10:39:53.222217 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vm8h5_openshift-ovn-kubernetes(a3390481-846a-4742-9eae-0796b667897f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" podUID="a3390481-846a-4742-9eae-0796b667897f" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.238990 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb836b18e6cb07844e8dfb32c28798089fc4fbc5b7f89ac3c1d595c8dd6de6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa3b8a5db87f56250b50a5c703beff1e6b11e10f1e11271ea4c7363b262d52d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.250976 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.261992 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c014cf8-918f-4192-a79d-388de98e9d1b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a7e50f1526f6acb88dc5fb3d0145fb2bc8a5a6d778aff7eb4d41c0cf920db9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dc67c5ad754da6f48d400f53cfd5e03ba021533ecca6d637253325024904e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dc67c5ad754da6f48d400f53cfd5e03ba021533ecca6d637253325024904e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.274291 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.286972 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.290508 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.290539 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.290551 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.290591 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.290603 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:53Z","lastTransitionTime":"2026-02-02T10:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.299367 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43352b63ae7151aacd58a3134e927aa7494db450772763086a8b783025a3e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.316025 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b226219c2a55a743db37b0dd98c110a54a0d2a1b2602aa122c7d9418198fb9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.329746 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"756c113d-5d5e-424e-bdf5-494b7774def6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8607482cdf03bfd2b59dcac8300e55292f3bb5b3661f11eddc30a5face19194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b772f8291c7ba36ce56d88dfbde16bf9344276419544da4c966fb2bbee6e04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f29d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.344060 4901 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-5xj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e34f1db7-7f2a-4252-af0b-49fa172495f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eefab588023c2d4a4d15786ba41c1b66a121eeef18edda4bdcc02d162975b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.357871 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5q92h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15d92feb87ef4644f20d56395e4ec742bb94c251371c8198e0d7257c3d21a68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c29de6223de90bbfe98956d36ce20b9b57b319a5a825118f8eeabf13148e4c9f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:39:44Z\\\",\\\"message\\\":\\\"2026-02-02T10:38:57+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ca2adcc6-a994-4d3c-9ca3-e7ba68683137\\\\n2026-02-02T10:38:57+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ca2adcc6-a994-4d3c-9ca3-e7ba68683137 to /host/opt/cni/bin/\\\\n2026-02-02T10:38:59Z [verbose] multus-daemon started\\\\n2026-02-02T10:38:59Z [verbose] Readiness Indicator file check\\\\n2026-02-02T10:39:44Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q2pl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5q92h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.375076 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8l22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dac970814cc06ebc0edf155553dce7d9579fd22f74f5174407d29850a07a4c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhwqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ba40e4419775afd21720866484ec29442dc564c1901d4ebc697b23c283c0d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhwqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:39:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s8l22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:53Z is after 2025-08-24T17:21:41Z" Feb 02 
10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.388082 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fmjwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b96d903e-a64c-4321-8963-482d4b579e30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtz59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtz59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:39:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fmjwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.396546 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.396640 4901 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.396651 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.396668 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.396681 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:53Z","lastTransitionTime":"2026-02-02T10:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.404244 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a1308eb-aead-41f7-b2b1-c92d80970304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0202 10:38:47.369602 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:38:47.370901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-430650264/tls.crt::/tmp/serving-cert-430650264/tls.key\\\\\\\"\\\\nI0202 10:38:53.040764 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:38:53.043210 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:38:53.043233 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:38:53.043255 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:38:53.043260 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:38:53.047991 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0202 10:38:53.048006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 10:38:53.048019 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:38:53.048038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:38:53.048042 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:38:53.048046 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 10:38:53.050356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.423616 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edf90eee-a04d-48ad-957f-8710016e2388\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3416be5e186cc0595234f36c8a59e1c2edadb59a8abe4121b274ea7dea70815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac57745af3759e45361b7c59e94701e4b74dcbfa259eb2e8690578d5b2fa601\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6c9912bb4f435743eaadda76c0ef0fbc6492b740836407b274a89801ec5082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.435997 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7569b03b-2368-42f8-9da7-bbe72f5a21c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2828a61512e86689adb0a4e4f64eec20ab53ca87cdb2cf6c17e39a93879d9890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1b664a2819886c06964f67db41f98a7dac531113f4263eaa8162c09a7d65f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c40acb59180d4010eb77503b43609af90e6bde8e3348a6475bf5133ceff68cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaea00fb2c2d213c6ecf68803b14451f10a2d8cc7c345ef9286face6e17a967c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aaea00fb2c2d213c6ecf68803b14451f10a2d8cc7c345ef9286face6e17a967c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.448026 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mcs8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8281e736-98f0-4282-b362-b55fd3d2810f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cdfc1733f1758eb8435b80dd92fe579c318124c1a99c1611ef2d66300eca26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pwxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mcs8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.463075 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a172847249ecfe8e0ac94127f58173e760dfe5210cc51a12ca593b2882b46e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-flw48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.493330 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3390481-846a-4742-9eae-0796b667897f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53303f4cfb222a1eb4b36c8d625a1470c2e40b5b064700fa8e0c7660666678c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d8349fde38f0e6a381c1f42da08869dfdcec8bc8a35b7f44b49a2a80aa1a2c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:39:21Z\\\",\\\"message\\\":\\\"9103\\\\\\\"\\\\nF0202 10:39:21.720419 6520 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:21Z is after 2025-08-24T17:21:41Z]\\\\nI0202 10:39:21.713537 6520 services_controller.go:434] Service openshift-network-diagnostics/network-check-target retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{network-check-target openshift-network-diagnostics 3e2ce0c7-84ea-44e4-bf4a-d2f8388134f5 2812 0 2025-02-23 05:21:38 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[] map[] [{operator.openshift.io/v1 Network cluster 8d01ddba-7e05-4639-926a-4485de3b6327 0xc0075872d7 0xc0075872d8}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:,Protocol:TCP,Port:80,TargetPort:{0 8080 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: network-check-target,},Clust\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53303f4cfb222a1eb4b36c8d625a1470c2e40b5b064700fa8e0c7660666678c8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:39:52Z\\\",\\\"message\\\":\\\"rvices.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.246\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", 
inport:9443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0202 10:39:52.508728 6916 services_controller.go:444] Built service openshift-network-console/networking-console-plugin LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0202 10:39:52.508737 6916 services_controller.go:445] Built service openshift-network-console/networking-console-plugin LB template configs for network=default: []services.lbConfig(nil)\\\\nI0202 10:39:52.508751 6916 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}\\\\nI0202 10:39:52.508774 6916 services_controller.go:360] Finished syncing service redhat-operators on namespace openshift-marketplace for network=default : 1.133207ms\\\\nF0202 10:39:52.508775 6916 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vm8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.499493 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.499600 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.499620 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.499642 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.499660 4901 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:53Z","lastTransitionTime":"2026-02-02T10:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.603392 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.603632 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.603662 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.603693 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.603717 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:53Z","lastTransitionTime":"2026-02-02T10:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.643278 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 08:55:45.30459585 +0000 UTC Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.676389 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.676442 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:39:53 crc kubenswrapper[4901]: E0202 10:39:53.676555 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:39:53 crc kubenswrapper[4901]: E0202 10:39:53.676648 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.676824 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:39:53 crc kubenswrapper[4901]: E0202 10:39:53.677183 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.677235 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:39:53 crc kubenswrapper[4901]: E0202 10:39:53.677465 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.692873 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb836b18e6cb07844e8dfb32c28798089fc4fbc5b7f89ac3c1d595c8dd6de6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa3b8a5db87f56250b50a5c703beff1e6b11e10f1e11271ea4c7363b262d52d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.706125 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.706536 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.706597 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.706609 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.706623 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.706634 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:53Z","lastTransitionTime":"2026-02-02T10:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.717775 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b226219c2a55a743db37b0dd98c110a54a0d2a1b2602aa122c7d9418198fb9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.726925 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"756c113d-5d5e-424e-bdf5-494b7774def6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8607482cdf03bfd2b59dcac8300e55292f3bb5b3661f11eddc30a5face19194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b772f8291c7ba36ce56d88dfbde16bf9344276419544da4c966fb2bbee6e04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f29d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.737642 4901 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-5xj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e34f1db7-7f2a-4252-af0b-49fa172495f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eefab588023c2d4a4d15786ba41c1b66a121eeef18edda4bdcc02d162975b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.748577 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5q92h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15d92feb87ef4644f20d56395e4ec742bb94c251371c8198e0d7257c3d21a68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c29de6223de90bbfe98956d36ce20b9b57b319a5a825118f8eeabf13148e4c9f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:39:44Z\\\",\\\"message\\\":\\\"2026-02-02T10:38:57+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ca2adcc6-a994-4d3c-9ca3-e7ba68683137\\\\n2026-02-02T10:38:57+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ca2adcc6-a994-4d3c-9ca3-e7ba68683137 to /host/opt/cni/bin/\\\\n2026-02-02T10:38:59Z [verbose] multus-daemon started\\\\n2026-02-02T10:38:59Z [verbose] Readiness Indicator file check\\\\n2026-02-02T10:39:44Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q2pl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5q92h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.759170 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c014cf8-918f-4192-a79d-388de98e9d1b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a7e50f1526f6acb88dc5fb3d0145fb2bc8a5a6d778aff7eb4d41c0cf920db9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dc67c5ad754da6f48d400f53cfd5e03ba021533ecca6d637253325024904e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dc67c5ad754da6f48d400f53cfd5e03ba021533ecca6d637253325024904e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.770956 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.786123 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.796371 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43352b63ae7151aacd58a3134e927aa7494db450772763086a8b783025a3e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.808338 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8l22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dac970814cc06ebc0edf155553dce7d9579fd22f74f5174407d29850a07a4c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhwqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ba40e4419775afd21720866484ec29442dc564c1901d4ebc697b23c283c0d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhwqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:39:
09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s8l22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.808838 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.808874 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.808888 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.808908 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.808922 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:53Z","lastTransitionTime":"2026-02-02T10:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.821065 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fmjwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b96d903e-a64c-4321-8963-482d4b579e30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtz59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtz59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:39:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fmjwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.833044 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a1308eb-aead-41f7-b2b1-c92d80970304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0202 10:38:47.369602 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:38:47.370901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-430650264/tls.crt::/tmp/serving-cert-430650264/tls.key\\\\\\\"\\\\nI0202 10:38:53.040764 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:38:53.043210 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:38:53.043233 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:38:53.043255 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:38:53.043260 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:38:53.047991 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0202 10:38:53.048006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 10:38:53.048019 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:38:53.048038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:38:53.048042 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:38:53.048046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 10:38:53.050356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.848294 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edf90eee-a04d-48ad-957f-8710016e2388\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3416be5e186cc0595234f36c8a59e1c2edadb59a8abe4121b274ea7dea70815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac57745af3759e45361b7c59e94701e4b74dcbfa259eb2e8690578d5b2fa601\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6c9912bb4f435743eaadda76c0ef0fbc6492b740836407b274a89801ec5082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.861375 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7569b03b-2368-42f8-9da7-bbe72f5a21c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2828a61512e86689adb0a4e4f64eec20ab53ca87cdb2cf6c17e39a93879d9890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1b664a2819886c06964f67db41f98a7dac531113f4263eaa8162c09a7d65f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c40acb59180d4010eb77503b43609af90e6bde8e3348a6475bf5133ceff68cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaea00fb2c2d213c6ecf68803b14451f10a2d8cc7c345ef9286face6e17a967c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aaea00fb2c2d213c6ecf68803b14451f10a2d8cc7c345ef9286face6e17a967c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.871965 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mcs8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8281e736-98f0-4282-b362-b55fd3d2810f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cdfc1733f1758eb8435b80dd92fe579c318124c1a99c1611ef2d66300eca26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pwxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mcs8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.892181 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a172847249ecfe8e0ac94127f58173e760dfe5210cc51a12ca593b2882b46e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-flw48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.910738 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3390481-846a-4742-9eae-0796b667897f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53303f4cfb222a1eb4b36c8d625a1470c2e40b5b064700fa8e0c7660666678c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d8349fde38f0e6a381c1f42da08869dfdcec8bc8a35b7f44b49a2a80aa1a2c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:39:21Z\\\",\\\"message\\\":\\\"9103\\\\\\\"\\\\nF0202 10:39:21.720419 6520 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:21Z is after 2025-08-24T17:21:41Z]\\\\nI0202 10:39:21.713537 6520 services_controller.go:434] Service openshift-network-diagnostics/network-check-target retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{network-check-target openshift-network-diagnostics 3e2ce0c7-84ea-44e4-bf4a-d2f8388134f5 2812 0 2025-02-23 05:21:38 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[] map[] [{operator.openshift.io/v1 Network cluster 8d01ddba-7e05-4639-926a-4485de3b6327 0xc0075872d7 0xc0075872d8}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:,Protocol:TCP,Port:80,TargetPort:{0 8080 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: network-check-target,},Clust\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53303f4cfb222a1eb4b36c8d625a1470c2e40b5b064700fa8e0c7660666678c8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:39:52Z\\\",\\\"message\\\":\\\"rvices.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.246\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", 
inport:9443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0202 10:39:52.508728 6916 services_controller.go:444] Built service openshift-network-console/networking-console-plugin LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0202 10:39:52.508737 6916 services_controller.go:445] Built service openshift-network-console/networking-console-plugin LB template configs for network=default: []services.lbConfig(nil)\\\\nI0202 10:39:52.508751 6916 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}\\\\nI0202 10:39:52.508774 6916 services_controller.go:360] Finished syncing service redhat-operators on namespace openshift-marketplace for network=default : 1.133207ms\\\\nF0202 10:39:52.508775 6916 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vm8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.912385 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.912421 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.912434 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.912450 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:53 crc kubenswrapper[4901]: I0202 10:39:53.912462 4901 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:53Z","lastTransitionTime":"2026-02-02T10:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.015458 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.015785 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.015797 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.015811 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.015822 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:54Z","lastTransitionTime":"2026-02-02T10:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.118681 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.118720 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.118731 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.118749 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.118762 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:54Z","lastTransitionTime":"2026-02-02T10:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.221014 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.221062 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.221072 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.221092 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.221103 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:54Z","lastTransitionTime":"2026-02-02T10:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.224333 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vm8h5_a3390481-846a-4742-9eae-0796b667897f/ovnkube-controller/3.log" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.226912 4901 scope.go:117] "RemoveContainer" containerID="53303f4cfb222a1eb4b36c8d625a1470c2e40b5b064700fa8e0c7660666678c8" Feb 02 10:39:54 crc kubenswrapper[4901]: E0202 10:39:54.227068 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vm8h5_openshift-ovn-kubernetes(a3390481-846a-4742-9eae-0796b667897f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" podUID="a3390481-846a-4742-9eae-0796b667897f" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.237189 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mcs8s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8281e736-98f0-4282-b362-b55fd3d2810f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cdfc1733f1758eb8435b80dd92fe579c318124c1a99c1611ef2d66300eca26b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pwxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mcs8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:54Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.247781 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a1308eb-aead-41f7-b2b1-c92d80970304\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0202 10:38:47.369602 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:38:47.370901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-430650264/tls.crt::/tmp/serving-cert-430650264/tls.key\\\\\\\"\\\\nI0202 10:38:53.040764 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:38:53.043210 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:38:53.043233 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:38:53.043255 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:38:53.043260 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:38:53.047991 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0202 10:38:53.048006 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 10:38:53.048019 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048026 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:38:53.048033 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:38:53.048038 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:38:53.048042 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:38:53.048046 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 10:38:53.050356 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:54Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.257320 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"edf90eee-a04d-48ad-957f-8710016e2388\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3416be5e186cc0595234f36c8a59e1c2edadb59a8abe4121b274ea7dea70815\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac57745af3759e45361b7c59e94701e4b74dcbfa259eb2e8690578d5b2fa601\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6c9912bb4f435743eaadda76c0ef0fbc6492b740836407b274a89801ec5082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:54Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.265732 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7569b03b-2368-42f8-9da7-bbe72f5a21c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2828a61512e86689adb0a4e4f64eec20ab53ca87cdb2cf6c17e39a93879d9890\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1b664a2819886c06964f67db41f98a7dac531113f4263eaa8162c09a7d65f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c40acb59180d4010eb77503b43609af90e6bde8e3348a6475bf5133ceff68cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aaea00fb2c2d213c6ecf68803b14451f10a2d8cc7c345ef9286face6e17a967c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aaea00fb2c2d213c6ecf68803b14451f10a2d8cc7c345ef9286face6e17a967c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:54Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.277997 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-flw48" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afbdc53a-f67a-44f7-a5bb-f446fb3706fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a172847249ecfe8e0ac94127f58173e760dfe5210cc51a12ca593b2882b46e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://075d2704b6d7dd98139440efba72ef224850b0f4b1efebcd15a7434b12d3c5f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8bd0da43d028883ec293528eea53a0887eb4de7afd4cb0c5c11d9c864a9500f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcf0e6854ca574689dd667fce1f6e7b00431540f812fe7ca393ac2354adf8944\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c5d38b4000b7653e8f8882b1c6001322dc46e5a2ee564b7588dd32c5b0cddd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b21bf79d85bc432d4240c25e5bee2bfc6c4825dc940802c9f702bb613d84c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://259b32378409bcf80bdc0f23c68ca44a2371771efac931bb4c96acaf2640078b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:39:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnls4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-flw48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:54Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.293955 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3390481-846a-4742-9eae-0796b667897f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53303f4cfb222a1eb4b36c8d625a1470c2e40b5b064700fa8e0c7660666678c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53303f4cfb222a1eb4b36c8d625a1470c2e40b5b064700fa8e0c7660666678c8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:39:52Z\\\",\\\"message\\\":\\\"rvices.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.246\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0202 10:39:52.508728 6916 services_controller.go:444] Built service openshift-network-console/networking-console-plugin LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0202 10:39:52.508737 6916 services_controller.go:445] Built service openshift-network-console/networking-console-plugin LB template configs for network=default: []services.lbConfig(nil)\\\\nI0202 10:39:52.508751 6916 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}\\\\nI0202 10:39:52.508774 6916 services_controller.go:360] Finished syncing service redhat-operators on namespace openshift-marketplace for network=default : 1.133207ms\\\\nF0202 10:39:52.508775 6916 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:39:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vm8h5_openshift-ovn-kubernetes(a3390481-846a-4742-9eae-0796b667897f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwwwn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vm8h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:54Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.304891 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb836b18e6cb07844e8dfb32c28798089fc4fbc5b7f89ac3c1d595c8dd6de6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa3b8a5db87f56250b50a5c703beff1e6b11e10f1e11271ea4c7363b262d52d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:54Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.315649 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:54Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.323283 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.323330 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.323341 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.323358 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.323370 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:54Z","lastTransitionTime":"2026-02-02T10:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.326982 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43352b63ae7151aacd58a3134e927aa7494db450772763086a8b783025a3e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:54Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.340739 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b226219c2a55a743db37b0dd98c110a54a0d2a1b2602aa122c7d9418198fb9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:54Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.353691 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"756c113d-5d5e-424e-bdf5-494b7774def6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8607482cdf03bfd2b59dcac8300e55292f3bb5b3661f11eddc30a5face19194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b772f8291c7ba36ce56d88dfbde16bf9344276419544da4c966fb2bbee6e04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jllc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f29d8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:54Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.367749 4901 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-5xj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e34f1db7-7f2a-4252-af0b-49fa172495f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eefab588023c2d4a4d15786ba41c1b66a121eeef18edda4bdcc02d162975b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:54Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.381149 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5q92h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15d92feb87ef4644f20d56395e4ec742bb94c251371c8198e0d7257c3d21a68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c29de6223de90bbfe98956d36ce20b9b57b319a5a825118f8eeabf13148e4c9f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:39:44Z\\\",\\\"message\\\":\\\"2026-02-02T10:38:57+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ca2adcc6-a994-4d3c-9ca3-e7ba68683137\\\\n2026-02-02T10:38:57+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ca2adcc6-a994-4d3c-9ca3-e7ba68683137 to /host/opt/cni/bin/\\\\n2026-02-02T10:38:59Z [verbose] multus-daemon started\\\\n2026-02-02T10:38:59Z [verbose] Readiness Indicator file check\\\\n2026-02-02T10:39:44Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q2pl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5q92h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:54Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.394141 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c014cf8-918f-4192-a79d-388de98e9d1b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a7e50f1526f6acb88dc5fb3d0145fb2bc8a5a6d778aff7eb4d41c0cf920db9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dc67c5ad754da6f48d400f53cfd5e03ba021533ecca6d637253325024904e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dc67c5ad754da6f48d400f53cfd5e03ba021533ecca6d637253325024904e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:38:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:54Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.405811 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:54Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.421493 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:54Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.425420 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.425460 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.425473 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.425490 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.425502 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:54Z","lastTransitionTime":"2026-02-02T10:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.435315 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8l22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5fc3f6e-bf32-4388-86d0-b50b22a6ee2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dac970814cc06ebc0edf155553dce7d9579fd22f74f5174407d29850a07a4c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhwqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ba40e4419775afd21720866484ec29442dc564c1901d4ebc697b23c283c0d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhwqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:39:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s8l22\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:54Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.447732 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fmjwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b96d903e-a64c-4321-8963-482d4b579e30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtz59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qtz59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:39:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fmjwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:39:54Z is after 2025-08-24T17:21:41Z" Feb 02 10:39:54 crc 
kubenswrapper[4901]: I0202 10:39:54.527873 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.527921 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.527932 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.527948 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.527958 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:54Z","lastTransitionTime":"2026-02-02T10:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.629880 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.629920 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.629932 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.629949 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.629962 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:54Z","lastTransitionTime":"2026-02-02T10:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.644354 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 19:14:11.722160032 +0000 UTC Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.732755 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.732813 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.732829 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.732852 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.732873 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:54Z","lastTransitionTime":"2026-02-02T10:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.835202 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.835267 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.835285 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.835309 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.835327 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:54Z","lastTransitionTime":"2026-02-02T10:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.938716 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.938773 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.938790 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.938819 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:54 crc kubenswrapper[4901]: I0202 10:39:54.938837 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:54Z","lastTransitionTime":"2026-02-02T10:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.042034 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.042075 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.042085 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.042100 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.042111 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:55Z","lastTransitionTime":"2026-02-02T10:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.145445 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.145536 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.145607 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.145639 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.145658 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:55Z","lastTransitionTime":"2026-02-02T10:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.260865 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.260924 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.260943 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.260966 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.260980 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:55Z","lastTransitionTime":"2026-02-02T10:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.364637 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.364686 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.364698 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.364722 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.364736 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:55Z","lastTransitionTime":"2026-02-02T10:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.468199 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.468243 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.468255 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.468276 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.468289 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:55Z","lastTransitionTime":"2026-02-02T10:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.571305 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.571373 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.571400 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.571418 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.571429 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:55Z","lastTransitionTime":"2026-02-02T10:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.644710 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 09:29:35.126127018 +0000 UTC Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.674437 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.674505 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.674523 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.674552 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.674605 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:55Z","lastTransitionTime":"2026-02-02T10:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.675691 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:39:55 crc kubenswrapper[4901]: E0202 10:39:55.675853 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.675909 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.676000 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.676068 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:39:55 crc kubenswrapper[4901]: E0202 10:39:55.676216 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:39:55 crc kubenswrapper[4901]: E0202 10:39:55.676370 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:39:55 crc kubenswrapper[4901]: E0202 10:39:55.676673 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.777374 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.777414 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.777425 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.777441 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.777451 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:55Z","lastTransitionTime":"2026-02-02T10:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.880798 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.880864 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.880881 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.880903 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.880917 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:55Z","lastTransitionTime":"2026-02-02T10:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.989341 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.989402 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.989413 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.989429 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:55 crc kubenswrapper[4901]: I0202 10:39:55.989440 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:55Z","lastTransitionTime":"2026-02-02T10:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.093497 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.093548 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.093579 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.093600 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.093615 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:56Z","lastTransitionTime":"2026-02-02T10:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.197671 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.197782 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.197798 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.197824 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.197846 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:56Z","lastTransitionTime":"2026-02-02T10:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.301102 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.301175 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.301195 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.301227 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.301252 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:56Z","lastTransitionTime":"2026-02-02T10:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.405143 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.405208 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.405227 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.405253 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.405272 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:56Z","lastTransitionTime":"2026-02-02T10:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.508379 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.508453 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.508472 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.508500 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.508520 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:56Z","lastTransitionTime":"2026-02-02T10:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.611526 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.611615 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.611634 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.611663 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.611689 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:56Z","lastTransitionTime":"2026-02-02T10:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.645278 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 17:31:43.176984434 +0000 UTC Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.716302 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.716384 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.716400 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.716423 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.716443 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:56Z","lastTransitionTime":"2026-02-02T10:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.820157 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.820217 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.820238 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.820265 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.820284 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:56Z","lastTransitionTime":"2026-02-02T10:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.923956 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.924046 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.924064 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.924093 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:56 crc kubenswrapper[4901]: I0202 10:39:56.924112 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:56Z","lastTransitionTime":"2026-02-02T10:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.027065 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.027125 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.027143 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.027168 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.027188 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:57Z","lastTransitionTime":"2026-02-02T10:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.130520 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.130632 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.130648 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.130673 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.130693 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:57Z","lastTransitionTime":"2026-02-02T10:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.233386 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.233452 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.233470 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.233497 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.233514 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:57Z","lastTransitionTime":"2026-02-02T10:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.336418 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.336469 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.336482 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.336500 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.336512 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:57Z","lastTransitionTime":"2026-02-02T10:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.438901 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.438953 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.438965 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.438982 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.438991 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:57Z","lastTransitionTime":"2026-02-02T10:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.542281 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.542337 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.542349 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.542372 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.542386 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:57Z","lastTransitionTime":"2026-02-02T10:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.644728 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.644769 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.644780 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.644796 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.644807 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:57Z","lastTransitionTime":"2026-02-02T10:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.646114 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 01:56:00.971741463 +0000 UTC Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.662225 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:39:57 crc kubenswrapper[4901]: E0202 10:39:57.662413 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:41:01.662383097 +0000 UTC m=+148.680723193 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.662468 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.662530 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.662596 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.662636 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:39:57 crc kubenswrapper[4901]: E0202 10:39:57.662728 4901 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:39:57 crc kubenswrapper[4901]: E0202 10:39:57.662762 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 10:39:57 crc kubenswrapper[4901]: E0202 10:39:57.662773 4901 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:39:57 crc kubenswrapper[4901]: E0202 10:39:57.662788 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:39:57 crc kubenswrapper[4901]: E0202 10:39:57.662809 4901 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 
10:39:57 crc kubenswrapper[4901]: E0202 10:39:57.662750 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 10:39:57 crc kubenswrapper[4901]: E0202 10:39:57.662895 4901 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:39:57 crc kubenswrapper[4901]: E0202 10:39:57.662923 4901 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:39:57 crc kubenswrapper[4901]: E0202 10:39:57.662849 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:41:01.662825327 +0000 UTC m=+148.681165433 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:39:57 crc kubenswrapper[4901]: E0202 10:39:57.663029 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:41:01.663000371 +0000 UTC m=+148.681340517 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:39:57 crc kubenswrapper[4901]: E0202 10:39:57.663083 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-02 10:41:01.663061703 +0000 UTC m=+148.681402019 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:39:57 crc kubenswrapper[4901]: E0202 10:39:57.663128 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-02 10:41:01.663112004 +0000 UTC m=+148.681452290 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.676058 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.676098 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.676111 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.676072 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:39:57 crc kubenswrapper[4901]: E0202 10:39:57.676215 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:39:57 crc kubenswrapper[4901]: E0202 10:39:57.676370 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:39:57 crc kubenswrapper[4901]: E0202 10:39:57.676781 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:39:57 crc kubenswrapper[4901]: E0202 10:39:57.676982 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.693082 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.747833 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.747889 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.747899 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.747919 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.747932 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:57Z","lastTransitionTime":"2026-02-02T10:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.849937 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.850008 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.850017 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.850030 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.850040 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:57Z","lastTransitionTime":"2026-02-02T10:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.952588 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.952648 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.952659 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.952676 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:57 crc kubenswrapper[4901]: I0202 10:39:57.952689 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:57Z","lastTransitionTime":"2026-02-02T10:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.055553 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.055620 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.055629 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.055643 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.055653 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:58Z","lastTransitionTime":"2026-02-02T10:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.158880 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.158950 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.158970 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.159000 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.159018 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:58Z","lastTransitionTime":"2026-02-02T10:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.261541 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.261604 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.261613 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.261626 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.261638 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:58Z","lastTransitionTime":"2026-02-02T10:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.364486 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.364542 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.364597 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.364623 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.364639 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:58Z","lastTransitionTime":"2026-02-02T10:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.467255 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.467321 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.467339 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.467361 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.467376 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:58Z","lastTransitionTime":"2026-02-02T10:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.570168 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.570247 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.570269 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.570299 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.570321 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:58Z","lastTransitionTime":"2026-02-02T10:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.646857 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 04:13:10.625611156 +0000 UTC Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.673138 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.673187 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.673201 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.673220 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.673234 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:58Z","lastTransitionTime":"2026-02-02T10:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.776462 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.776545 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.776613 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.776645 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.776660 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:58Z","lastTransitionTime":"2026-02-02T10:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.879811 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.879866 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.879880 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.879902 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.879918 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:58Z","lastTransitionTime":"2026-02-02T10:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.982886 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.982946 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.982959 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.982978 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:58 crc kubenswrapper[4901]: I0202 10:39:58.982992 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:58Z","lastTransitionTime":"2026-02-02T10:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.086907 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.086972 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.086995 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.087024 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.087043 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:59Z","lastTransitionTime":"2026-02-02T10:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.190095 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.190161 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.190180 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.190204 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.190223 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:59Z","lastTransitionTime":"2026-02-02T10:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.292339 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.292419 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.292436 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.292462 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.292486 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:59Z","lastTransitionTime":"2026-02-02T10:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.395198 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.395269 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.395289 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.395318 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.395338 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:59Z","lastTransitionTime":"2026-02-02T10:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.498383 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.498435 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.498448 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.498468 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.498480 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:59Z","lastTransitionTime":"2026-02-02T10:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.601967 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.602012 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.602022 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.602038 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.602048 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:59Z","lastTransitionTime":"2026-02-02T10:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.647689 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 18:12:28.803706061 +0000 UTC Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.677453 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.677552 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.677602 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:39:59 crc kubenswrapper[4901]: E0202 10:39:59.678864 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.679542 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:39:59 crc kubenswrapper[4901]: E0202 10:39:59.680098 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:39:59 crc kubenswrapper[4901]: E0202 10:39:59.680211 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:39:59 crc kubenswrapper[4901]: E0202 10:39:59.683904 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.704971 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.705030 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.705043 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.705062 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.705073 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:59Z","lastTransitionTime":"2026-02-02T10:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.807951 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.807999 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.808012 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.808031 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.808044 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:59Z","lastTransitionTime":"2026-02-02T10:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.911141 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.911181 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.911193 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.911218 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:39:59 crc kubenswrapper[4901]: I0202 10:39:59.911231 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:39:59Z","lastTransitionTime":"2026-02-02T10:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.014300 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.014662 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.014673 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.014690 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.014701 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:00Z","lastTransitionTime":"2026-02-02T10:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.117877 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.117908 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.117917 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.117930 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.117939 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:00Z","lastTransitionTime":"2026-02-02T10:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.221895 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.221966 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.221985 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.222012 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.222032 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:00Z","lastTransitionTime":"2026-02-02T10:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.324978 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.325044 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.325059 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.325078 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.325089 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:00Z","lastTransitionTime":"2026-02-02T10:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.427935 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.428020 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.428044 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.428074 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.428096 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:00Z","lastTransitionTime":"2026-02-02T10:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.530960 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.531009 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.531023 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.531042 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.531056 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:00Z","lastTransitionTime":"2026-02-02T10:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.633900 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.633965 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.633974 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.633990 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.634000 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:00Z","lastTransitionTime":"2026-02-02T10:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.648116 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 14:04:34.025964174 +0000 UTC Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.736685 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.736732 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.736743 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.736758 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.736768 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:00Z","lastTransitionTime":"2026-02-02T10:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.840257 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.840313 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.840334 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.840361 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.840380 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:00Z","lastTransitionTime":"2026-02-02T10:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.943931 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.944012 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.944037 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.944071 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:00 crc kubenswrapper[4901]: I0202 10:40:00.944100 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:00Z","lastTransitionTime":"2026-02-02T10:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.047187 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.047248 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.047266 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.047292 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.047331 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:01Z","lastTransitionTime":"2026-02-02T10:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.150760 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.150820 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.150838 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.150863 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.150885 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:01Z","lastTransitionTime":"2026-02-02T10:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.253352 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.253422 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.253449 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.253479 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.253504 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:01Z","lastTransitionTime":"2026-02-02T10:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.357639 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.357778 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.357803 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.357835 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.357851 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:01Z","lastTransitionTime":"2026-02-02T10:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.460769 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.460854 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.460878 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.460910 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.460974 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:01Z","lastTransitionTime":"2026-02-02T10:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.564272 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.564329 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.564339 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.564355 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.564366 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:01Z","lastTransitionTime":"2026-02-02T10:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.649093 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 03:52:12.484039778 +0000 UTC Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.667739 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.667811 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.667838 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.667870 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.667891 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:01Z","lastTransitionTime":"2026-02-02T10:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.676044 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.676115 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:40:01 crc kubenswrapper[4901]: E0202 10:40:01.676232 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.676066 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.676343 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:40:01 crc kubenswrapper[4901]: E0202 10:40:01.676427 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:40:01 crc kubenswrapper[4901]: E0202 10:40:01.676627 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:40:01 crc kubenswrapper[4901]: E0202 10:40:01.676776 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.720923 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.720986 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.721000 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.721022 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.721035 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:01Z","lastTransitionTime":"2026-02-02T10:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:40:01 crc kubenswrapper[4901]: E0202 10:40:01.750092 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:40:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:40:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:40:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:40:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:40:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:40:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:40:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:40:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f285d71e-6d99-440a-8549-56e6a3710e3e\\\",\\\"systemUUID\\\":\\\"9f2b45ef-ead6-4cce-86c8-26b6d26ee095\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:40:01Z is after 
2025-08-24T17:21:41Z" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.756123 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.756205 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.756223 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.756275 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.756294 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:01Z","lastTransitionTime":"2026-02-02T10:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:01 crc kubenswrapper[4901]: E0202 10:40:01.778644 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:40:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:40:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:40:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:40:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:40:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:40:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:40:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:40:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f285d71e-6d99-440a-8549-56e6a3710e3e\\\",\\\"systemUUID\\\":\\\"9f2b45ef-ead6-4cce-86c8-26b6d26ee095\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:40:01Z is after 
2025-08-24T17:21:41Z" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.784163 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.784230 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.784252 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.784278 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.784298 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:01Z","lastTransitionTime":"2026-02-02T10:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:01 crc kubenswrapper[4901]: E0202 10:40:01.808782 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:40:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:40:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:40:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:40:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:40:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:40:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:40:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:40:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f285d71e-6d99-440a-8549-56e6a3710e3e\\\",\\\"systemUUID\\\":\\\"9f2b45ef-ead6-4cce-86c8-26b6d26ee095\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:40:01Z is after 
2025-08-24T17:21:41Z" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.813299 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.813342 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.813355 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.813373 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.813386 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:01Z","lastTransitionTime":"2026-02-02T10:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:01 crc kubenswrapper[4901]: E0202 10:40:01.827446 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:40:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:40:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:40:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:40:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:40:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:40:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:40:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:40:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f285d71e-6d99-440a-8549-56e6a3710e3e\\\",\\\"systemUUID\\\":\\\"9f2b45ef-ead6-4cce-86c8-26b6d26ee095\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:40:01Z is after 
2025-08-24T17:21:41Z" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.832105 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.832156 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.832169 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.832186 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.832198 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:01Z","lastTransitionTime":"2026-02-02T10:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:01 crc kubenswrapper[4901]: E0202 10:40:01.847875 4901 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:40:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:40:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:40:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:40:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:40:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:40:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:40:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:40:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f285d71e-6d99-440a-8549-56e6a3710e3e\\\",\\\"systemUUID\\\":\\\"9f2b45ef-ead6-4cce-86c8-26b6d26ee095\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:40:01Z is after 
2025-08-24T17:21:41Z" Feb 02 10:40:01 crc kubenswrapper[4901]: E0202 10:40:01.848033 4901 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.850031 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.850068 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.850079 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.850095 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.850103 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:01Z","lastTransitionTime":"2026-02-02T10:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.952573 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.952645 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.952658 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.952676 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:01 crc kubenswrapper[4901]: I0202 10:40:01.952689 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:01Z","lastTransitionTime":"2026-02-02T10:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.055791 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.055902 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.055922 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.055982 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.056001 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:02Z","lastTransitionTime":"2026-02-02T10:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.158109 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.158161 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.158172 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.158190 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.158200 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:02Z","lastTransitionTime":"2026-02-02T10:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.261848 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.262030 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.262060 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.262219 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.262254 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:02Z","lastTransitionTime":"2026-02-02T10:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.364874 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.364963 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.364976 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.365028 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.365044 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:02Z","lastTransitionTime":"2026-02-02T10:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.468725 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.468788 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.468804 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.468828 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.468845 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:02Z","lastTransitionTime":"2026-02-02T10:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.573760 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.573823 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.573836 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.573858 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.573878 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:02Z","lastTransitionTime":"2026-02-02T10:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.650062 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 00:04:40.317922813 +0000 UTC Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.676103 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.676160 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.676174 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.676189 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.676202 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:02Z","lastTransitionTime":"2026-02-02T10:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.779495 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.779530 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.779557 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.779589 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.779599 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:02Z","lastTransitionTime":"2026-02-02T10:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.882315 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.882366 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.882380 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.882399 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:40:02 crc kubenswrapper[4901]: I0202 10:40:02.882415 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:02Z","lastTransitionTime":"2026-02-02T10:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:40:03 crc kubenswrapper[4901]: I0202 10:40:03.651157 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 05:25:51.774417402 +0000 UTC
Feb 02 10:40:03 crc kubenswrapper[4901]: I0202 10:40:03.676652 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 02 10:40:03 crc kubenswrapper[4901]: I0202 10:40:03.676914 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg"
Feb 02 10:40:03 crc kubenswrapper[4901]: I0202 10:40:03.676968 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 02 10:40:03 crc kubenswrapper[4901]: I0202 10:40:03.676973 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:40:03 crc kubenswrapper[4901]: E0202 10:40:03.677095 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:40:03 crc kubenswrapper[4901]: E0202 10:40:03.677240 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:40:03 crc kubenswrapper[4901]: E0202 10:40:03.677510 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:40:03 crc kubenswrapper[4901]: E0202 10:40:03.677635 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:40:03 crc kubenswrapper[4901]: I0202 10:40:03.695221 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb836b18e6cb07844e8dfb32c28798089fc4fbc5b7f89ac3c1d595c8dd6de6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa3b8a5db87f56250b50a5c703beff1e6b11e10f1e11271ea4c7363b262d52d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:40:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:40:03 crc kubenswrapper[4901]: I0202 10:40:03.711882 4901 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:40:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:40:03 crc kubenswrapper[4901]: I0202 10:40:03.713003 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:03 crc kubenswrapper[4901]: I0202 10:40:03.713097 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:03 crc kubenswrapper[4901]: I0202 10:40:03.713116 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:03 crc kubenswrapper[4901]: I0202 10:40:03.713172 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:03 crc kubenswrapper[4901]: I0202 10:40:03.713247 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:03Z","lastTransitionTime":"2026-02-02T10:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:03 crc kubenswrapper[4901]: I0202 10:40:03.730419 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xj56" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e34f1db7-7f2a-4252-af0b-49fa172495f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4eefab588023c2d4a4d15786ba41c1b66a121eeef18edda4bdcc02d162975b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:38:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xj56\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:40:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:40:03 crc kubenswrapper[4901]: I0202 10:40:03.746387 4901 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5q92h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19eb421a-49aa-4cde-ae5e-3aba70ee67f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:38:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:39:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15d92feb87ef4644f20d56395e4ec742bb94c251371c8198e0d7257c3d21a68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c29de6223de90bbfe98956d36ce20b9b57b319a5a825118f8eeabf13148e4c9f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:39:44Z\\\",\\\"message\\\":\\\"2026-02-02T10:38:57+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ca2adcc6-a994-4d3c-9ca3-e7ba68683137\\\\n2026-02-02T10:38:57+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ca2adcc6-a994-4d3c-9ca3-e7ba68683137 to /host/opt/cni/bin/\\\\n2026-02-02T10:38:59Z [verbose] multus-daemon started\\\\n2026-02-02T10:38:59Z [verbose] Readiness Indicator file check\\\\n2026-02-02T10:39:44Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:38:56Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:39:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7q2pl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:38:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5q92h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:40:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:40:03 crc kubenswrapper[4901]: I0202 10:40:03.798910 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=12.79887769 podStartE2EDuration="12.79887769s" podCreationTimestamp="2026-02-02 10:39:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:40:03.778451069 +0000 UTC m=+90.796791235" watchObservedRunningTime="2026-02-02 10:40:03.79887769 +0000 UTC m=+90.817217826" Feb 02 10:40:03 crc kubenswrapper[4901]: I0202 10:40:03.816170 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:03 crc kubenswrapper[4901]: I0202 10:40:03.816219 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:03 crc kubenswrapper[4901]: I0202 
10:40:03.816236 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:40:03 crc kubenswrapper[4901]: I0202 10:40:03.816261 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:40:03 crc kubenswrapper[4901]: I0202 10:40:03.816280 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:03Z","lastTransitionTime":"2026-02-02T10:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:40:03 crc kubenswrapper[4901]: I0202 10:40:03.879706 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podStartSLOduration=69.879681769 podStartE2EDuration="1m9.879681769s" podCreationTimestamp="2026-02-02 10:38:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:40:03.879627938 +0000 UTC m=+90.897968094" watchObservedRunningTime="2026-02-02 10:40:03.879681769 +0000 UTC m=+90.898021865"
Feb 02 10:40:03 crc kubenswrapper[4901]: I0202 10:40:03.899231 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s8l22" podStartSLOduration=68.89919887 podStartE2EDuration="1m8.89919887s" podCreationTimestamp="2026-02-02 10:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:40:03.89878822 +0000 UTC m=+90.917128326" watchObservedRunningTime="2026-02-02 10:40:03.89919887 +0000 UTC m=+90.917539006"
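
The status patches above all fail with the same root cause: the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 serves a certificate that expired on 2025-08-24, while the node's clock reads 2026-02-02. A minimal Go sketch like the following (a hypothetical diagnostic, not part of the kubelet) can pull the serving certificate off that endpoint and print its validity window; InsecureSkipVerify is used only so the handshake completes far enough to read the dates.

package main

import (
	"crypto/tls"
	"fmt"
	"time"
)

func main() {
	// Address taken from the webhook error above; skip verification only so
	// the handshake completes and the expired certificate can be inspected.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		fmt.Println("handshake failed:", err)
		return
	}
	defer conn.Close()

	cert := conn.ConnectionState().PeerCertificates[0]
	fmt.Println("subject:  ", cert.Subject)
	fmt.Println("notBefore:", cert.NotBefore)
	fmt.Println("notAfter: ", cert.NotAfter)
	if time.Now().After(cert.NotAfter) {
		fmt.Println("certificate has expired, matching the x509 errors in the log")
	}
}
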
Feb 02 10:40:03 crc kubenswrapper[4901]: I0202 10:40:03.938180 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=70.93815131 podStartE2EDuration="1m10.93815131s" podCreationTimestamp="2026-02-02 10:38:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:40:03.937445113 +0000 UTC m=+90.955785289" watchObservedRunningTime="2026-02-02 10:40:03.93815131 +0000 UTC m=+90.956491436"
Feb 02 10:40:03 crc kubenswrapper[4901]: I0202 10:40:03.972000 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=70.971966728 podStartE2EDuration="1m10.971966728s" podCreationTimestamp="2026-02-02 10:38:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:40:03.954191958 +0000 UTC m=+90.972532064" watchObservedRunningTime="2026-02-02 10:40:03.971966728 +0000 UTC m=+90.990306834"
Feb 02 10:40:03 crc kubenswrapper[4901]: I0202 10:40:03.985664 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=41.985646071 podStartE2EDuration="41.985646071s" podCreationTimestamp="2026-02-02 10:39:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:40:03.971115448 +0000 UTC m=+90.989455554" watchObservedRunningTime="2026-02-02 10:40:03.985646071 +0000 UTC m=+91.003986177"
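
Every NodeNotReady heartbeat above carries the same message: no CNI configuration file in /etc/kubernetes/cni/net.d/. That check can be reproduced outside the kubelet by listing the directory for the usual CNI config extensions; the extension set below is an assumption based on common CNI conventions, not a copy of the kubelet's exact logic.

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	dir := "/etc/kubernetes/cni/net.d" // directory named in the kubelet error above
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Println("cannot read CNI config dir:", err)
		return
	}
	found := false
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json": // typical CNI config extensions (assumption)
			fmt.Println("CNI config present:", e.Name())
			found = true
		}
	}
	if !found {
		fmt.Println("no CNI configuration file found; network plugin not ready")
	}
}
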
Feb 02 10:40:04 crc kubenswrapper[4901]: I0202 10:40:04.024825 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=7.024800056 podStartE2EDuration="7.024800056s" podCreationTimestamp="2026-02-02 10:39:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:40:04.023747271 +0000 UTC m=+91.042087377" watchObservedRunningTime="2026-02-02 10:40:04.024800056 +0000 UTC m=+91.043140162"
Feb 02 10:40:04 crc kubenswrapper[4901]: I0202 10:40:04.025539 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-mcs8s" podStartSLOduration=70.025533083 podStartE2EDuration="1m10.025533083s" podCreationTimestamp="2026-02-02 10:38:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:40:03.985870916 +0000 UTC m=+91.004211052" watchObservedRunningTime="2026-02-02 10:40:04.025533083 +0000 UTC m=+91.043873189"
Feb 02 10:40:04 crc kubenswrapper[4901]: I0202 10:40:04.084880 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-flw48" podStartSLOduration=70.084846214 podStartE2EDuration="1m10.084846214s" podCreationTimestamp="2026-02-02 10:38:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:40:04.053183086 +0000 UTC m=+91.071523212" watchObservedRunningTime="2026-02-02 10:40:04.084846214 +0000 UTC m=+91.103186320"
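
The multus-5q92h termination message earlier in the log shows why the kube-multus container restarted once: it polls for a readiness indicator file, /host/run/multus/cni/net.d/10-ovn-kubernetes.conf, written by the default network, and gives up when the poll times out (the log shows roughly 45 seconds between daemon start at 10:38:59 and the error at 10:39:44). A rough sketch of that wait, with the path and 45-second budget taken from the log and a one-second polling interval assumed:

package main

import (
	"fmt"
	"os"
	"time"
)

func main() {
	const indicator = "/host/run/multus/cni/net.d/10-ovn-kubernetes.conf"
	deadline := time.Now().Add(45 * time.Second) // budget observed in the log
	for time.Now().Before(deadline) {
		if _, err := os.Stat(indicator); err == nil {
			fmt.Println("readiness indicator present:", indicator)
			return
		}
		time.Sleep(time.Second) // assumed polling interval
	}
	fmt.Println("timed out waiting for", indicator, "- default network not ready")
}
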
Feb 02 10:40:04 crc kubenswrapper[4901]: I0202 10:40:04.652227 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 23:18:21.998650875 +0000 UTC
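
The certificate_manager lines are worth a second look: the serving certificate is valid until 2026-02-24, yet every computed rotation deadline (2025-12-29, 2025-12-06, and later 2025-12-17) already lies in the past on a clock reading 2026-02-02, which is why the kubelet recomputes a fresh deadline each second and keeps trying to rotate. client-go picks the deadline at a jittered point late in the certificate's lifetime; the sketch below only illustrates the idea, with an assumed 70-90% jitter band and an assumed one-year issue date rather than the exact upstream constants.

package main

import (
	"fmt"
	"math/rand"
	"time"
)

func main() {
	// Expiry copied from the log; a one-year lifetime is assumed to place notBefore.
	notAfter := time.Date(2026, 2, 24, 5, 53, 3, 0, time.UTC)
	notBefore := notAfter.AddDate(-1, 0, 0)
	lifetime := notAfter.Sub(notBefore)

	// Deadline at a jittered point late in the lifetime (band is an assumption).
	frac := 0.7 + 0.2*rand.Float64()
	deadline := notBefore.Add(time.Duration(float64(lifetime) * frac))

	fmt.Println("rotation deadline:", deadline)
	if time.Now().After(deadline) {
		fmt.Println("deadline already passed; rotation is attempted immediately")
	}
}
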
Feb 02 10:40:05 crc kubenswrapper[4901]: I0202 10:40:05.652810 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 18:10:07.706799436 +0000 UTC
Feb 02 10:40:05 crc kubenswrapper[4901]: I0202 10:40:05.676228 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 02 10:40:05 crc kubenswrapper[4901]: I0202 10:40:05.676266 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 02 10:40:05 crc kubenswrapper[4901]: I0202 10:40:05.676364 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg"
Feb 02 10:40:05 crc kubenswrapper[4901]: I0202 10:40:05.676361 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 02 10:40:05 crc kubenswrapper[4901]: E0202 10:40:05.677353 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:40:05 crc kubenswrapper[4901]: I0202 10:40:05.677886 4901 scope.go:117] "RemoveContainer" containerID="53303f4cfb222a1eb4b36c8d625a1470c2e40b5b064700fa8e0c7660666678c8" Feb 02 10:40:05 crc kubenswrapper[4901]: E0202 10:40:05.678180 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vm8h5_openshift-ovn-kubernetes(a3390481-846a-4742-9eae-0796b667897f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" podUID="a3390481-846a-4742-9eae-0796b667897f" Feb 02 10:40:05 crc kubenswrapper[4901]: E0202 10:40:05.678233 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:40:05 crc kubenswrapper[4901]: E0202 10:40:05.678372 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:40:05 crc kubenswrapper[4901]: E0202 10:40:05.677873 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:40:05 crc kubenswrapper[4901]: I0202 10:40:05.681422 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:05 crc kubenswrapper[4901]: I0202 10:40:05.681487 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:05 crc kubenswrapper[4901]: I0202 10:40:05.681503 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:05 crc kubenswrapper[4901]: I0202 10:40:05.682073 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:05 crc kubenswrapper[4901]: I0202 10:40:05.682128 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:05Z","lastTransitionTime":"2026-02-02T10:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
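
The "back-off 40s restarting failed container=ovnkube-controller" error above is the kubelet's crash-loop back-off at work: the delay starts at 10 seconds and doubles on each consecutive failure up to a five-minute cap, so 40s corresponds to a third consecutive restart. A small sketch of that series, using the well-known kubelet defaults:

package main

import (
	"fmt"
	"time"
)

func main() {
	// Kubelet's default crash-loop back-off: 10s initial delay, doubling, 5m cap.
	delay := 10 * time.Second
	const maxDelay = 5 * time.Minute
	for failures := 1; failures <= 6; failures++ {
		fmt.Printf("consecutive failure %d -> back-off %v\n", failures, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}
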
Has your network provider started?"} Feb 02 10:40:06 crc kubenswrapper[4901]: I0202 10:40:06.653877 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 01:34:07.135231712 +0000 UTC Feb 02 10:40:06 crc kubenswrapper[4901]: I0202 10:40:06.723106 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:06 crc kubenswrapper[4901]: I0202 10:40:06.723191 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:06 crc kubenswrapper[4901]: I0202 10:40:06.723213 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:06 crc kubenswrapper[4901]: I0202 10:40:06.723241 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:06 crc kubenswrapper[4901]: I0202 10:40:06.723261 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:06Z","lastTransitionTime":"2026-02-02T10:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:06 crc kubenswrapper[4901]: I0202 10:40:06.827797 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:06 crc kubenswrapper[4901]: I0202 10:40:06.827952 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:06 crc kubenswrapper[4901]: I0202 10:40:06.828040 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:06 crc kubenswrapper[4901]: I0202 10:40:06.828127 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:06 crc kubenswrapper[4901]: I0202 10:40:06.828159 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:06Z","lastTransitionTime":"2026-02-02T10:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:40:06 crc kubenswrapper[4901]: I0202 10:40:06.932433 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:06 crc kubenswrapper[4901]: I0202 10:40:06.932487 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:06 crc kubenswrapper[4901]: I0202 10:40:06.932508 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:06 crc kubenswrapper[4901]: I0202 10:40:06.932533 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:06 crc kubenswrapper[4901]: I0202 10:40:06.932555 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:06Z","lastTransitionTime":"2026-02-02T10:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.035999 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.036058 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.036077 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.036101 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.036119 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:07Z","lastTransitionTime":"2026-02-02T10:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.139876 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.139942 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.139956 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.139981 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.139998 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:07Z","lastTransitionTime":"2026-02-02T10:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.242183 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.242233 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.242247 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.242268 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.242281 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:07Z","lastTransitionTime":"2026-02-02T10:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.344981 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.345042 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.345057 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.345080 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.345095 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:07Z","lastTransitionTime":"2026-02-02T10:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.448787 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.448833 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.448843 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.448861 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.448871 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:07Z","lastTransitionTime":"2026-02-02T10:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.551435 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.551534 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.551599 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.551644 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.551675 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:07Z","lastTransitionTime":"2026-02-02T10:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.654080 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 23:18:34.994501037 +0000 UTC Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.655843 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.655909 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.655927 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.655995 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.656012 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:07Z","lastTransitionTime":"2026-02-02T10:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.676749 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.676783 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.676889 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.676969 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:40:07 crc kubenswrapper[4901]: E0202 10:40:07.677044 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:40:07 crc kubenswrapper[4901]: E0202 10:40:07.677240 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:40:07 crc kubenswrapper[4901]: E0202 10:40:07.677373 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:40:07 crc kubenswrapper[4901]: E0202 10:40:07.677534 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.759529 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.759611 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.759632 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.759655 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.759668 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:07Z","lastTransitionTime":"2026-02-02T10:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.863576 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.863636 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.863649 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.863674 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.863687 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:07Z","lastTransitionTime":"2026-02-02T10:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.966751 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.966811 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.966828 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.966856 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:07 crc kubenswrapper[4901]: I0202 10:40:07.966875 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:07Z","lastTransitionTime":"2026-02-02T10:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.070668 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.070731 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.070746 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.070772 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.070793 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:08Z","lastTransitionTime":"2026-02-02T10:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.173395 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.173471 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.173488 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.173517 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.173536 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:08Z","lastTransitionTime":"2026-02-02T10:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.288967 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.289064 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.289098 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.289131 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.289153 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:08Z","lastTransitionTime":"2026-02-02T10:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.391752 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.391824 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.391846 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.391876 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.391899 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:08Z","lastTransitionTime":"2026-02-02T10:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.494608 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.494712 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.494735 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.494763 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.494780 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:08Z","lastTransitionTime":"2026-02-02T10:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.598663 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.598742 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.598766 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.598804 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.598873 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:08Z","lastTransitionTime":"2026-02-02T10:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.654689 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 09:29:57.181302749 +0000 UTC Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.702133 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.702229 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.702262 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.702294 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.702316 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:08Z","lastTransitionTime":"2026-02-02T10:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.805616 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.805702 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.805731 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.805764 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.805790 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:08Z","lastTransitionTime":"2026-02-02T10:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.908755 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.908845 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.908878 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.908909 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:08 crc kubenswrapper[4901]: I0202 10:40:08.908930 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:08Z","lastTransitionTime":"2026-02-02T10:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.012631 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.012696 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.012726 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.012757 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.012780 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:09Z","lastTransitionTime":"2026-02-02T10:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.116098 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.116162 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.116179 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.116210 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.116228 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:09Z","lastTransitionTime":"2026-02-02T10:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.220697 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.220785 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.220817 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.220864 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.220895 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:09Z","lastTransitionTime":"2026-02-02T10:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.324508 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.324655 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.324682 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.324721 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.324829 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:09Z","lastTransitionTime":"2026-02-02T10:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.427447 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.427533 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.427550 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.427788 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.427807 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:09Z","lastTransitionTime":"2026-02-02T10:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.532842 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.533098 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.533128 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.533206 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.533226 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:09Z","lastTransitionTime":"2026-02-02T10:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.637017 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.637078 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.637091 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.637110 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.637123 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:09Z","lastTransitionTime":"2026-02-02T10:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.655909 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 18:05:57.45753215 +0000 UTC Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.676059 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.676178 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.676214 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.676109 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:40:09 crc kubenswrapper[4901]: E0202 10:40:09.676325 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:40:09 crc kubenswrapper[4901]: E0202 10:40:09.676495 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:40:09 crc kubenswrapper[4901]: E0202 10:40:09.676699 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:40:09 crc kubenswrapper[4901]: E0202 10:40:09.676770 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.740313 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.740364 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.740374 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.740390 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.740400 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:09Z","lastTransitionTime":"2026-02-02T10:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.844094 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.844167 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.844189 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.844224 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.844244 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:09Z","lastTransitionTime":"2026-02-02T10:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.947643 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.947692 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.947708 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.947732 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:09 crc kubenswrapper[4901]: I0202 10:40:09.947749 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:09Z","lastTransitionTime":"2026-02-02T10:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.051085 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.051152 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.051170 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.051198 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.051224 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:10Z","lastTransitionTime":"2026-02-02T10:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.154241 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.154286 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.154298 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.154315 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.154325 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:10Z","lastTransitionTime":"2026-02-02T10:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.258201 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.258236 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.258245 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.258260 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.258271 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:10Z","lastTransitionTime":"2026-02-02T10:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.361082 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.361117 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.361126 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.361140 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.361201 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:10Z","lastTransitionTime":"2026-02-02T10:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.463507 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.463844 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.464008 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.464115 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.464189 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:10Z","lastTransitionTime":"2026-02-02T10:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.568659 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.568779 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.568804 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.568849 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.568875 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:10Z","lastTransitionTime":"2026-02-02T10:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.656766 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 09:30:57.481174291 +0000 UTC Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.672934 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.673045 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.673072 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.673108 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.673144 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:10Z","lastTransitionTime":"2026-02-02T10:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.777046 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.777105 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.777116 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.777139 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.777154 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:10Z","lastTransitionTime":"2026-02-02T10:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.880512 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.880574 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.880586 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.880607 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.880618 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:10Z","lastTransitionTime":"2026-02-02T10:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.983382 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.983439 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.983454 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.983476 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:10 crc kubenswrapper[4901]: I0202 10:40:10.983494 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:10Z","lastTransitionTime":"2026-02-02T10:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.087117 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.087161 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.087170 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.087185 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.087194 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:11Z","lastTransitionTime":"2026-02-02T10:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.190500 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.190541 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.190620 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.190639 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.190652 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:11Z","lastTransitionTime":"2026-02-02T10:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.293772 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.293843 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.293857 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.293878 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.293890 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:11Z","lastTransitionTime":"2026-02-02T10:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.396809 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.396867 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.396882 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.396903 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.396920 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:11Z","lastTransitionTime":"2026-02-02T10:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.500217 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.500338 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.500359 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.500384 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.500403 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:11Z","lastTransitionTime":"2026-02-02T10:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.604931 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.605006 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.605026 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.605055 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.605072 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:11Z","lastTransitionTime":"2026-02-02T10:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.658124 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 01:45:53.175065957 +0000 UTC Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.675816 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.675817 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:40:11 crc kubenswrapper[4901]: E0202 10:40:11.675966 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.676022 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:40:11 crc kubenswrapper[4901]: E0202 10:40:11.676250 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:40:11 crc kubenswrapper[4901]: E0202 10:40:11.676338 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.676766 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:40:11 crc kubenswrapper[4901]: E0202 10:40:11.676897 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.708144 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.708229 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.708247 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.708318 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.708335 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:11Z","lastTransitionTime":"2026-02-02T10:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.811278 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.811371 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.811395 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.811440 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.811463 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:11Z","lastTransitionTime":"2026-02-02T10:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.915237 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.915315 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.915333 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.915363 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:11 crc kubenswrapper[4901]: I0202 10:40:11.915381 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:11Z","lastTransitionTime":"2026-02-02T10:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:40:12 crc kubenswrapper[4901]: I0202 10:40:12.000946 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:40:12 crc kubenswrapper[4901]: I0202 10:40:12.000994 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:40:12 crc kubenswrapper[4901]: I0202 10:40:12.001004 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:40:12 crc kubenswrapper[4901]: I0202 10:40:12.001023 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:40:12 crc kubenswrapper[4901]: I0202 10:40:12.001033 4901 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:40:12Z","lastTransitionTime":"2026-02-02T10:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:40:12 crc kubenswrapper[4901]: I0202 10:40:12.064344 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-nr4z4"] Feb 02 10:40:12 crc kubenswrapper[4901]: I0202 10:40:12.065402 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nr4z4" Feb 02 10:40:12 crc kubenswrapper[4901]: I0202 10:40:12.069100 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 02 10:40:12 crc kubenswrapper[4901]: I0202 10:40:12.069132 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 02 10:40:12 crc kubenswrapper[4901]: I0202 10:40:12.069471 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 02 10:40:12 crc kubenswrapper[4901]: I0202 10:40:12.074766 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 02 10:40:12 crc kubenswrapper[4901]: I0202 10:40:12.136377 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8d2a9cdd-ca85-4d67-8ea6-4396cde3062f-service-ca\") pod \"cluster-version-operator-5c965bbfc6-nr4z4\" (UID: \"8d2a9cdd-ca85-4d67-8ea6-4396cde3062f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nr4z4" Feb 02 10:40:12 crc kubenswrapper[4901]: I0202 10:40:12.136473 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8d2a9cdd-ca85-4d67-8ea6-4396cde3062f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-nr4z4\" (UID: \"8d2a9cdd-ca85-4d67-8ea6-4396cde3062f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nr4z4" Feb 02 10:40:12 crc kubenswrapper[4901]: I0202 10:40:12.136507 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d2a9cdd-ca85-4d67-8ea6-4396cde3062f-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-nr4z4\" (UID: \"8d2a9cdd-ca85-4d67-8ea6-4396cde3062f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nr4z4" Feb 02 10:40:12 crc kubenswrapper[4901]: I0202 10:40:12.136527 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d2a9cdd-ca85-4d67-8ea6-4396cde3062f-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-nr4z4\" (UID: \"8d2a9cdd-ca85-4d67-8ea6-4396cde3062f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nr4z4" Feb 02 10:40:12 crc kubenswrapper[4901]: I0202 10:40:12.136662 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8d2a9cdd-ca85-4d67-8ea6-4396cde3062f-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-nr4z4\" (UID: \"8d2a9cdd-ca85-4d67-8ea6-4396cde3062f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nr4z4" Feb 02 10:40:12 crc kubenswrapper[4901]: I0202 10:40:12.156753 4901 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-multus/multus-5q92h" podStartSLOduration=78.156715263 podStartE2EDuration="1m18.156715263s" podCreationTimestamp="2026-02-02 10:38:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:40:12.156082577 +0000 UTC m=+99.174422703" watchObservedRunningTime="2026-02-02 10:40:12.156715263 +0000 UTC m=+99.175055399" Feb 02 10:40:12 crc kubenswrapper[4901]: I0202 10:40:12.158808 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-5xj56" podStartSLOduration=78.158796931 podStartE2EDuration="1m18.158796931s" podCreationTimestamp="2026-02-02 10:38:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:40:12.128900746 +0000 UTC m=+99.147240842" watchObservedRunningTime="2026-02-02 10:40:12.158796931 +0000 UTC m=+99.177137067" Feb 02 10:40:12 crc kubenswrapper[4901]: I0202 10:40:12.238306 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d2a9cdd-ca85-4d67-8ea6-4396cde3062f-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-nr4z4\" (UID: \"8d2a9cdd-ca85-4d67-8ea6-4396cde3062f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nr4z4" Feb 02 10:40:12 crc kubenswrapper[4901]: I0202 10:40:12.238388 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d2a9cdd-ca85-4d67-8ea6-4396cde3062f-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-nr4z4\" (UID: \"8d2a9cdd-ca85-4d67-8ea6-4396cde3062f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nr4z4" Feb 02 10:40:12 crc kubenswrapper[4901]: I0202 10:40:12.238480 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8d2a9cdd-ca85-4d67-8ea6-4396cde3062f-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-nr4z4\" (UID: \"8d2a9cdd-ca85-4d67-8ea6-4396cde3062f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nr4z4" Feb 02 10:40:12 crc kubenswrapper[4901]: I0202 10:40:12.238523 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8d2a9cdd-ca85-4d67-8ea6-4396cde3062f-service-ca\") pod \"cluster-version-operator-5c965bbfc6-nr4z4\" (UID: \"8d2a9cdd-ca85-4d67-8ea6-4396cde3062f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nr4z4" Feb 02 10:40:12 crc kubenswrapper[4901]: I0202 10:40:12.238669 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8d2a9cdd-ca85-4d67-8ea6-4396cde3062f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-nr4z4\" (UID: \"8d2a9cdd-ca85-4d67-8ea6-4396cde3062f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nr4z4" Feb 02 10:40:12 crc kubenswrapper[4901]: I0202 10:40:12.238816 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8d2a9cdd-ca85-4d67-8ea6-4396cde3062f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-nr4z4\" (UID: \"8d2a9cdd-ca85-4d67-8ea6-4396cde3062f\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nr4z4" Feb 02 10:40:12 crc kubenswrapper[4901]: I0202 10:40:12.239546 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8d2a9cdd-ca85-4d67-8ea6-4396cde3062f-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-nr4z4\" (UID: \"8d2a9cdd-ca85-4d67-8ea6-4396cde3062f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nr4z4" Feb 02 10:40:12 crc kubenswrapper[4901]: I0202 10:40:12.241555 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8d2a9cdd-ca85-4d67-8ea6-4396cde3062f-service-ca\") pod \"cluster-version-operator-5c965bbfc6-nr4z4\" (UID: \"8d2a9cdd-ca85-4d67-8ea6-4396cde3062f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nr4z4" Feb 02 10:40:12 crc kubenswrapper[4901]: I0202 10:40:12.246279 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d2a9cdd-ca85-4d67-8ea6-4396cde3062f-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-nr4z4\" (UID: \"8d2a9cdd-ca85-4d67-8ea6-4396cde3062f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nr4z4" Feb 02 10:40:12 crc kubenswrapper[4901]: I0202 10:40:12.264132 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d2a9cdd-ca85-4d67-8ea6-4396cde3062f-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-nr4z4\" (UID: \"8d2a9cdd-ca85-4d67-8ea6-4396cde3062f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nr4z4" Feb 02 10:40:12 crc kubenswrapper[4901]: I0202 10:40:12.397533 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nr4z4" Feb 02 10:40:12 crc kubenswrapper[4901]: I0202 10:40:12.658372 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 21:03:39.397977722 +0000 UTC Feb 02 10:40:12 crc kubenswrapper[4901]: I0202 10:40:12.658915 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 02 10:40:12 crc kubenswrapper[4901]: I0202 10:40:12.670464 4901 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 02 10:40:13 crc kubenswrapper[4901]: I0202 10:40:13.310343 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nr4z4" event={"ID":"8d2a9cdd-ca85-4d67-8ea6-4396cde3062f","Type":"ContainerStarted","Data":"4572b4477a2ee093e35f8d11f9fff67a03af6a888f91488e2c41f7af6c556064"} Feb 02 10:40:13 crc kubenswrapper[4901]: I0202 10:40:13.310406 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nr4z4" event={"ID":"8d2a9cdd-ca85-4d67-8ea6-4396cde3062f","Type":"ContainerStarted","Data":"f120afa21a100cddcba0691761f861f4414e2ed7677107d8f19ee1921b160f01"} Feb 02 10:40:13 crc kubenswrapper[4901]: I0202 10:40:13.676655 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:40:13 crc kubenswrapper[4901]: I0202 10:40:13.676787 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:40:13 crc kubenswrapper[4901]: I0202 10:40:13.676615 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:40:13 crc kubenswrapper[4901]: I0202 10:40:13.677916 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:40:13 crc kubenswrapper[4901]: E0202 10:40:13.678366 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:40:13 crc kubenswrapper[4901]: E0202 10:40:13.678616 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:40:13 crc kubenswrapper[4901]: E0202 10:40:13.678929 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:40:13 crc kubenswrapper[4901]: E0202 10:40:13.679011 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:40:14 crc kubenswrapper[4901]: I0202 10:40:14.059289 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b96d903e-a64c-4321-8963-482d4b579e30-metrics-certs\") pod \"network-metrics-daemon-fmjwg\" (UID: \"b96d903e-a64c-4321-8963-482d4b579e30\") " pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:40:14 crc kubenswrapper[4901]: E0202 10:40:14.059517 4901 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:40:14 crc kubenswrapper[4901]: E0202 10:40:14.059622 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b96d903e-a64c-4321-8963-482d4b579e30-metrics-certs podName:b96d903e-a64c-4321-8963-482d4b579e30 nodeName:}" failed. 
No retries permitted until 2026-02-02 10:41:18.05960409 +0000 UTC m=+165.077944196 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b96d903e-a64c-4321-8963-482d4b579e30-metrics-certs") pod "network-metrics-daemon-fmjwg" (UID: "b96d903e-a64c-4321-8963-482d4b579e30") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:40:15 crc kubenswrapper[4901]: I0202 10:40:15.675930 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:40:15 crc kubenswrapper[4901]: I0202 10:40:15.676032 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:40:15 crc kubenswrapper[4901]: E0202 10:40:15.676210 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:40:15 crc kubenswrapper[4901]: I0202 10:40:15.676301 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:40:15 crc kubenswrapper[4901]: I0202 10:40:15.676364 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:40:15 crc kubenswrapper[4901]: E0202 10:40:15.676667 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:40:15 crc kubenswrapper[4901]: E0202 10:40:15.677369 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:40:15 crc kubenswrapper[4901]: E0202 10:40:15.677442 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:40:16 crc kubenswrapper[4901]: I0202 10:40:16.677724 4901 scope.go:117] "RemoveContainer" containerID="53303f4cfb222a1eb4b36c8d625a1470c2e40b5b064700fa8e0c7660666678c8" Feb 02 10:40:16 crc kubenswrapper[4901]: E0202 10:40:16.678161 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vm8h5_openshift-ovn-kubernetes(a3390481-846a-4742-9eae-0796b667897f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" podUID="a3390481-846a-4742-9eae-0796b667897f" Feb 02 10:40:17 crc kubenswrapper[4901]: I0202 10:40:17.675925 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:40:17 crc kubenswrapper[4901]: I0202 10:40:17.675968 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:40:17 crc kubenswrapper[4901]: I0202 10:40:17.676101 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:40:17 crc kubenswrapper[4901]: E0202 10:40:17.676315 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:40:17 crc kubenswrapper[4901]: E0202 10:40:17.676394 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:40:17 crc kubenswrapper[4901]: I0202 10:40:17.676407 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:40:17 crc kubenswrapper[4901]: E0202 10:40:17.676497 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:40:17 crc kubenswrapper[4901]: E0202 10:40:17.676619 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:40:19 crc kubenswrapper[4901]: I0202 10:40:19.676187 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:40:19 crc kubenswrapper[4901]: I0202 10:40:19.676236 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:40:19 crc kubenswrapper[4901]: I0202 10:40:19.676255 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:40:19 crc kubenswrapper[4901]: I0202 10:40:19.676253 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:40:19 crc kubenswrapper[4901]: E0202 10:40:19.676462 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:40:19 crc kubenswrapper[4901]: E0202 10:40:19.676688 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:40:19 crc kubenswrapper[4901]: E0202 10:40:19.676781 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:40:19 crc kubenswrapper[4901]: E0202 10:40:19.677104 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:40:21 crc kubenswrapper[4901]: I0202 10:40:21.675850 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:40:21 crc kubenswrapper[4901]: E0202 10:40:21.676060 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:40:21 crc kubenswrapper[4901]: I0202 10:40:21.676514 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:40:21 crc kubenswrapper[4901]: I0202 10:40:21.676677 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:40:21 crc kubenswrapper[4901]: E0202 10:40:21.676698 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:40:21 crc kubenswrapper[4901]: I0202 10:40:21.676954 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:40:21 crc kubenswrapper[4901]: E0202 10:40:21.677084 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:40:21 crc kubenswrapper[4901]: E0202 10:40:21.677356 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:40:23 crc kubenswrapper[4901]: I0202 10:40:23.675790 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:40:23 crc kubenswrapper[4901]: I0202 10:40:23.675805 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:40:23 crc kubenswrapper[4901]: I0202 10:40:23.675882 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:40:23 crc kubenswrapper[4901]: E0202 10:40:23.677914 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:40:23 crc kubenswrapper[4901]: I0202 10:40:23.678083 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:40:23 crc kubenswrapper[4901]: E0202 10:40:23.678097 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:40:23 crc kubenswrapper[4901]: E0202 10:40:23.678214 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:40:23 crc kubenswrapper[4901]: E0202 10:40:23.678394 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:40:25 crc kubenswrapper[4901]: I0202 10:40:25.676982 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:40:25 crc kubenswrapper[4901]: I0202 10:40:25.677104 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:40:25 crc kubenswrapper[4901]: I0202 10:40:25.677135 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:40:25 crc kubenswrapper[4901]: E0202 10:40:25.678019 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:40:25 crc kubenswrapper[4901]: E0202 10:40:25.678180 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:40:25 crc kubenswrapper[4901]: I0202 10:40:25.677182 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:40:25 crc kubenswrapper[4901]: E0202 10:40:25.678372 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:40:25 crc kubenswrapper[4901]: E0202 10:40:25.678969 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:40:27 crc kubenswrapper[4901]: I0202 10:40:27.676093 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:40:27 crc kubenswrapper[4901]: E0202 10:40:27.676258 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:40:27 crc kubenswrapper[4901]: I0202 10:40:27.676482 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:40:27 crc kubenswrapper[4901]: E0202 10:40:27.676545 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:40:27 crc kubenswrapper[4901]: I0202 10:40:27.677319 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:40:27 crc kubenswrapper[4901]: E0202 10:40:27.677506 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:40:27 crc kubenswrapper[4901]: I0202 10:40:27.677348 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:40:27 crc kubenswrapper[4901]: E0202 10:40:27.677718 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:40:29 crc kubenswrapper[4901]: I0202 10:40:29.677920 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:40:29 crc kubenswrapper[4901]: E0202 10:40:29.678292 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:40:29 crc kubenswrapper[4901]: I0202 10:40:29.679037 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:40:29 crc kubenswrapper[4901]: I0202 10:40:29.679095 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:40:29 crc kubenswrapper[4901]: I0202 10:40:29.679107 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:40:29 crc kubenswrapper[4901]: E0202 10:40:29.679746 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:40:29 crc kubenswrapper[4901]: E0202 10:40:29.679815 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:40:29 crc kubenswrapper[4901]: E0202 10:40:29.679831 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:40:30 crc kubenswrapper[4901]: I0202 10:40:30.380742 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5q92h_19eb421a-49aa-4cde-ae5e-3aba70ee67f4/kube-multus/1.log" Feb 02 10:40:30 crc kubenswrapper[4901]: I0202 10:40:30.381418 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5q92h_19eb421a-49aa-4cde-ae5e-3aba70ee67f4/kube-multus/0.log" Feb 02 10:40:30 crc kubenswrapper[4901]: I0202 10:40:30.381473 4901 generic.go:334] "Generic (PLEG): container finished" podID="19eb421a-49aa-4cde-ae5e-3aba70ee67f4" containerID="15d92feb87ef4644f20d56395e4ec742bb94c251371c8198e0d7257c3d21a68b" exitCode=1 Feb 02 10:40:30 crc kubenswrapper[4901]: I0202 10:40:30.381512 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5q92h" event={"ID":"19eb421a-49aa-4cde-ae5e-3aba70ee67f4","Type":"ContainerDied","Data":"15d92feb87ef4644f20d56395e4ec742bb94c251371c8198e0d7257c3d21a68b"} Feb 02 10:40:30 crc kubenswrapper[4901]: I0202 10:40:30.381590 4901 scope.go:117] "RemoveContainer" containerID="c29de6223de90bbfe98956d36ce20b9b57b319a5a825118f8eeabf13148e4c9f" Feb 02 10:40:30 crc kubenswrapper[4901]: I0202 10:40:30.382337 4901 scope.go:117] "RemoveContainer" containerID="15d92feb87ef4644f20d56395e4ec742bb94c251371c8198e0d7257c3d21a68b" Feb 02 10:40:30 crc kubenswrapper[4901]: E0202 10:40:30.382712 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-5q92h_openshift-multus(19eb421a-49aa-4cde-ae5e-3aba70ee67f4)\"" pod="openshift-multus/multus-5q92h" podUID="19eb421a-49aa-4cde-ae5e-3aba70ee67f4" Feb 02 10:40:30 crc kubenswrapper[4901]: I0202 10:40:30.407200 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nr4z4" podStartSLOduration=96.407164248 podStartE2EDuration="1m36.407164248s" podCreationTimestamp="2026-02-02 10:38:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:40:13.329960709 +0000 UTC m=+100.348300815" watchObservedRunningTime="2026-02-02 10:40:30.407164248 +0000 UTC m=+117.425504384" Feb 02 10:40:31 crc kubenswrapper[4901]: I0202 10:40:31.388405 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5q92h_19eb421a-49aa-4cde-ae5e-3aba70ee67f4/kube-multus/1.log" Feb 02 10:40:31 crc kubenswrapper[4901]: I0202 10:40:31.676593 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:40:31 crc kubenswrapper[4901]: I0202 10:40:31.676742 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:40:31 crc kubenswrapper[4901]: I0202 10:40:31.676811 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:40:31 crc kubenswrapper[4901]: I0202 10:40:31.676902 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:40:31 crc kubenswrapper[4901]: E0202 10:40:31.676929 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:40:31 crc kubenswrapper[4901]: E0202 10:40:31.676964 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:40:31 crc kubenswrapper[4901]: E0202 10:40:31.677042 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:40:31 crc kubenswrapper[4901]: E0202 10:40:31.677171 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:40:31 crc kubenswrapper[4901]: I0202 10:40:31.678186 4901 scope.go:117] "RemoveContainer" containerID="53303f4cfb222a1eb4b36c8d625a1470c2e40b5b064700fa8e0c7660666678c8" Feb 02 10:40:31 crc kubenswrapper[4901]: E0202 10:40:31.678398 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vm8h5_openshift-ovn-kubernetes(a3390481-846a-4742-9eae-0796b667897f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" podUID="a3390481-846a-4742-9eae-0796b667897f" Feb 02 10:40:33 crc kubenswrapper[4901]: E0202 10:40:33.657980 4901 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 02 10:40:33 crc kubenswrapper[4901]: I0202 10:40:33.676643 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:40:33 crc kubenswrapper[4901]: I0202 10:40:33.676722 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:40:33 crc kubenswrapper[4901]: E0202 10:40:33.679464 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:40:33 crc kubenswrapper[4901]: I0202 10:40:33.679627 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:40:33 crc kubenswrapper[4901]: I0202 10:40:33.679645 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:40:33 crc kubenswrapper[4901]: E0202 10:40:33.679830 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:40:33 crc kubenswrapper[4901]: E0202 10:40:33.680718 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:40:33 crc kubenswrapper[4901]: E0202 10:40:33.680803 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:40:33 crc kubenswrapper[4901]: E0202 10:40:33.814460 4901 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 02 10:40:35 crc kubenswrapper[4901]: I0202 10:40:35.676305 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:40:35 crc kubenswrapper[4901]: I0202 10:40:35.676350 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:40:35 crc kubenswrapper[4901]: I0202 10:40:35.676349 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:40:35 crc kubenswrapper[4901]: I0202 10:40:35.676322 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:40:35 crc kubenswrapper[4901]: E0202 10:40:35.676548 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:40:35 crc kubenswrapper[4901]: E0202 10:40:35.676918 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:40:35 crc kubenswrapper[4901]: E0202 10:40:35.677068 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:40:35 crc kubenswrapper[4901]: E0202 10:40:35.677224 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:40:37 crc kubenswrapper[4901]: I0202 10:40:37.676220 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:40:37 crc kubenswrapper[4901]: I0202 10:40:37.676356 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:40:37 crc kubenswrapper[4901]: I0202 10:40:37.676357 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:40:37 crc kubenswrapper[4901]: E0202 10:40:37.676429 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:40:37 crc kubenswrapper[4901]: E0202 10:40:37.676657 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:40:37 crc kubenswrapper[4901]: I0202 10:40:37.676778 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:40:37 crc kubenswrapper[4901]: E0202 10:40:37.676829 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:40:37 crc kubenswrapper[4901]: E0202 10:40:37.677011 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:40:38 crc kubenswrapper[4901]: E0202 10:40:38.815713 4901 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 02 10:40:39 crc kubenswrapper[4901]: I0202 10:40:39.676650 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:40:39 crc kubenswrapper[4901]: E0202 10:40:39.676814 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:40:39 crc kubenswrapper[4901]: I0202 10:40:39.676856 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:40:39 crc kubenswrapper[4901]: I0202 10:40:39.676897 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:40:39 crc kubenswrapper[4901]: E0202 10:40:39.676942 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:40:39 crc kubenswrapper[4901]: I0202 10:40:39.676650 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:40:39 crc kubenswrapper[4901]: E0202 10:40:39.677106 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:40:39 crc kubenswrapper[4901]: E0202 10:40:39.677347 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:40:41 crc kubenswrapper[4901]: I0202 10:40:41.675968 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:40:41 crc kubenswrapper[4901]: I0202 10:40:41.676074 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:40:41 crc kubenswrapper[4901]: I0202 10:40:41.676155 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:40:41 crc kubenswrapper[4901]: I0202 10:40:41.676025 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:40:41 crc kubenswrapper[4901]: E0202 10:40:41.676265 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:40:41 crc kubenswrapper[4901]: E0202 10:40:41.676365 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:40:41 crc kubenswrapper[4901]: E0202 10:40:41.676530 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:40:41 crc kubenswrapper[4901]: E0202 10:40:41.676688 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:40:43 crc kubenswrapper[4901]: I0202 10:40:43.676537 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:40:43 crc kubenswrapper[4901]: I0202 10:40:43.680435 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:40:43 crc kubenswrapper[4901]: I0202 10:40:43.680432 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:40:43 crc kubenswrapper[4901]: I0202 10:40:43.680525 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:40:43 crc kubenswrapper[4901]: E0202 10:40:43.680493 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:40:43 crc kubenswrapper[4901]: E0202 10:40:43.681339 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:40:43 crc kubenswrapper[4901]: E0202 10:40:43.681445 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:40:43 crc kubenswrapper[4901]: E0202 10:40:43.681551 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:40:43 crc kubenswrapper[4901]: E0202 10:40:43.816195 4901 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 02 10:40:45 crc kubenswrapper[4901]: I0202 10:40:45.676214 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:40:45 crc kubenswrapper[4901]: I0202 10:40:45.676320 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:40:45 crc kubenswrapper[4901]: I0202 10:40:45.676472 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:40:45 crc kubenswrapper[4901]: I0202 10:40:45.676518 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:40:45 crc kubenswrapper[4901]: E0202 10:40:45.676465 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:40:45 crc kubenswrapper[4901]: E0202 10:40:45.676834 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:40:45 crc kubenswrapper[4901]: E0202 10:40:45.677091 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:40:45 crc kubenswrapper[4901]: I0202 10:40:45.677516 4901 scope.go:117] "RemoveContainer" containerID="15d92feb87ef4644f20d56395e4ec742bb94c251371c8198e0d7257c3d21a68b" Feb 02 10:40:45 crc kubenswrapper[4901]: E0202 10:40:45.682112 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:40:45 crc kubenswrapper[4901]: I0202 10:40:45.682297 4901 scope.go:117] "RemoveContainer" containerID="53303f4cfb222a1eb4b36c8d625a1470c2e40b5b064700fa8e0c7660666678c8" Feb 02 10:40:46 crc kubenswrapper[4901]: I0202 10:40:46.447636 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vm8h5_a3390481-846a-4742-9eae-0796b667897f/ovnkube-controller/3.log" Feb 02 10:40:46 crc kubenswrapper[4901]: I0202 10:40:46.451025 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" event={"ID":"a3390481-846a-4742-9eae-0796b667897f","Type":"ContainerStarted","Data":"853d62254c62e5ca6a20572e291db7725dbe50919ef679111449fb8a40355619"} Feb 02 10:40:46 crc kubenswrapper[4901]: I0202 10:40:46.451464 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:40:46 crc kubenswrapper[4901]: I0202 10:40:46.453076 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5q92h_19eb421a-49aa-4cde-ae5e-3aba70ee67f4/kube-multus/1.log" Feb 02 10:40:46 crc kubenswrapper[4901]: I0202 10:40:46.453127 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5q92h" event={"ID":"19eb421a-49aa-4cde-ae5e-3aba70ee67f4","Type":"ContainerStarted","Data":"6717e66ee49c9fe7f861650758568fc05bf46f663523f267fbfe55430970f177"} Feb 02 10:40:46 crc kubenswrapper[4901]: I0202 10:40:46.485191 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" podStartSLOduration=111.485172791 podStartE2EDuration="1m51.485172791s" podCreationTimestamp="2026-02-02 10:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:40:46.484273849 +0000 UTC m=+133.502613945" watchObservedRunningTime="2026-02-02 10:40:46.485172791 +0000 UTC m=+133.503512877" Feb 02 10:40:46 crc kubenswrapper[4901]: I0202 10:40:46.559845 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fmjwg"] Feb 02 10:40:46 crc kubenswrapper[4901]: I0202 10:40:46.560028 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:40:46 crc kubenswrapper[4901]: E0202 10:40:46.560156 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:40:47 crc kubenswrapper[4901]: I0202 10:40:47.675934 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:40:47 crc kubenswrapper[4901]: I0202 10:40:47.675991 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:40:47 crc kubenswrapper[4901]: I0202 10:40:47.675930 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:40:47 crc kubenswrapper[4901]: E0202 10:40:47.676114 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:40:47 crc kubenswrapper[4901]: I0202 10:40:47.675934 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:40:47 crc kubenswrapper[4901]: E0202 10:40:47.676378 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:40:47 crc kubenswrapper[4901]: E0202 10:40:47.676454 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fmjwg" podUID="b96d903e-a64c-4321-8963-482d4b579e30" Feb 02 10:40:47 crc kubenswrapper[4901]: E0202 10:40:47.676409 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:40:49 crc kubenswrapper[4901]: I0202 10:40:49.675943 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:40:49 crc kubenswrapper[4901]: I0202 10:40:49.676033 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:40:49 crc kubenswrapper[4901]: I0202 10:40:49.676036 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:40:49 crc kubenswrapper[4901]: I0202 10:40:49.676231 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:40:49 crc kubenswrapper[4901]: I0202 10:40:49.678786 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 02 10:40:49 crc kubenswrapper[4901]: I0202 10:40:49.679448 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 02 10:40:49 crc kubenswrapper[4901]: I0202 10:40:49.679539 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 02 10:40:49 crc kubenswrapper[4901]: I0202 10:40:49.679618 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 02 10:40:49 crc kubenswrapper[4901]: I0202 10:40:49.681152 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 02 10:40:49 crc kubenswrapper[4901]: I0202 10:40:49.683405 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.869272 4901 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.928183 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-b6s7r"] Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.929036 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-f28zg"] Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.929759 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-9hvh2"] Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.930801 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-9hvh2" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.931711 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-b6s7r" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.932535 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-f28zg" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.935281 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.938633 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.939054 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.939065 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.939347 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.939925 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-qsh9f"] Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.945128 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpwqj"] Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.948522 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpwqj" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.948866 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qsh9f" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.966225 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/89727a9a-3041-4169-b3b1-0d2840c585ff-node-pullsecrets\") pod \"apiserver-76f77b778f-9hvh2\" (UID: \"89727a9a-3041-4169-b3b1-0d2840c585ff\") " pod="openshift-apiserver/apiserver-76f77b778f-9hvh2" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.966295 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jxfq\" (UniqueName: \"kubernetes.io/projected/89727a9a-3041-4169-b3b1-0d2840c585ff-kube-api-access-5jxfq\") pod \"apiserver-76f77b778f-9hvh2\" (UID: \"89727a9a-3041-4169-b3b1-0d2840c585ff\") " pod="openshift-apiserver/apiserver-76f77b778f-9hvh2" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.966343 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89727a9a-3041-4169-b3b1-0d2840c585ff-config\") pod \"apiserver-76f77b778f-9hvh2\" (UID: \"89727a9a-3041-4169-b3b1-0d2840c585ff\") " pod="openshift-apiserver/apiserver-76f77b778f-9hvh2" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.966374 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fe1fd46-e3a8-4729-9528-24f38fe69252-config\") pod \"machine-api-operator-5694c8668f-f28zg\" (UID: \"3fe1fd46-e3a8-4729-9528-24f38fe69252\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f28zg" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.966399 4901 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89727a9a-3041-4169-b3b1-0d2840c585ff-trusted-ca-bundle\") pod \"apiserver-76f77b778f-9hvh2\" (UID: \"89727a9a-3041-4169-b3b1-0d2840c585ff\") " pod="openshift-apiserver/apiserver-76f77b778f-9hvh2" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.966433 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3fe1fd46-e3a8-4729-9528-24f38fe69252-images\") pod \"machine-api-operator-5694c8668f-f28zg\" (UID: \"3fe1fd46-e3a8-4729-9528-24f38fe69252\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f28zg" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.966455 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89727a9a-3041-4169-b3b1-0d2840c585ff-serving-cert\") pod \"apiserver-76f77b778f-9hvh2\" (UID: \"89727a9a-3041-4169-b3b1-0d2840c585ff\") " pod="openshift-apiserver/apiserver-76f77b778f-9hvh2" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.966483 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt7vc\" (UniqueName: \"kubernetes.io/projected/d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc-kube-api-access-bt7vc\") pod \"controller-manager-879f6c89f-b6s7r\" (UID: \"d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b6s7r" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.966508 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc-client-ca\") pod \"controller-manager-879f6c89f-b6s7r\" (UID: \"d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b6s7r" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.966595 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/89727a9a-3041-4169-b3b1-0d2840c585ff-audit\") pod \"apiserver-76f77b778f-9hvh2\" (UID: \"89727a9a-3041-4169-b3b1-0d2840c585ff\") " pod="openshift-apiserver/apiserver-76f77b778f-9hvh2" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.966646 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3fe1fd46-e3a8-4729-9528-24f38fe69252-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-f28zg\" (UID: \"3fe1fd46-e3a8-4729-9528-24f38fe69252\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f28zg" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.966683 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/89727a9a-3041-4169-b3b1-0d2840c585ff-encryption-config\") pod \"apiserver-76f77b778f-9hvh2\" (UID: \"89727a9a-3041-4169-b3b1-0d2840c585ff\") " pod="openshift-apiserver/apiserver-76f77b778f-9hvh2" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.966713 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc-config\") pod \"controller-manager-879f6c89f-b6s7r\" (UID: \"d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b6s7r" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.966735 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc-serving-cert\") pod \"controller-manager-879f6c89f-b6s7r\" (UID: \"d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b6s7r" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.966780 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/89727a9a-3041-4169-b3b1-0d2840c585ff-audit-dir\") pod \"apiserver-76f77b778f-9hvh2\" (UID: \"89727a9a-3041-4169-b3b1-0d2840c585ff\") " pod="openshift-apiserver/apiserver-76f77b778f-9hvh2" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.966822 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/89727a9a-3041-4169-b3b1-0d2840c585ff-image-import-ca\") pod \"apiserver-76f77b778f-9hvh2\" (UID: \"89727a9a-3041-4169-b3b1-0d2840c585ff\") " pod="openshift-apiserver/apiserver-76f77b778f-9hvh2" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.966849 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-b6s7r\" (UID: \"d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b6s7r" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.966876 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/89727a9a-3041-4169-b3b1-0d2840c585ff-etcd-client\") pod \"apiserver-76f77b778f-9hvh2\" (UID: \"89727a9a-3041-4169-b3b1-0d2840c585ff\") " pod="openshift-apiserver/apiserver-76f77b778f-9hvh2" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.966908 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qz6m\" (UniqueName: \"kubernetes.io/projected/3fe1fd46-e3a8-4729-9528-24f38fe69252-kube-api-access-7qz6m\") pod \"machine-api-operator-5694c8668f-f28zg\" (UID: \"3fe1fd46-e3a8-4729-9528-24f38fe69252\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f28zg" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.966943 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/89727a9a-3041-4169-b3b1-0d2840c585ff-etcd-serving-ca\") pod \"apiserver-76f77b778f-9hvh2\" (UID: \"89727a9a-3041-4169-b3b1-0d2840c585ff\") " pod="openshift-apiserver/apiserver-76f77b778f-9hvh2" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.968365 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.968635 4901 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-authentication-operator/authentication-operator-69f744f599-jc78c"] Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.969016 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.969105 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.969251 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.969289 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-jc78c" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.969704 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.969772 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.969987 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.970062 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.970446 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wl2tq"] Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.970718 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.970887 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.971005 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.971747 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-wl2tq" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.974614 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2pdrw"] Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.975333 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s4wkl"] Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.975656 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tsxn7"] Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.976238 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.976402 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2pdrw" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.976741 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s4wkl" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.980925 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.982084 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.982334 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.982532 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.982683 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.984015 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.984347 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.984470 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.984580 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.984747 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.984836 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.985138 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.985471 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.985576 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.985670 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.985750 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.985848 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.986888 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.987024 4901 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.987923 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.988157 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.988254 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.988445 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.988684 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.989859 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.990035 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.990075 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.990251 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.988895 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-8sgkj"] Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.990450 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.990673 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.990817 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.990939 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-wb2m4"] Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.990971 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.991174 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.991315 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-wb2m4" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.991395 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8sgkj" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.994314 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.994839 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.994858 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.997211 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-w875r"] Feb 02 10:40:52 crc kubenswrapper[4901]: I0202 10:40:52.999088 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-w875r" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.005048 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqqtr"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.005690 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zmx6r"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.006092 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zmx6r" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.006524 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqqtr" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.006928 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nv578"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.007759 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nv578" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.010081 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.015373 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.016014 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.024110 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xvfbq"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.025923 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.026280 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.027239 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.027548 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.027612 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.028196 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.028734 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.029050 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.030333 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.029232 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.029292 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.029418 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.031712 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.029452 4901 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.032769 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-gns9g"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.033112 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.033267 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.033397 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.033398 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.033533 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.033590 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.033622 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.043532 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.043788 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.044066 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.044701 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.047521 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.048314 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-pj9q5"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.048675 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jc78c"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.048697 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tsxn7"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.048764 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-pj9q5" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.048881 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-gns9g" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.055014 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.055881 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-f28zg"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.056132 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.056636 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.056767 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.056890 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.056981 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.058909 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.059160 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.061487 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.062730 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.065874 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wcjmj"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.066385 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wcjmj" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.069208 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-kw755"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.070005 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kw755" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.072843 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2qp9z"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.073218 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.075192 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.075386 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.075461 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.076581 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.077013 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/89727a9a-3041-4169-b3b1-0d2840c585ff-etcd-client\") pod \"apiserver-76f77b778f-9hvh2\" (UID: \"89727a9a-3041-4169-b3b1-0d2840c585ff\") " pod="openshift-apiserver/apiserver-76f77b778f-9hvh2" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.077048 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/658007dc-27de-4c38-b415-eb8eaa96d752-metrics-tls\") pod \"dns-operator-744455d44c-gns9g\" (UID: \"658007dc-27de-4c38-b415-eb8eaa96d752\") " pod="openshift-dns-operator/dns-operator-744455d44c-gns9g" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078032 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/63cddbc9-0581-4c69-8ecb-7ddec3907b21-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2pdrw\" (UID: \"63cddbc9-0581-4c69-8ecb-7ddec3907b21\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2pdrw" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078068 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qz6m\" (UniqueName: \"kubernetes.io/projected/3fe1fd46-e3a8-4729-9528-24f38fe69252-kube-api-access-7qz6m\") pod \"machine-api-operator-5694c8668f-f28zg\" (UID: \"3fe1fd46-e3a8-4729-9528-24f38fe69252\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f28zg" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078096 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/89727a9a-3041-4169-b3b1-0d2840c585ff-etcd-serving-ca\") pod \"apiserver-76f77b778f-9hvh2\" (UID: \"89727a9a-3041-4169-b3b1-0d2840c585ff\") " pod="openshift-apiserver/apiserver-76f77b778f-9hvh2" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078114 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etcd-client\" (UniqueName: \"kubernetes.io/secret/19791fe1-25f3-422d-9585-557e8d5a554c-etcd-client\") pod \"apiserver-7bbb656c7d-qsh9f\" (UID: \"19791fe1-25f3-422d-9585-557e8d5a554c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qsh9f" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078131 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/84b9c922-cd12-451d-b4d2-9dbfcf4d422e-metrics-tls\") pod \"ingress-operator-5b745b69d9-kw755\" (UID: \"84b9c922-cd12-451d-b4d2-9dbfcf4d422e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kw755" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078147 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd8zx\" (UniqueName: \"kubernetes.io/projected/63cddbc9-0581-4c69-8ecb-7ddec3907b21-kube-api-access-fd8zx\") pod \"openshift-config-operator-7777fb866f-2pdrw\" (UID: \"63cddbc9-0581-4c69-8ecb-7ddec3907b21\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2pdrw" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078161 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plnh2\" (UniqueName: \"kubernetes.io/projected/19791fe1-25f3-422d-9585-557e8d5a554c-kube-api-access-plnh2\") pod \"apiserver-7bbb656c7d-qsh9f\" (UID: \"19791fe1-25f3-422d-9585-557e8d5a554c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qsh9f" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078175 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/84b9c922-cd12-451d-b4d2-9dbfcf4d422e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-kw755\" (UID: \"84b9c922-cd12-451d-b4d2-9dbfcf4d422e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kw755" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078191 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b29j\" (UniqueName: \"kubernetes.io/projected/c4c3efa1-9114-4b9b-be8b-045c2c4d7928-kube-api-access-9b29j\") pod \"console-f9d7485db-wb2m4\" (UID: \"c4c3efa1-9114-4b9b-be8b-045c2c4d7928\") " pod="openshift-console/console-f9d7485db-wb2m4" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078209 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2srxq\" (UniqueName: \"kubernetes.io/projected/84b9c922-cd12-451d-b4d2-9dbfcf4d422e-kube-api-access-2srxq\") pod \"ingress-operator-5b745b69d9-kw755\" (UID: \"84b9c922-cd12-451d-b4d2-9dbfcf4d422e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kw755" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078224 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c4c3efa1-9114-4b9b-be8b-045c2c4d7928-service-ca\") pod \"console-f9d7485db-wb2m4\" (UID: \"c4c3efa1-9114-4b9b-be8b-045c2c4d7928\") " pod="openshift-console/console-f9d7485db-wb2m4" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078238 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c4c3efa1-9114-4b9b-be8b-045c2c4d7928-trusted-ca-bundle\") pod \"console-f9d7485db-wb2m4\" (UID: \"c4c3efa1-9114-4b9b-be8b-045c2c4d7928\") " pod="openshift-console/console-f9d7485db-wb2m4" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078256 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/89727a9a-3041-4169-b3b1-0d2840c585ff-node-pullsecrets\") pod \"apiserver-76f77b778f-9hvh2\" (UID: \"89727a9a-3041-4169-b3b1-0d2840c585ff\") " pod="openshift-apiserver/apiserver-76f77b778f-9hvh2" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078273 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jxfq\" (UniqueName: \"kubernetes.io/projected/89727a9a-3041-4169-b3b1-0d2840c585ff-kube-api-access-5jxfq\") pod \"apiserver-76f77b778f-9hvh2\" (UID: \"89727a9a-3041-4169-b3b1-0d2840c585ff\") " pod="openshift-apiserver/apiserver-76f77b778f-9hvh2" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078291 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz284\" (UniqueName: \"kubernetes.io/projected/bfc9f0c4-e3f4-4d41-8304-1bcbbb6b67e3-kube-api-access-wz284\") pod \"downloads-7954f5f757-w875r\" (UID: \"bfc9f0c4-e3f4-4d41-8304-1bcbbb6b67e3\") " pod="openshift-console/downloads-7954f5f757-w875r" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078310 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c4c3efa1-9114-4b9b-be8b-045c2c4d7928-oauth-serving-cert\") pod \"console-f9d7485db-wb2m4\" (UID: \"c4c3efa1-9114-4b9b-be8b-045c2c4d7928\") " pod="openshift-console/console-f9d7485db-wb2m4" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078329 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8rwp\" (UniqueName: \"kubernetes.io/projected/a8a59035-5d91-4d4f-970a-68bd142370dc-kube-api-access-l8rwp\") pod \"console-operator-58897d9998-wl2tq\" (UID: \"a8a59035-5d91-4d4f-970a-68bd142370dc\") " pod="openshift-console-operator/console-operator-58897d9998-wl2tq" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078343 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c4c3efa1-9114-4b9b-be8b-045c2c4d7928-console-oauth-config\") pod \"console-f9d7485db-wb2m4\" (UID: \"c4c3efa1-9114-4b9b-be8b-045c2c4d7928\") " pod="openshift-console/console-f9d7485db-wb2m4" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078361 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89727a9a-3041-4169-b3b1-0d2840c585ff-config\") pod \"apiserver-76f77b778f-9hvh2\" (UID: \"89727a9a-3041-4169-b3b1-0d2840c585ff\") " pod="openshift-apiserver/apiserver-76f77b778f-9hvh2" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078376 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f6a4a849-2a99-4268-87ab-9fdcb9a7055c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wcjmj\" (UID: \"f6a4a849-2a99-4268-87ab-9fdcb9a7055c\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wcjmj" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078393 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9077180d-c1ce-41d6-9569-a26bc79cce6c-config\") pod \"authentication-operator-69f744f599-jc78c\" (UID: \"9077180d-c1ce-41d6-9569-a26bc79cce6c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jc78c" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078409 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fe1fd46-e3a8-4729-9528-24f38fe69252-config\") pod \"machine-api-operator-5694c8668f-f28zg\" (UID: \"3fe1fd46-e3a8-4729-9528-24f38fe69252\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f28zg" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078423 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/84b9c922-cd12-451d-b4d2-9dbfcf4d422e-trusted-ca\") pod \"ingress-operator-5b745b69d9-kw755\" (UID: \"84b9c922-cd12-451d-b4d2-9dbfcf4d422e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kw755" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078438 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89727a9a-3041-4169-b3b1-0d2840c585ff-trusted-ca-bundle\") pod \"apiserver-76f77b778f-9hvh2\" (UID: \"89727a9a-3041-4169-b3b1-0d2840c585ff\") " pod="openshift-apiserver/apiserver-76f77b778f-9hvh2" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078456 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3fe1fd46-e3a8-4729-9528-24f38fe69252-images\") pod \"machine-api-operator-5694c8668f-f28zg\" (UID: \"3fe1fd46-e3a8-4729-9528-24f38fe69252\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f28zg" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078471 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89727a9a-3041-4169-b3b1-0d2840c585ff-serving-cert\") pod \"apiserver-76f77b778f-9hvh2\" (UID: \"89727a9a-3041-4169-b3b1-0d2840c585ff\") " pod="openshift-apiserver/apiserver-76f77b778f-9hvh2" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078486 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/19791fe1-25f3-422d-9585-557e8d5a554c-audit-dir\") pod \"apiserver-7bbb656c7d-qsh9f\" (UID: \"19791fe1-25f3-422d-9585-557e8d5a554c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qsh9f" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078502 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/19791fe1-25f3-422d-9585-557e8d5a554c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-qsh9f\" (UID: \"19791fe1-25f3-422d-9585-557e8d5a554c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qsh9f" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078520 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9077180d-c1ce-41d6-9569-a26bc79cce6c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jc78c\" (UID: \"9077180d-c1ce-41d6-9569-a26bc79cce6c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jc78c" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078555 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6a4a849-2a99-4268-87ab-9fdcb9a7055c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wcjmj\" (UID: \"f6a4a849-2a99-4268-87ab-9fdcb9a7055c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wcjmj" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078600 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/19791fe1-25f3-422d-9585-557e8d5a554c-encryption-config\") pod \"apiserver-7bbb656c7d-qsh9f\" (UID: \"19791fe1-25f3-422d-9585-557e8d5a554c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qsh9f" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078617 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt7vc\" (UniqueName: \"kubernetes.io/projected/d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc-kube-api-access-bt7vc\") pod \"controller-manager-879f6c89f-b6s7r\" (UID: \"d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b6s7r" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078636 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgnwj\" (UniqueName: \"kubernetes.io/projected/355febd0-19d6-472c-96a3-9f4a3eaa3bc5-kube-api-access-cgnwj\") pod \"openshift-controller-manager-operator-756b6f6bc6-jqqtr\" (UID: \"355febd0-19d6-472c-96a3-9f4a3eaa3bc5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqqtr" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078653 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqnf6\" (UniqueName: \"kubernetes.io/projected/658007dc-27de-4c38-b415-eb8eaa96d752-kube-api-access-sqnf6\") pod \"dns-operator-744455d44c-gns9g\" (UID: \"658007dc-27de-4c38-b415-eb8eaa96d752\") " pod="openshift-dns-operator/dns-operator-744455d44c-gns9g" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078669 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8a59035-5d91-4d4f-970a-68bd142370dc-serving-cert\") pod \"console-operator-58897d9998-wl2tq\" (UID: \"a8a59035-5d91-4d4f-970a-68bd142370dc\") " pod="openshift-console-operator/console-operator-58897d9998-wl2tq" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078683 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c4c3efa1-9114-4b9b-be8b-045c2c4d7928-console-serving-cert\") pod \"console-f9d7485db-wb2m4\" (UID: \"c4c3efa1-9114-4b9b-be8b-045c2c4d7928\") " pod="openshift-console/console-f9d7485db-wb2m4" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078709 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc-client-ca\") pod \"controller-manager-879f6c89f-b6s7r\" (UID: \"d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b6s7r" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078725 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jk84\" (UniqueName: \"kubernetes.io/projected/9077180d-c1ce-41d6-9569-a26bc79cce6c-kube-api-access-5jk84\") pod \"authentication-operator-69f744f599-jc78c\" (UID: \"9077180d-c1ce-41d6-9569-a26bc79cce6c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jc78c" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078742 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/89727a9a-3041-4169-b3b1-0d2840c585ff-audit\") pod \"apiserver-76f77b778f-9hvh2\" (UID: \"89727a9a-3041-4169-b3b1-0d2840c585ff\") " pod="openshift-apiserver/apiserver-76f77b778f-9hvh2" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078768 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3fe1fd46-e3a8-4729-9528-24f38fe69252-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-f28zg\" (UID: \"3fe1fd46-e3a8-4729-9528-24f38fe69252\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f28zg" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078783 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/89727a9a-3041-4169-b3b1-0d2840c585ff-encryption-config\") pod \"apiserver-76f77b778f-9hvh2\" (UID: \"89727a9a-3041-4169-b3b1-0d2840c585ff\") " pod="openshift-apiserver/apiserver-76f77b778f-9hvh2" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078800 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/19791fe1-25f3-422d-9585-557e8d5a554c-audit-policies\") pod \"apiserver-7bbb656c7d-qsh9f\" (UID: \"19791fe1-25f3-422d-9585-557e8d5a554c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qsh9f" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078816 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/355febd0-19d6-472c-96a3-9f4a3eaa3bc5-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jqqtr\" (UID: \"355febd0-19d6-472c-96a3-9f4a3eaa3bc5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqqtr" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078832 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc-config\") pod \"controller-manager-879f6c89f-b6s7r\" (UID: \"d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b6s7r" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078848 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc-serving-cert\") pod \"controller-manager-879f6c89f-b6s7r\" (UID: 
\"d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b6s7r" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078864 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63cddbc9-0581-4c69-8ecb-7ddec3907b21-serving-cert\") pod \"openshift-config-operator-7777fb866f-2pdrw\" (UID: \"63cddbc9-0581-4c69-8ecb-7ddec3907b21\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2pdrw" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078879 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19791fe1-25f3-422d-9585-557e8d5a554c-serving-cert\") pod \"apiserver-7bbb656c7d-qsh9f\" (UID: \"19791fe1-25f3-422d-9585-557e8d5a554c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qsh9f" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078894 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9077180d-c1ce-41d6-9569-a26bc79cce6c-serving-cert\") pod \"authentication-operator-69f744f599-jc78c\" (UID: \"9077180d-c1ce-41d6-9569-a26bc79cce6c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jc78c" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078917 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8a59035-5d91-4d4f-970a-68bd142370dc-config\") pod \"console-operator-58897d9998-wl2tq\" (UID: \"a8a59035-5d91-4d4f-970a-68bd142370dc\") " pod="openshift-console-operator/console-operator-58897d9998-wl2tq" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078935 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/89727a9a-3041-4169-b3b1-0d2840c585ff-audit-dir\") pod \"apiserver-76f77b778f-9hvh2\" (UID: \"89727a9a-3041-4169-b3b1-0d2840c585ff\") " pod="openshift-apiserver/apiserver-76f77b778f-9hvh2" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078949 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9077180d-c1ce-41d6-9569-a26bc79cce6c-service-ca-bundle\") pod \"authentication-operator-69f744f599-jc78c\" (UID: \"9077180d-c1ce-41d6-9569-a26bc79cce6c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jc78c" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078965 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c4c3efa1-9114-4b9b-be8b-045c2c4d7928-console-config\") pod \"console-f9d7485db-wb2m4\" (UID: \"c4c3efa1-9114-4b9b-be8b-045c2c4d7928\") " pod="openshift-console/console-f9d7485db-wb2m4" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.078983 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6a4a849-2a99-4268-87ab-9fdcb9a7055c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wcjmj\" (UID: \"f6a4a849-2a99-4268-87ab-9fdcb9a7055c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wcjmj" 
Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.079005 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/355febd0-19d6-472c-96a3-9f4a3eaa3bc5-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-jqqtr\" (UID: \"355febd0-19d6-472c-96a3-9f4a3eaa3bc5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqqtr" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.079019 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19791fe1-25f3-422d-9585-557e8d5a554c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-qsh9f\" (UID: \"19791fe1-25f3-422d-9585-557e8d5a554c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qsh9f" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.079036 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/89727a9a-3041-4169-b3b1-0d2840c585ff-image-import-ca\") pod \"apiserver-76f77b778f-9hvh2\" (UID: \"89727a9a-3041-4169-b3b1-0d2840c585ff\") " pod="openshift-apiserver/apiserver-76f77b778f-9hvh2" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.079050 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-b6s7r\" (UID: \"d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b6s7r" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.079065 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a8a59035-5d91-4d4f-970a-68bd142370dc-trusted-ca\") pod \"console-operator-58897d9998-wl2tq\" (UID: \"a8a59035-5d91-4d4f-970a-68bd142370dc\") " pod="openshift-console-operator/console-operator-58897d9998-wl2tq" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.080267 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/89727a9a-3041-4169-b3b1-0d2840c585ff-node-pullsecrets\") pod \"apiserver-76f77b778f-9hvh2\" (UID: \"89727a9a-3041-4169-b3b1-0d2840c585ff\") " pod="openshift-apiserver/apiserver-76f77b778f-9hvh2" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.081100 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89727a9a-3041-4169-b3b1-0d2840c585ff-config\") pod \"apiserver-76f77b778f-9hvh2\" (UID: \"89727a9a-3041-4169-b3b1-0d2840c585ff\") " pod="openshift-apiserver/apiserver-76f77b778f-9hvh2" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.081904 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fe1fd46-e3a8-4729-9528-24f38fe69252-config\") pod \"machine-api-operator-5694c8668f-f28zg\" (UID: \"3fe1fd46-e3a8-4729-9528-24f38fe69252\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f28zg" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.082485 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqqtr"] Feb 02 10:40:53 crc 
kubenswrapper[4901]: I0202 10:40:53.082517 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wl2tq"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.082529 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-krwxc"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.083073 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-krwxc" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.085215 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2qp9z" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.085401 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.085537 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc-config\") pod \"controller-manager-879f6c89f-b6s7r\" (UID: \"d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b6s7r" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.085808 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.086037 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.086199 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.086489 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc-client-ca\") pod \"controller-manager-879f6c89f-b6s7r\" (UID: \"d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b6s7r" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.086716 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3fe1fd46-e3a8-4729-9528-24f38fe69252-images\") pod \"machine-api-operator-5694c8668f-f28zg\" (UID: \"3fe1fd46-e3a8-4729-9528-24f38fe69252\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f28zg" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.087186 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/89727a9a-3041-4169-b3b1-0d2840c585ff-audit\") pod \"apiserver-76f77b778f-9hvh2\" (UID: \"89727a9a-3041-4169-b3b1-0d2840c585ff\") " pod="openshift-apiserver/apiserver-76f77b778f-9hvh2" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.087519 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/89727a9a-3041-4169-b3b1-0d2840c585ff-audit-dir\") pod \"apiserver-76f77b778f-9hvh2\" (UID: \"89727a9a-3041-4169-b3b1-0d2840c585ff\") " 
pod="openshift-apiserver/apiserver-76f77b778f-9hvh2" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.088348 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/89727a9a-3041-4169-b3b1-0d2840c585ff-etcd-serving-ca\") pod \"apiserver-76f77b778f-9hvh2\" (UID: \"89727a9a-3041-4169-b3b1-0d2840c585ff\") " pod="openshift-apiserver/apiserver-76f77b778f-9hvh2" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.088973 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.090716 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89727a9a-3041-4169-b3b1-0d2840c585ff-trusted-ca-bundle\") pod \"apiserver-76f77b778f-9hvh2\" (UID: \"89727a9a-3041-4169-b3b1-0d2840c585ff\") " pod="openshift-apiserver/apiserver-76f77b778f-9hvh2" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.092696 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/89727a9a-3041-4169-b3b1-0d2840c585ff-image-import-ca\") pod \"apiserver-76f77b778f-9hvh2\" (UID: \"89727a9a-3041-4169-b3b1-0d2840c585ff\") " pod="openshift-apiserver/apiserver-76f77b778f-9hvh2" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.093548 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3fe1fd46-e3a8-4729-9528-24f38fe69252-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-f28zg\" (UID: \"3fe1fd46-e3a8-4729-9528-24f38fe69252\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f28zg" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.097956 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.098416 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.099633 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.120909 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89727a9a-3041-4169-b3b1-0d2840c585ff-serving-cert\") pod \"apiserver-76f77b778f-9hvh2\" (UID: \"89727a9a-3041-4169-b3b1-0d2840c585ff\") " pod="openshift-apiserver/apiserver-76f77b778f-9hvh2" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.125150 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/89727a9a-3041-4169-b3b1-0d2840c585ff-encryption-config\") pod \"apiserver-76f77b778f-9hvh2\" (UID: \"89727a9a-3041-4169-b3b1-0d2840c585ff\") " pod="openshift-apiserver/apiserver-76f77b778f-9hvh2" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.125702 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-85dm9"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.125766 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-b6s7r\" (UID: \"d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b6s7r" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.125974 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc-serving-cert\") pod \"controller-manager-879f6c89f-b6s7r\" (UID: \"d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b6s7r" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.126744 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-85dm9" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.127531 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.128069 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-wb2m4"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.128085 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/89727a9a-3041-4169-b3b1-0d2840c585ff-etcd-client\") pod \"apiserver-76f77b778f-9hvh2\" (UID: \"89727a9a-3041-4169-b3b1-0d2840c585ff\") " pod="openshift-apiserver/apiserver-76f77b778f-9hvh2" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.138779 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.139414 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.139535 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.141706 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-b6s7r"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.142521 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.144657 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-9hvh2"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.147319 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvfzd"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.148017 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvfzd" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.150208 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zt89j"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.150808 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zt89j" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.151615 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wjt2l"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.152233 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wjt2l" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.153131 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-klpkc"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.153781 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.154153 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-klpkc" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.157591 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-z5bfv"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.158216 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-z5bfv" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.162593 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wkpqq"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.163210 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wkpqq" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.165193 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-6sw49"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.166421 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-6sw49" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.166509 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-qsh9f"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.167515 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bmwhx"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.168437 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bmwhx" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.168681 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500470-46wx7"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.169426 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-46wx7" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.169860 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-w875r"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.171369 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5pfqc"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.172015 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5pfqc" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.172423 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-79htn"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.173355 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-79htn" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.173508 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7nhqx"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.174061 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.174287 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-7nhqx" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.175014 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xnv4q"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.175530 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xnv4q" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.177811 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8xlmd"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.178415 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8xlmd" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.180086 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz284\" (UniqueName: \"kubernetes.io/projected/bfc9f0c4-e3f4-4d41-8304-1bcbbb6b67e3-kube-api-access-wz284\") pod \"downloads-7954f5f757-w875r\" (UID: \"bfc9f0c4-e3f4-4d41-8304-1bcbbb6b67e3\") " pod="openshift-console/downloads-7954f5f757-w875r" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.180118 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c4c3efa1-9114-4b9b-be8b-045c2c4d7928-oauth-serving-cert\") pod \"console-f9d7485db-wb2m4\" (UID: \"c4c3efa1-9114-4b9b-be8b-045c2c4d7928\") " pod="openshift-console/console-f9d7485db-wb2m4" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.180138 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c4c3efa1-9114-4b9b-be8b-045c2c4d7928-console-oauth-config\") pod \"console-f9d7485db-wb2m4\" (UID: \"c4c3efa1-9114-4b9b-be8b-045c2c4d7928\") " pod="openshift-console/console-f9d7485db-wb2m4" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.180264 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8rwp\" (UniqueName: \"kubernetes.io/projected/a8a59035-5d91-4d4f-970a-68bd142370dc-kube-api-access-l8rwp\") pod \"console-operator-58897d9998-wl2tq\" (UID: \"a8a59035-5d91-4d4f-970a-68bd142370dc\") " pod="openshift-console-operator/console-operator-58897d9998-wl2tq" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.180286 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9077180d-c1ce-41d6-9569-a26bc79cce6c-config\") pod \"authentication-operator-69f744f599-jc78c\" (UID: \"9077180d-c1ce-41d6-9569-a26bc79cce6c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jc78c" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.180302 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f6a4a849-2a99-4268-87ab-9fdcb9a7055c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wcjmj\" (UID: \"f6a4a849-2a99-4268-87ab-9fdcb9a7055c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wcjmj" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.180341 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/84b9c922-cd12-451d-b4d2-9dbfcf4d422e-trusted-ca\") pod \"ingress-operator-5b745b69d9-kw755\" (UID: \"84b9c922-cd12-451d-b4d2-9dbfcf4d422e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kw755" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.180359 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/19791fe1-25f3-422d-9585-557e8d5a554c-audit-dir\") pod \"apiserver-7bbb656c7d-qsh9f\" (UID: \"19791fe1-25f3-422d-9585-557e8d5a554c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qsh9f" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.180376 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/19791fe1-25f3-422d-9585-557e8d5a554c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-qsh9f\" (UID: \"19791fe1-25f3-422d-9585-557e8d5a554c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qsh9f" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.180393 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9077180d-c1ce-41d6-9569-a26bc79cce6c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jc78c\" (UID: \"9077180d-c1ce-41d6-9569-a26bc79cce6c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jc78c" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.180407 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6a4a849-2a99-4268-87ab-9fdcb9a7055c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wcjmj\" (UID: \"f6a4a849-2a99-4268-87ab-9fdcb9a7055c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wcjmj" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.180424 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/19791fe1-25f3-422d-9585-557e8d5a554c-encryption-config\") pod \"apiserver-7bbb656c7d-qsh9f\" (UID: \"19791fe1-25f3-422d-9585-557e8d5a554c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qsh9f" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.180442 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqnf6\" (UniqueName: \"kubernetes.io/projected/658007dc-27de-4c38-b415-eb8eaa96d752-kube-api-access-sqnf6\") pod \"dns-operator-744455d44c-gns9g\" (UID: \"658007dc-27de-4c38-b415-eb8eaa96d752\") " pod="openshift-dns-operator/dns-operator-744455d44c-gns9g" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.180461 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8a59035-5d91-4d4f-970a-68bd142370dc-serving-cert\") pod \"console-operator-58897d9998-wl2tq\" (UID: \"a8a59035-5d91-4d4f-970a-68bd142370dc\") " pod="openshift-console-operator/console-operator-58897d9998-wl2tq" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.180476 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c4c3efa1-9114-4b9b-be8b-045c2c4d7928-console-serving-cert\") pod \"console-f9d7485db-wb2m4\" (UID: \"c4c3efa1-9114-4b9b-be8b-045c2c4d7928\") " pod="openshift-console/console-f9d7485db-wb2m4" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.180498 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgnwj\" (UniqueName: \"kubernetes.io/projected/355febd0-19d6-472c-96a3-9f4a3eaa3bc5-kube-api-access-cgnwj\") pod \"openshift-controller-manager-operator-756b6f6bc6-jqqtr\" (UID: \"355febd0-19d6-472c-96a3-9f4a3eaa3bc5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqqtr" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.180518 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jk84\" (UniqueName: \"kubernetes.io/projected/9077180d-c1ce-41d6-9569-a26bc79cce6c-kube-api-access-5jk84\") pod 
\"authentication-operator-69f744f599-jc78c\" (UID: \"9077180d-c1ce-41d6-9569-a26bc79cce6c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jc78c" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.180552 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/19791fe1-25f3-422d-9585-557e8d5a554c-audit-policies\") pod \"apiserver-7bbb656c7d-qsh9f\" (UID: \"19791fe1-25f3-422d-9585-557e8d5a554c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qsh9f" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.180586 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/355febd0-19d6-472c-96a3-9f4a3eaa3bc5-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jqqtr\" (UID: \"355febd0-19d6-472c-96a3-9f4a3eaa3bc5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqqtr" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.180604 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63cddbc9-0581-4c69-8ecb-7ddec3907b21-serving-cert\") pod \"openshift-config-operator-7777fb866f-2pdrw\" (UID: \"63cddbc9-0581-4c69-8ecb-7ddec3907b21\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2pdrw" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.180619 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19791fe1-25f3-422d-9585-557e8d5a554c-serving-cert\") pod \"apiserver-7bbb656c7d-qsh9f\" (UID: \"19791fe1-25f3-422d-9585-557e8d5a554c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qsh9f" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.180635 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9077180d-c1ce-41d6-9569-a26bc79cce6c-serving-cert\") pod \"authentication-operator-69f744f599-jc78c\" (UID: \"9077180d-c1ce-41d6-9569-a26bc79cce6c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jc78c" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.180661 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9077180d-c1ce-41d6-9569-a26bc79cce6c-service-ca-bundle\") pod \"authentication-operator-69f744f599-jc78c\" (UID: \"9077180d-c1ce-41d6-9569-a26bc79cce6c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jc78c" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.180677 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8a59035-5d91-4d4f-970a-68bd142370dc-config\") pod \"console-operator-58897d9998-wl2tq\" (UID: \"a8a59035-5d91-4d4f-970a-68bd142370dc\") " pod="openshift-console-operator/console-operator-58897d9998-wl2tq" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.180692 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c4c3efa1-9114-4b9b-be8b-045c2c4d7928-console-config\") pod \"console-f9d7485db-wb2m4\" (UID: \"c4c3efa1-9114-4b9b-be8b-045c2c4d7928\") " pod="openshift-console/console-f9d7485db-wb2m4" Feb 02 10:40:53 crc kubenswrapper[4901]: 
I0202 10:40:53.180710 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6a4a849-2a99-4268-87ab-9fdcb9a7055c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wcjmj\" (UID: \"f6a4a849-2a99-4268-87ab-9fdcb9a7055c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wcjmj" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.180728 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/355febd0-19d6-472c-96a3-9f4a3eaa3bc5-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-jqqtr\" (UID: \"355febd0-19d6-472c-96a3-9f4a3eaa3bc5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqqtr" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.180744 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19791fe1-25f3-422d-9585-557e8d5a554c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-qsh9f\" (UID: \"19791fe1-25f3-422d-9585-557e8d5a554c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qsh9f" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.180766 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a8a59035-5d91-4d4f-970a-68bd142370dc-trusted-ca\") pod \"console-operator-58897d9998-wl2tq\" (UID: \"a8a59035-5d91-4d4f-970a-68bd142370dc\") " pod="openshift-console-operator/console-operator-58897d9998-wl2tq" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.180782 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/658007dc-27de-4c38-b415-eb8eaa96d752-metrics-tls\") pod \"dns-operator-744455d44c-gns9g\" (UID: \"658007dc-27de-4c38-b415-eb8eaa96d752\") " pod="openshift-dns-operator/dns-operator-744455d44c-gns9g" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.180798 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/63cddbc9-0581-4c69-8ecb-7ddec3907b21-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2pdrw\" (UID: \"63cddbc9-0581-4c69-8ecb-7ddec3907b21\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2pdrw" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.180824 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/19791fe1-25f3-422d-9585-557e8d5a554c-etcd-client\") pod \"apiserver-7bbb656c7d-qsh9f\" (UID: \"19791fe1-25f3-422d-9585-557e8d5a554c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qsh9f" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.180840 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/84b9c922-cd12-451d-b4d2-9dbfcf4d422e-metrics-tls\") pod \"ingress-operator-5b745b69d9-kw755\" (UID: \"84b9c922-cd12-451d-b4d2-9dbfcf4d422e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kw755" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.180858 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd8zx\" (UniqueName: 
\"kubernetes.io/projected/63cddbc9-0581-4c69-8ecb-7ddec3907b21-kube-api-access-fd8zx\") pod \"openshift-config-operator-7777fb866f-2pdrw\" (UID: \"63cddbc9-0581-4c69-8ecb-7ddec3907b21\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2pdrw" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.180874 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plnh2\" (UniqueName: \"kubernetes.io/projected/19791fe1-25f3-422d-9585-557e8d5a554c-kube-api-access-plnh2\") pod \"apiserver-7bbb656c7d-qsh9f\" (UID: \"19791fe1-25f3-422d-9585-557e8d5a554c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qsh9f" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.180891 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/84b9c922-cd12-451d-b4d2-9dbfcf4d422e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-kw755\" (UID: \"84b9c922-cd12-451d-b4d2-9dbfcf4d422e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kw755" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.180907 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9b29j\" (UniqueName: \"kubernetes.io/projected/c4c3efa1-9114-4b9b-be8b-045c2c4d7928-kube-api-access-9b29j\") pod \"console-f9d7485db-wb2m4\" (UID: \"c4c3efa1-9114-4b9b-be8b-045c2c4d7928\") " pod="openshift-console/console-f9d7485db-wb2m4" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.180922 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2srxq\" (UniqueName: \"kubernetes.io/projected/84b9c922-cd12-451d-b4d2-9dbfcf4d422e-kube-api-access-2srxq\") pod \"ingress-operator-5b745b69d9-kw755\" (UID: \"84b9c922-cd12-451d-b4d2-9dbfcf4d422e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kw755" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.180936 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c4c3efa1-9114-4b9b-be8b-045c2c4d7928-service-ca\") pod \"console-f9d7485db-wb2m4\" (UID: \"c4c3efa1-9114-4b9b-be8b-045c2c4d7928\") " pod="openshift-console/console-f9d7485db-wb2m4" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.180950 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4c3efa1-9114-4b9b-be8b-045c2c4d7928-trusted-ca-bundle\") pod \"console-f9d7485db-wb2m4\" (UID: \"c4c3efa1-9114-4b9b-be8b-045c2c4d7928\") " pod="openshift-console/console-f9d7485db-wb2m4" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.182131 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c4c3efa1-9114-4b9b-be8b-045c2c4d7928-oauth-serving-cert\") pod \"console-f9d7485db-wb2m4\" (UID: \"c4c3efa1-9114-4b9b-be8b-045c2c4d7928\") " pod="openshift-console/console-f9d7485db-wb2m4" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.182152 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/63cddbc9-0581-4c69-8ecb-7ddec3907b21-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2pdrw\" (UID: \"63cddbc9-0581-4c69-8ecb-7ddec3907b21\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2pdrw" Feb 02 10:40:53 crc 
kubenswrapper[4901]: I0202 10:40:53.183845 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8a59035-5d91-4d4f-970a-68bd142370dc-config\") pod \"console-operator-58897d9998-wl2tq\" (UID: \"a8a59035-5d91-4d4f-970a-68bd142370dc\") " pod="openshift-console-operator/console-operator-58897d9998-wl2tq" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.184170 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/19791fe1-25f3-422d-9585-557e8d5a554c-audit-dir\") pod \"apiserver-7bbb656c7d-qsh9f\" (UID: \"19791fe1-25f3-422d-9585-557e8d5a554c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qsh9f" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.184584 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9077180d-c1ce-41d6-9569-a26bc79cce6c-service-ca-bundle\") pod \"authentication-operator-69f744f599-jc78c\" (UID: \"9077180d-c1ce-41d6-9569-a26bc79cce6c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jc78c" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.184837 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/19791fe1-25f3-422d-9585-557e8d5a554c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-qsh9f\" (UID: \"19791fe1-25f3-422d-9585-557e8d5a554c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qsh9f" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.185273 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9077180d-c1ce-41d6-9569-a26bc79cce6c-config\") pod \"authentication-operator-69f744f599-jc78c\" (UID: \"9077180d-c1ce-41d6-9569-a26bc79cce6c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jc78c" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.185824 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c4c3efa1-9114-4b9b-be8b-045c2c4d7928-service-ca\") pod \"console-f9d7485db-wb2m4\" (UID: \"c4c3efa1-9114-4b9b-be8b-045c2c4d7928\") " pod="openshift-console/console-f9d7485db-wb2m4" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.185913 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19791fe1-25f3-422d-9585-557e8d5a554c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-qsh9f\" (UID: \"19791fe1-25f3-422d-9585-557e8d5a554c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qsh9f" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.186128 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4c3efa1-9114-4b9b-be8b-045c2c4d7928-trusted-ca-bundle\") pod \"console-f9d7485db-wb2m4\" (UID: \"c4c3efa1-9114-4b9b-be8b-045c2c4d7928\") " pod="openshift-console/console-f9d7485db-wb2m4" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.186352 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c4c3efa1-9114-4b9b-be8b-045c2c4d7928-console-serving-cert\") pod \"console-f9d7485db-wb2m4\" (UID: \"c4c3efa1-9114-4b9b-be8b-045c2c4d7928\") " pod="openshift-console/console-f9d7485db-wb2m4" Feb 02 
10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.186414 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zmx6r"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.186651 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/19791fe1-25f3-422d-9585-557e8d5a554c-audit-policies\") pod \"apiserver-7bbb656c7d-qsh9f\" (UID: \"19791fe1-25f3-422d-9585-557e8d5a554c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qsh9f" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.187462 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c4c3efa1-9114-4b9b-be8b-045c2c4d7928-console-config\") pod \"console-f9d7485db-wb2m4\" (UID: \"c4c3efa1-9114-4b9b-be8b-045c2c4d7928\") " pod="openshift-console/console-f9d7485db-wb2m4" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.188071 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-65x22"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.188698 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9077180d-c1ce-41d6-9569-a26bc79cce6c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jc78c\" (UID: \"9077180d-c1ce-41d6-9569-a26bc79cce6c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jc78c" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.188826 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a8a59035-5d91-4d4f-970a-68bd142370dc-trusted-ca\") pod \"console-operator-58897d9998-wl2tq\" (UID: \"a8a59035-5d91-4d4f-970a-68bd142370dc\") " pod="openshift-console-operator/console-operator-58897d9998-wl2tq" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.189435 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/19791fe1-25f3-422d-9585-557e8d5a554c-encryption-config\") pod \"apiserver-7bbb656c7d-qsh9f\" (UID: \"19791fe1-25f3-422d-9585-557e8d5a554c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qsh9f" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.189966 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63cddbc9-0581-4c69-8ecb-7ddec3907b21-serving-cert\") pod \"openshift-config-operator-7777fb866f-2pdrw\" (UID: \"63cddbc9-0581-4c69-8ecb-7ddec3907b21\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2pdrw" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.190351 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19791fe1-25f3-422d-9585-557e8d5a554c-serving-cert\") pod \"apiserver-7bbb656c7d-qsh9f\" (UID: \"19791fe1-25f3-422d-9585-557e8d5a554c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qsh9f" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.190508 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-glksr"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.191828 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/355febd0-19d6-472c-96a3-9f4a3eaa3bc5-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jqqtr\" (UID: \"355febd0-19d6-472c-96a3-9f4a3eaa3bc5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqqtr" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.192735 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-65x22" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.193228 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9077180d-c1ce-41d6-9569-a26bc79cce6c-serving-cert\") pod \"authentication-operator-69f744f599-jc78c\" (UID: \"9077180d-c1ce-41d6-9569-a26bc79cce6c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jc78c" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.193326 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-glksr" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.194348 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.194178 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-krwxc"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.196529 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-gns9g"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.196801 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s4wkl"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.197374 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8a59035-5d91-4d4f-970a-68bd142370dc-serving-cert\") pod \"console-operator-58897d9998-wl2tq\" (UID: \"a8a59035-5d91-4d4f-970a-68bd142370dc\") " pod="openshift-console-operator/console-operator-58897d9998-wl2tq" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.198223 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/19791fe1-25f3-422d-9585-557e8d5a554c-etcd-client\") pod \"apiserver-7bbb656c7d-qsh9f\" (UID: \"19791fe1-25f3-422d-9585-557e8d5a554c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qsh9f" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.198328 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wkpqq"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.199381 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-pj9q5"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.200506 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-klpkc"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.201234 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/355febd0-19d6-472c-96a3-9f4a3eaa3bc5-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-jqqtr\" (UID: 
\"355febd0-19d6-472c-96a3-9f4a3eaa3bc5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqqtr" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.201840 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zt89j"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.203486 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xvfbq"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.204780 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-kw755"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.206107 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2pdrw"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.207185 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpwqj"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.208114 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7nhqx"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.209188 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wjt2l"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.212574 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c4c3efa1-9114-4b9b-be8b-045c2c4d7928-console-oauth-config\") pod \"console-f9d7485db-wb2m4\" (UID: \"c4c3efa1-9114-4b9b-be8b-045c2c4d7928\") " pod="openshift-console/console-f9d7485db-wb2m4" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.212766 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-85dm9"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.213901 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500470-46wx7"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.214385 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.215022 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nv578"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.217438 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2qp9z"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.218944 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bmwhx"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.228235 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wcjmj"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.230584 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-79htn"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.234014 4901 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.234793 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-f4k9c"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.236080 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-f4k9c" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.236634 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8xlmd"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.238134 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvfzd"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.239778 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xnv4q"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.241385 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-z5bfv"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.242880 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5pfqc"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.244403 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-65x22"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.246138 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-f4k9c"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.247720 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-8glpd"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.248663 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8glpd" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.249381 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8glpd"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.254817 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.274517 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.294865 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.315379 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.328924 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/658007dc-27de-4c38-b415-eb8eaa96d752-metrics-tls\") pod \"dns-operator-744455d44c-gns9g\" (UID: \"658007dc-27de-4c38-b415-eb8eaa96d752\") " pod="openshift-dns-operator/dns-operator-744455d44c-gns9g" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.334913 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.354456 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.394364 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.414064 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.433976 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.439029 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6a4a849-2a99-4268-87ab-9fdcb9a7055c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wcjmj\" (UID: \"f6a4a849-2a99-4268-87ab-9fdcb9a7055c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wcjmj" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.453629 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.455523 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6a4a849-2a99-4268-87ab-9fdcb9a7055c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wcjmj\" (UID: \"f6a4a849-2a99-4268-87ab-9fdcb9a7055c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wcjmj" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.474252 4901 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.494377 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.513422 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.533288 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.539761 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/84b9c922-cd12-451d-b4d2-9dbfcf4d422e-metrics-tls\") pod \"ingress-operator-5b745b69d9-kw755\" (UID: \"84b9c922-cd12-451d-b4d2-9dbfcf4d422e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kw755" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.560388 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.562217 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/84b9c922-cd12-451d-b4d2-9dbfcf4d422e-trusted-ca\") pod \"ingress-operator-5b745b69d9-kw755\" (UID: \"84b9c922-cd12-451d-b4d2-9dbfcf4d422e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kw755" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.613193 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qz6m\" (UniqueName: \"kubernetes.io/projected/3fe1fd46-e3a8-4729-9528-24f38fe69252-kube-api-access-7qz6m\") pod \"machine-api-operator-5694c8668f-f28zg\" (UID: \"3fe1fd46-e3a8-4729-9528-24f38fe69252\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-f28zg" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.628121 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jxfq\" (UniqueName: \"kubernetes.io/projected/89727a9a-3041-4169-b3b1-0d2840c585ff-kube-api-access-5jxfq\") pod \"apiserver-76f77b778f-9hvh2\" (UID: \"89727a9a-3041-4169-b3b1-0d2840c585ff\") " pod="openshift-apiserver/apiserver-76f77b778f-9hvh2" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.634602 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.647103 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-f28zg" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.654902 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.674585 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.694854 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.716690 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.734143 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.771308 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt7vc\" (UniqueName: \"kubernetes.io/projected/d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc-kube-api-access-bt7vc\") pod \"controller-manager-879f6c89f-b6s7r\" (UID: \"d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b6s7r" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.776335 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.794678 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.814906 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.816473 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-f28zg"] Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.835814 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.856742 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.875346 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.885794 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-9hvh2" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.893938 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.905541 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-b6s7r" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.914542 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.934071 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.954634 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.975102 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 02 10:40:53 crc kubenswrapper[4901]: I0202 10:40:53.994535 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.017470 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.035128 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.077837 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.077841 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.094337 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.098331 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-9hvh2"] Feb 02 10:40:54 crc kubenswrapper[4901]: W0202 10:40:54.109917 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89727a9a_3041_4169_b3b1_0d2840c585ff.slice/crio-8838047901c494a34322fbad13e68bcaa7921aa071b6ca78f5a48280577c8735 WatchSource:0}: Error finding container 8838047901c494a34322fbad13e68bcaa7921aa071b6ca78f5a48280577c8735: Status 404 returned error can't find the container with id 8838047901c494a34322fbad13e68bcaa7921aa071b6ca78f5a48280577c8735 Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.116850 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.134623 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.151992 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-b6s7r"] Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.154849 4901 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-admission-controller-secret" Feb 02 10:40:54 crc kubenswrapper[4901]: W0202 10:40:54.159362 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4ce91ed_4f91_4a92_a83b_f9c6d45a81dc.slice/crio-9eac68ee16a6efbf3891c4330bd1776997a669b9e2b095b1bb6e7f60962ffc41 WatchSource:0}: Error finding container 9eac68ee16a6efbf3891c4330bd1776997a669b9e2b095b1bb6e7f60962ffc41: Status 404 returned error can't find the container with id 9eac68ee16a6efbf3891c4330bd1776997a669b9e2b095b1bb6e7f60962ffc41 Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.172833 4901 request.go:700] Waited for 1.014341393s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-multus/secrets?fieldSelector=metadata.name%3Dmultus-ac-dockercfg-9lkdf&limit=500&resourceVersion=0 Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.174954 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.194033 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.213872 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.235675 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.254613 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.274154 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.294063 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.314386 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.334328 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.353912 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.374200 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.394483 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.414845 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.433464 4901 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.454212 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.474162 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.484083 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-f28zg" event={"ID":"3fe1fd46-e3a8-4729-9528-24f38fe69252","Type":"ContainerStarted","Data":"6485df1d0d9d928e277f73c0fd08207a59a4bc55b3fae639cdd239c7fc3889c6"} Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.484132 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-f28zg" event={"ID":"3fe1fd46-e3a8-4729-9528-24f38fe69252","Type":"ContainerStarted","Data":"26c8f722c242e55b683e8d7e51b389fce1daf0d01e442fd79d9ae22180f97d64"} Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.484144 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-f28zg" event={"ID":"3fe1fd46-e3a8-4729-9528-24f38fe69252","Type":"ContainerStarted","Data":"5c19774393ab8a5337c8fef8e36fd0202748bda4ed213743c6c784553635ebdc"} Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.485592 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-b6s7r" event={"ID":"d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc","Type":"ContainerStarted","Data":"ef8420ea52ebc18dc7827ffa006f795a9b255185a1eb89a7606f2cab38d3fb00"} Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.485659 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-b6s7r" event={"ID":"d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc","Type":"ContainerStarted","Data":"9eac68ee16a6efbf3891c4330bd1776997a669b9e2b095b1bb6e7f60962ffc41"} Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.485690 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-b6s7r" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.487296 4901 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-b6s7r container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.487338 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-b6s7r" podUID="d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.487687 4901 generic.go:334] "Generic (PLEG): container finished" podID="89727a9a-3041-4169-b3b1-0d2840c585ff" containerID="30da268ba7ddc0db4afcc188d3f092a33766158b5891012afdb9cf0b6e07692a" exitCode=0 Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.487741 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-9hvh2" 
event={"ID":"89727a9a-3041-4169-b3b1-0d2840c585ff","Type":"ContainerDied","Data":"30da268ba7ddc0db4afcc188d3f092a33766158b5891012afdb9cf0b6e07692a"} Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.487773 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-9hvh2" event={"ID":"89727a9a-3041-4169-b3b1-0d2840c585ff","Type":"ContainerStarted","Data":"8838047901c494a34322fbad13e68bcaa7921aa071b6ca78f5a48280577c8735"} Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.494352 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.514905 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.534030 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.553904 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.574269 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.594401 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.614943 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.634147 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.653415 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.673732 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.694392 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.735148 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.738962 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.756955 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.774310 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.812166 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz284\" (UniqueName: \"kubernetes.io/projected/bfc9f0c4-e3f4-4d41-8304-1bcbbb6b67e3-kube-api-access-wz284\") pod \"downloads-7954f5f757-w875r\" (UID: 
\"bfc9f0c4-e3f4-4d41-8304-1bcbbb6b67e3\") " pod="openshift-console/downloads-7954f5f757-w875r" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.827911 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/84b9c922-cd12-451d-b4d2-9dbfcf4d422e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-kw755\" (UID: \"84b9c922-cd12-451d-b4d2-9dbfcf4d422e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kw755" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.855686 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b29j\" (UniqueName: \"kubernetes.io/projected/c4c3efa1-9114-4b9b-be8b-045c2c4d7928-kube-api-access-9b29j\") pod \"console-f9d7485db-wb2m4\" (UID: \"c4c3efa1-9114-4b9b-be8b-045c2c4d7928\") " pod="openshift-console/console-f9d7485db-wb2m4" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.868707 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2srxq\" (UniqueName: \"kubernetes.io/projected/84b9c922-cd12-451d-b4d2-9dbfcf4d422e-kube-api-access-2srxq\") pod \"ingress-operator-5b745b69d9-kw755\" (UID: \"84b9c922-cd12-451d-b4d2-9dbfcf4d422e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kw755" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.890325 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jk84\" (UniqueName: \"kubernetes.io/projected/9077180d-c1ce-41d6-9569-a26bc79cce6c-kube-api-access-5jk84\") pod \"authentication-operator-69f744f599-jc78c\" (UID: \"9077180d-c1ce-41d6-9569-a26bc79cce6c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jc78c" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.910329 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgnwj\" (UniqueName: \"kubernetes.io/projected/355febd0-19d6-472c-96a3-9f4a3eaa3bc5-kube-api-access-cgnwj\") pod \"openshift-controller-manager-operator-756b6f6bc6-jqqtr\" (UID: \"355febd0-19d6-472c-96a3-9f4a3eaa3bc5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqqtr" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.931147 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqnf6\" (UniqueName: \"kubernetes.io/projected/658007dc-27de-4c38-b415-eb8eaa96d752-kube-api-access-sqnf6\") pod \"dns-operator-744455d44c-gns9g\" (UID: \"658007dc-27de-4c38-b415-eb8eaa96d752\") " pod="openshift-dns-operator/dns-operator-744455d44c-gns9g" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.937099 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-jc78c" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.949512 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd8zx\" (UniqueName: \"kubernetes.io/projected/63cddbc9-0581-4c69-8ecb-7ddec3907b21-kube-api-access-fd8zx\") pod \"openshift-config-operator-7777fb866f-2pdrw\" (UID: \"63cddbc9-0581-4c69-8ecb-7ddec3907b21\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2pdrw" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.954348 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2pdrw" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.970229 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8rwp\" (UniqueName: \"kubernetes.io/projected/a8a59035-5d91-4d4f-970a-68bd142370dc-kube-api-access-l8rwp\") pod \"console-operator-58897d9998-wl2tq\" (UID: \"a8a59035-5d91-4d4f-970a-68bd142370dc\") " pod="openshift-console-operator/console-operator-58897d9998-wl2tq" Feb 02 10:40:54 crc kubenswrapper[4901]: I0202 10:40:54.990081 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f6a4a849-2a99-4268-87ab-9fdcb9a7055c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wcjmj\" (UID: \"f6a4a849-2a99-4268-87ab-9fdcb9a7055c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wcjmj" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.009078 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plnh2\" (UniqueName: \"kubernetes.io/projected/19791fe1-25f3-422d-9585-557e8d5a554c-kube-api-access-plnh2\") pod \"apiserver-7bbb656c7d-qsh9f\" (UID: \"19791fe1-25f3-422d-9585-557e8d5a554c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qsh9f" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.012914 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-wb2m4" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.014190 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.034862 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.037296 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-w875r" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.050890 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqqtr" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.056253 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.072828 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-gns9g" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.074854 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.079824 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wcjmj" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.087824 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kw755" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.098919 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.113963 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.146057 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.154936 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.173321 4901 request.go:700] Waited for 1.936990362s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/hostpath-provisioner/secrets?fieldSelector=metadata.name%3Dcsi-hostpath-provisioner-sa-dockercfg-qd74k&limit=500&resourceVersion=0 Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.176373 4901 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.194658 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.214531 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.219244 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jc78c"] Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.234208 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qsh9f" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.235625 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.247449 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-wl2tq" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.256514 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.274603 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2pdrw"] Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.315170 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-tsxn7\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.315630 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dc8db928-4418-4963-892f-df5413ed2c76-trusted-ca\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.315707 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cf9e5479-7683-4748-b571-c7d6c64d149b-etcd-client\") pod \"etcd-operator-b45778765-pj9q5\" (UID: \"cf9e5479-7683-4748-b571-c7d6c64d149b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pj9q5" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.315744 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8daa3e41-cbbf-4739-8a26-4a62f3e10636-machine-approver-tls\") pod \"machine-approver-56656f9798-8sgkj\" (UID: \"8daa3e41-cbbf-4739-8a26-4a62f3e10636\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8sgkj" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.315764 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcd15108-7039-4f2c-a6be-209f7ffbbc30-config\") pod \"route-controller-manager-6576b87f9c-mpwqj\" (UID: \"fcd15108-7039-4f2c-a6be-209f7ffbbc30\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpwqj" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.317425 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/dc8db928-4418-4963-892f-df5413ed2c76-registry-tls\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.317462 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/cf9e5479-7683-4748-b571-c7d6c64d149b-etcd-service-ca\") pod \"etcd-operator-b45778765-pj9q5\" (UID: \"cf9e5479-7683-4748-b571-c7d6c64d149b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pj9q5" Feb 02 10:40:55 crc 
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.317512 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-tsxn7\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.317580 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjwws\" (UniqueName: \"kubernetes.io/projected/fcd15108-7039-4f2c-a6be-209f7ffbbc30-kube-api-access-xjwws\") pod \"route-controller-manager-6576b87f9c-mpwqj\" (UID: \"fcd15108-7039-4f2c-a6be-209f7ffbbc30\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpwqj"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.317607 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/dc8db928-4418-4963-892f-df5413ed2c76-ca-trust-extracted\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.317657 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.317680 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-tsxn7\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.317805 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a1665435-f91f-43ca-84ff-dba7fb1c1198-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-zmx6r\" (UID: \"a1665435-f91f-43ca-84ff-dba7fb1c1198\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zmx6r"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.317827 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a1665435-f91f-43ca-84ff-dba7fb1c1198-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-zmx6r\" (UID: \"a1665435-f91f-43ca-84ff-dba7fb1c1198\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zmx6r"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.317845 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a8271eaa-0795-430a-b3b3-bc8b3ffb53b7-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-nv578\" (UID: \"a8271eaa-0795-430a-b3b3-bc8b3ffb53b7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nv578"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.317891 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/dc8db928-4418-4963-892f-df5413ed2c76-installation-pull-secrets\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.317913 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8daa3e41-cbbf-4739-8a26-4a62f3e10636-auth-proxy-config\") pod \"machine-approver-56656f9798-8sgkj\" (UID: \"8daa3e41-cbbf-4739-8a26-4a62f3e10636\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8sgkj"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.317930 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/dc8db928-4418-4963-892f-df5413ed2c76-registry-certificates\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.317968 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxh7b\" (UniqueName: \"kubernetes.io/projected/713da16e-91d1-4bba-af10-4e9a06ef7c81-kube-api-access-xxh7b\") pod \"control-plane-machine-set-operator-78cbb6b69f-2qp9z\" (UID: \"713da16e-91d1-4bba-af10-4e9a06ef7c81\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2qp9z"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.317996 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-tsxn7\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.318039 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf9e5479-7683-4748-b571-c7d6c64d149b-config\") pod \"etcd-operator-b45778765-pj9q5\" (UID: \"cf9e5479-7683-4748-b571-c7d6c64d149b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pj9q5"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.318084 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a26bee8d-8a92-4246-80ce-0b1b420b869d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-s4wkl\" (UID: \"a26bee8d-8a92-4246-80ce-0b1b420b869d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s4wkl"
\"kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-tsxn7\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.318145 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-tsxn7\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.318193 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/cf9e5479-7683-4748-b571-c7d6c64d149b-etcd-ca\") pod \"etcd-operator-b45778765-pj9q5\" (UID: \"cf9e5479-7683-4748-b571-c7d6c64d149b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pj9q5" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.318222 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-tsxn7\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" Feb 02 10:40:55 crc kubenswrapper[4901]: E0202 10:40:55.318788 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:40:55.818761887 +0000 UTC m=+142.837102063 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.319513 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dc8db928-4418-4963-892f-df5413ed2c76-bound-sa-token\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.319631 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a1665435-f91f-43ca-84ff-dba7fb1c1198-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-zmx6r\" (UID: \"a1665435-f91f-43ca-84ff-dba7fb1c1198\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zmx6r" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.319657 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf9e5479-7683-4748-b571-c7d6c64d149b-serving-cert\") pod \"etcd-operator-b45778765-pj9q5\" (UID: \"cf9e5479-7683-4748-b571-c7d6c64d149b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pj9q5" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.319683 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-tsxn7\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.319760 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-tsxn7\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.319991 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/93a1ed7b-a791-4fb9-b02b-8280b107789a-audit-policies\") pod \"oauth-openshift-558db77b4-tsxn7\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.320059 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fbqm\" (UniqueName: \"kubernetes.io/projected/93a1ed7b-a791-4fb9-b02b-8280b107789a-kube-api-access-2fbqm\") pod \"oauth-openshift-558db77b4-tsxn7\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " 
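This E-level entry is a startup ordering race rather than a persistent fault: the image-registry PVC is backed by the kubevirt.io.hostpath-provisioner CSI driver, whose plugin pod (csi-hostpathplugin-f4k9c, whose volumes are still being wired up below) has not yet registered its socket with the kubelet, so MountDevice fails and the operation is requeued with backoff, 500ms here and growing on repeated failures. A sketch of that retry pattern using the apimachinery backoff helper; driverRegistered is a hypothetical stand-in for the kubelet's plugin-registry lookup:

```go
package main

import (
	"fmt"
	"time"

	"k8s.io/apimachinery/pkg/util/wait"
)

// driverRegistered stands in for kubelet's check of its CSI plugin registry,
// which is fed by the plugin watcher scanning /var/lib/kubelet/plugins_registry.
func driverRegistered(name string) bool {
	return false // pretend the driver has not registered yet
}

func main() {
	backoff := wait.Backoff{
		Duration: 500 * time.Millisecond, // matches durationBeforeRetry 500ms
		Factor:   2.0,                    // grows on each failed attempt
		Steps:    5,
	}
	err := wait.ExponentialBackoff(backoff, func() (bool, error) {
		if driverRegistered("kubevirt.io.hostpath-provisioner") {
			return true, nil // driver is up; MountDevice can proceed
		}
		fmt.Println("driver not found in the list of registered CSI drivers; retrying")
		return false, nil
	})
	if err != nil {
		fmt.Println("gave up waiting for CSI driver registration:", err)
	}
}
```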
pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.320378 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4fwt\" (UniqueName: \"kubernetes.io/projected/a1665435-f91f-43ca-84ff-dba7fb1c1198-kube-api-access-p4fwt\") pod \"cluster-image-registry-operator-dc59b4c8b-zmx6r\" (UID: \"a1665435-f91f-43ca-84ff-dba7fb1c1198\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zmx6r" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.320427 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/93a1ed7b-a791-4fb9-b02b-8280b107789a-audit-dir\") pod \"oauth-openshift-558db77b4-tsxn7\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.320947 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlh9p\" (UniqueName: \"kubernetes.io/projected/8daa3e41-cbbf-4739-8a26-4a62f3e10636-kube-api-access-qlh9p\") pod \"machine-approver-56656f9798-8sgkj\" (UID: \"8daa3e41-cbbf-4739-8a26-4a62f3e10636\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8sgkj" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.320985 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq6zp\" (UniqueName: \"kubernetes.io/projected/a8271eaa-0795-430a-b3b3-bc8b3ffb53b7-kube-api-access-qq6zp\") pod \"cluster-samples-operator-665b6dd947-nv578\" (UID: \"a8271eaa-0795-430a-b3b3-bc8b3ffb53b7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nv578" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.321019 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ghmf\" (UniqueName: \"kubernetes.io/projected/cf9e5479-7683-4748-b571-c7d6c64d149b-kube-api-access-8ghmf\") pod \"etcd-operator-b45778765-pj9q5\" (UID: \"cf9e5479-7683-4748-b571-c7d6c64d149b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pj9q5" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.321039 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8daa3e41-cbbf-4739-8a26-4a62f3e10636-config\") pod \"machine-approver-56656f9798-8sgkj\" (UID: \"8daa3e41-cbbf-4739-8a26-4a62f3e10636\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8sgkj" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.321058 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-tsxn7\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.321081 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqzrc\" (UniqueName: \"kubernetes.io/projected/dc8db928-4418-4963-892f-df5413ed2c76-kube-api-access-lqzrc\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: 
\"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.321099 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a26bee8d-8a92-4246-80ce-0b1b420b869d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-s4wkl\" (UID: \"a26bee8d-8a92-4246-80ce-0b1b420b869d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s4wkl" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.321132 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fcd15108-7039-4f2c-a6be-209f7ffbbc30-client-ca\") pod \"route-controller-manager-6576b87f9c-mpwqj\" (UID: \"fcd15108-7039-4f2c-a6be-209f7ffbbc30\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpwqj" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.321164 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psjfj\" (UniqueName: \"kubernetes.io/projected/a26bee8d-8a92-4246-80ce-0b1b420b869d-kube-api-access-psjfj\") pod \"openshift-apiserver-operator-796bbdcf4f-s4wkl\" (UID: \"a26bee8d-8a92-4246-80ce-0b1b420b869d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s4wkl" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.321246 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fcd15108-7039-4f2c-a6be-209f7ffbbc30-serving-cert\") pod \"route-controller-manager-6576b87f9c-mpwqj\" (UID: \"fcd15108-7039-4f2c-a6be-209f7ffbbc30\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpwqj" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.321278 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/713da16e-91d1-4bba-af10-4e9a06ef7c81-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2qp9z\" (UID: \"713da16e-91d1-4bba-af10-4e9a06ef7c81\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2qp9z" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.321311 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-tsxn7\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.417384 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wcjmj"] Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.423145 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:40:55 crc kubenswrapper[4901]: 
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.423453 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqzrc\" (UniqueName: \"kubernetes.io/projected/dc8db928-4418-4963-892f-df5413ed2c76-kube-api-access-lqzrc\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.423489 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fcd15108-7039-4f2c-a6be-209f7ffbbc30-client-ca\") pod \"route-controller-manager-6576b87f9c-mpwqj\" (UID: \"fcd15108-7039-4f2c-a6be-209f7ffbbc30\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpwqj"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.423525 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psjfj\" (UniqueName: \"kubernetes.io/projected/a26bee8d-8a92-4246-80ce-0b1b420b869d-kube-api-access-psjfj\") pod \"openshift-apiserver-operator-796bbdcf4f-s4wkl\" (UID: \"a26bee8d-8a92-4246-80ce-0b1b420b869d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s4wkl"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.423579 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9cc14a9d-2037-4a08-8640-ebedde61adfa-proxy-tls\") pod \"machine-config-controller-84d6567774-85dm9\" (UID: \"9cc14a9d-2037-4a08-8640-ebedde61adfa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-85dm9"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.423619 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-tsxn7\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.423637 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3b78e424-eccd-4efa-9b7c-b59ca43bef39-plugins-dir\") pod \"csi-hostpathplugin-f4k9c\" (UID: \"3b78e424-eccd-4efa-9b7c-b59ca43bef39\") " pod="hostpath-provisioner/csi-hostpathplugin-f4k9c"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.423655 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a4ab49c1-659f-466e-b480-a641b964ab2a-srv-cert\") pod \"olm-operator-6b444d44fb-5pfqc\" (UID: \"a4ab49c1-659f-466e-b480-a641b964ab2a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5pfqc"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.423705 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz6l9\" (UniqueName: \"kubernetes.io/projected/bd8d2e50-5006-4cc8-bfaa-fdbdc245053b-kube-api-access-wz6l9\") pod \"ingress-canary-8glpd\" (UID: \"bd8d2e50-5006-4cc8-bfaa-fdbdc245053b\") " pod="openshift-ingress-canary/ingress-canary-8glpd"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.423729 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a24390c-720d-4e6b-b3d7-a12eab3d72a6-config-volume\") pod \"collect-profiles-29500470-46wx7\" (UID: \"8a24390c-720d-4e6b-b3d7-a12eab3d72a6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-46wx7"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.423769 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bc898e00-9c24-4c1e-974b-cfc7d2b48bc7-metrics-tls\") pod \"dns-default-65x22\" (UID: \"bc898e00-9c24-4c1e-974b-cfc7d2b48bc7\") " pod="openshift-dns/dns-default-65x22"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.423799 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a4ab49c1-659f-466e-b480-a641b964ab2a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-5pfqc\" (UID: \"a4ab49c1-659f-466e-b480-a641b964ab2a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5pfqc"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.423827 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01a13cb9-6ad7-4585-b49e-ae77af983e38-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-kvfzd\" (UID: \"01a13cb9-6ad7-4585-b49e-ae77af983e38\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvfzd"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.423844 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/882b18e8-9ca2-48b2-9f94-8ac94b54d508-certs\") pod \"machine-config-server-glksr\" (UID: \"882b18e8-9ca2-48b2-9f94-8ac94b54d508\") " pod="openshift-machine-config-operator/machine-config-server-glksr"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.423870 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/dc8db928-4418-4963-892f-df5413ed2c76-registry-tls\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.423887 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcd15108-7039-4f2c-a6be-209f7ffbbc30-config\") pod \"route-controller-manager-6576b87f9c-mpwqj\" (UID: \"fcd15108-7039-4f2c-a6be-209f7ffbbc30\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpwqj"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.423908 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8a686a95-f3bb-4edb-aae8-86995516c3ff-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-krwxc\" (UID: \"8a686a95-f3bb-4edb-aae8-86995516c3ff\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-krwxc"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.423926 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnh7m\" (UniqueName: \"kubernetes.io/projected/bc3c6078-4cbf-4ace-a91c-f48a6910f7c8-kube-api-access-vnh7m\") pod \"multus-admission-controller-857f4d67dd-z5bfv\" (UID: \"bc3c6078-4cbf-4ace-a91c-f48a6910f7c8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-z5bfv"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.423946 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-tsxn7\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.423967 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5524bd4-86b0-4d07-ae14-7d7fa7058955-serving-cert\") pod \"service-ca-operator-777779d784-79htn\" (UID: \"d5524bd4-86b0-4d07-ae14-7d7fa7058955\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-79htn"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.423985 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv6mr\" (UniqueName: \"kubernetes.io/projected/1db2f103-d3aa-45f5-acd6-d70543968d36-kube-api-access-bv6mr\") pod \"migrator-59844c95c7-wkpqq\" (UID: \"1db2f103-d3aa-45f5-acd6-d70543968d36\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wkpqq"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.424017 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ef3abb69-4ee9-487f-a1d3-197fe16c5fb0-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-zt89j\" (UID: \"ef3abb69-4ee9-487f-a1d3-197fe16c5fb0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zt89j"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.424040 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z722n\" (UniqueName: \"kubernetes.io/projected/81e78715-5b78-44d5-b225-df11f642c082-kube-api-access-z722n\") pod \"machine-config-operator-74547568cd-klpkc\" (UID: \"81e78715-5b78-44d5-b225-df11f642c082\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-klpkc"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.424061 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjwws\" (UniqueName: \"kubernetes.io/projected/fcd15108-7039-4f2c-a6be-209f7ffbbc30-kube-api-access-xjwws\") pod \"route-controller-manager-6576b87f9c-mpwqj\" (UID: \"fcd15108-7039-4f2c-a6be-209f7ffbbc30\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpwqj"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.424081 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a24390c-720d-4e6b-b3d7-a12eab3d72a6-secret-volume\") pod \"collect-profiles-29500470-46wx7\" (UID: \"8a24390c-720d-4e6b-b3d7-a12eab3d72a6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-46wx7"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.424107 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3b78e424-eccd-4efa-9b7c-b59ca43bef39-csi-data-dir\") pod \"csi-hostpathplugin-f4k9c\" (UID: \"3b78e424-eccd-4efa-9b7c-b59ca43bef39\") " pod="hostpath-provisioner/csi-hostpathplugin-f4k9c"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.424136 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a686a95-f3bb-4edb-aae8-86995516c3ff-config\") pod \"kube-controller-manager-operator-78b949d7b-krwxc\" (UID: \"8a686a95-f3bb-4edb-aae8-86995516c3ff\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-krwxc"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.424162 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/00cc40f1-d5d4-4fc0-ad0f-b7c6130af2f6-signing-cabundle\") pod \"service-ca-9c57cc56f-7nhqx\" (UID: \"00cc40f1-d5d4-4fc0-ad0f-b7c6130af2f6\") " pod="openshift-service-ca/service-ca-9c57cc56f-7nhqx"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.424191 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn2cv\" (UniqueName: \"kubernetes.io/projected/a4ab49c1-659f-466e-b480-a641b964ab2a-kube-api-access-cn2cv\") pod \"olm-operator-6b444d44fb-5pfqc\" (UID: \"a4ab49c1-659f-466e-b480-a641b964ab2a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5pfqc"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.424243 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a1665435-f91f-43ca-84ff-dba7fb1c1198-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-zmx6r\" (UID: \"a1665435-f91f-43ca-84ff-dba7fb1c1198\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zmx6r"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.424264 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a8271eaa-0795-430a-b3b3-bc8b3ffb53b7-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-nv578\" (UID: \"a8271eaa-0795-430a-b3b3-bc8b3ffb53b7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nv578"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.424281 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/dc8db928-4418-4963-892f-df5413ed2c76-installation-pull-secrets\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.424311 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3b78e424-eccd-4efa-9b7c-b59ca43bef39-socket-dir\") pod \"csi-hostpathplugin-f4k9c\" (UID: \"3b78e424-eccd-4efa-9b7c-b59ca43bef39\") " pod="hostpath-provisioner/csi-hostpathplugin-f4k9c"
\"kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-tsxn7\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.424366 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/929d7edd-8851-427e-a1bd-a8ddd6817e70-tmpfs\") pod \"packageserver-d55dfcdfc-8xlmd\" (UID: \"929d7edd-8851-427e-a1bd-a8ddd6817e70\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8xlmd" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.424383 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3b78e424-eccd-4efa-9b7c-b59ca43bef39-registration-dir\") pod \"csi-hostpathplugin-f4k9c\" (UID: \"3b78e424-eccd-4efa-9b7c-b59ca43bef39\") " pod="hostpath-provisioner/csi-hostpathplugin-f4k9c" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.424400 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4697c668-acff-4c8d-b562-e6491a9cbdd0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xnv4q\" (UID: \"4697c668-acff-4c8d-b562-e6491a9cbdd0\") " pod="openshift-marketplace/marketplace-operator-79b997595-xnv4q" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.424417 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9cc14a9d-2037-4a08-8640-ebedde61adfa-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-85dm9\" (UID: \"9cc14a9d-2037-4a08-8640-ebedde61adfa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-85dm9" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.424434 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d447069d-acf2-4316-ab5b-2d2692e8f1e6-metrics-certs\") pod \"router-default-5444994796-6sw49\" (UID: \"d447069d-acf2-4316-ab5b-2d2692e8f1e6\") " pod="openshift-ingress/router-default-5444994796-6sw49" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.424461 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5524bd4-86b0-4d07-ae14-7d7fa7058955-config\") pod \"service-ca-operator-777779d784-79htn\" (UID: \"d5524bd4-86b0-4d07-ae14-7d7fa7058955\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-79htn" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.424481 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-tsxn7\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.424516 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/bc898e00-9c24-4c1e-974b-cfc7d2b48bc7-config-volume\") pod \"dns-default-65x22\" (UID: \"bc898e00-9c24-4c1e-974b-cfc7d2b48bc7\") " pod="openshift-dns/dns-default-65x22" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.424544 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/cf9e5479-7683-4748-b571-c7d6c64d149b-etcd-ca\") pod \"etcd-operator-b45778765-pj9q5\" (UID: \"cf9e5479-7683-4748-b571-c7d6c64d149b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pj9q5" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.424577 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef3abb69-4ee9-487f-a1d3-197fe16c5fb0-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-zt89j\" (UID: \"ef3abb69-4ee9-487f-a1d3-197fe16c5fb0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zt89j" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.424628 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf9e5479-7683-4748-b571-c7d6c64d149b-serving-cert\") pod \"etcd-operator-b45778765-pj9q5\" (UID: \"cf9e5479-7683-4748-b571-c7d6c64d149b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pj9q5" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.424649 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24ssl\" (UniqueName: \"kubernetes.io/projected/d447069d-acf2-4316-ab5b-2d2692e8f1e6-kube-api-access-24ssl\") pod \"router-default-5444994796-6sw49\" (UID: \"d447069d-acf2-4316-ab5b-2d2692e8f1e6\") " pod="openshift-ingress/router-default-5444994796-6sw49" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.424665 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/00cc40f1-d5d4-4fc0-ad0f-b7c6130af2f6-signing-key\") pod \"service-ca-9c57cc56f-7nhqx\" (UID: \"00cc40f1-d5d4-4fc0-ad0f-b7c6130af2f6\") " pod="openshift-service-ca/service-ca-9c57cc56f-7nhqx" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.424703 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/93a1ed7b-a791-4fb9-b02b-8280b107789a-audit-policies\") pod \"oauth-openshift-558db77b4-tsxn7\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.424722 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-tsxn7\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.424767 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/81e78715-5b78-44d5-b225-df11f642c082-proxy-tls\") pod \"machine-config-operator-74547568cd-klpkc\" (UID: \"81e78715-5b78-44d5-b225-df11f642c082\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-klpkc" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.424795 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/93a1ed7b-a791-4fb9-b02b-8280b107789a-audit-dir\") pod \"oauth-openshift-558db77b4-tsxn7\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.424821 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ghmf\" (UniqueName: \"kubernetes.io/projected/cf9e5479-7683-4748-b571-c7d6c64d149b-kube-api-access-8ghmf\") pod \"etcd-operator-b45778765-pj9q5\" (UID: \"cf9e5479-7683-4748-b571-c7d6c64d149b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pj9q5" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.424840 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8daa3e41-cbbf-4739-8a26-4a62f3e10636-config\") pod \"machine-approver-56656f9798-8sgkj\" (UID: \"8daa3e41-cbbf-4739-8a26-4a62f3e10636\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8sgkj" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.424857 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlh9p\" (UniqueName: \"kubernetes.io/projected/8daa3e41-cbbf-4739-8a26-4a62f3e10636-kube-api-access-qlh9p\") pod \"machine-approver-56656f9798-8sgkj\" (UID: \"8daa3e41-cbbf-4739-8a26-4a62f3e10636\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8sgkj" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.424875 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-tsxn7\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.424892 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f46pq\" (UniqueName: \"kubernetes.io/projected/9cc14a9d-2037-4a08-8640-ebedde61adfa-kube-api-access-f46pq\") pod \"machine-config-controller-84d6567774-85dm9\" (UID: \"9cc14a9d-2037-4a08-8640-ebedde61adfa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-85dm9" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.424909 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5spzt\" (UniqueName: \"kubernetes.io/projected/3b78e424-eccd-4efa-9b7c-b59ca43bef39-kube-api-access-5spzt\") pod \"csi-hostpathplugin-f4k9c\" (UID: \"3b78e424-eccd-4efa-9b7c-b59ca43bef39\") " pod="hostpath-provisioner/csi-hostpathplugin-f4k9c" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.424926 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a26bee8d-8a92-4246-80ce-0b1b420b869d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-s4wkl\" (UID: \"a26bee8d-8a92-4246-80ce-0b1b420b869d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s4wkl" Feb 02 10:40:55 crc 
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.424967 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/882b18e8-9ca2-48b2-9f94-8ac94b54d508-node-bootstrap-token\") pod \"machine-config-server-glksr\" (UID: \"882b18e8-9ca2-48b2-9f94-8ac94b54d508\") " pod="openshift-machine-config-operator/machine-config-server-glksr"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.425001 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fcd15108-7039-4f2c-a6be-209f7ffbbc30-serving-cert\") pod \"route-controller-manager-6576b87f9c-mpwqj\" (UID: \"fcd15108-7039-4f2c-a6be-209f7ffbbc30\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpwqj"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.425024 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/713da16e-91d1-4bba-af10-4e9a06ef7c81-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2qp9z\" (UID: \"713da16e-91d1-4bba-af10-4e9a06ef7c81\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2qp9z"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.425074 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqxhv\" (UniqueName: \"kubernetes.io/projected/d5524bd4-86b0-4d07-ae14-7d7fa7058955-kube-api-access-qqxhv\") pod \"service-ca-operator-777779d784-79htn\" (UID: \"d5524bd4-86b0-4d07-ae14-7d7fa7058955\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-79htn"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.425094 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/929d7edd-8851-427e-a1bd-a8ddd6817e70-webhook-cert\") pod \"packageserver-d55dfcdfc-8xlmd\" (UID: \"929d7edd-8851-427e-a1bd-a8ddd6817e70\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8xlmd"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.425112 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4khfc\" (UniqueName: \"kubernetes.io/projected/01a13cb9-6ad7-4585-b49e-ae77af983e38-kube-api-access-4khfc\") pod \"kube-storage-version-migrator-operator-b67b599dd-kvfzd\" (UID: \"01a13cb9-6ad7-4585-b49e-ae77af983e38\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvfzd"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.425129 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/81e78715-5b78-44d5-b225-df11f642c082-images\") pod \"machine-config-operator-74547568cd-klpkc\" (UID: \"81e78715-5b78-44d5-b225-df11f642c082\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-klpkc"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.425147 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-tsxn7\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.425165 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/81e78715-5b78-44d5-b225-df11f642c082-auth-proxy-config\") pod \"machine-config-operator-74547568cd-klpkc\" (UID: \"81e78715-5b78-44d5-b225-df11f642c082\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-klpkc"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.425182 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dc8db928-4418-4963-892f-df5413ed2c76-trusted-ca\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.425201 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8daa3e41-cbbf-4739-8a26-4a62f3e10636-machine-approver-tls\") pod \"machine-approver-56656f9798-8sgkj\" (UID: \"8daa3e41-cbbf-4739-8a26-4a62f3e10636\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8sgkj"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.425217 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cf9e5479-7683-4748-b571-c7d6c64d149b-etcd-client\") pod \"etcd-operator-b45778765-pj9q5\" (UID: \"cf9e5479-7683-4748-b571-c7d6c64d149b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pj9q5"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.425234 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd8d2e50-5006-4cc8-bfaa-fdbdc245053b-cert\") pod \"ingress-canary-8glpd\" (UID: \"bd8d2e50-5006-4cc8-bfaa-fdbdc245053b\") " pod="openshift-ingress-canary/ingress-canary-8glpd"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.425250 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rdrd\" (UniqueName: \"kubernetes.io/projected/929d7edd-8851-427e-a1bd-a8ddd6817e70-kube-api-access-2rdrd\") pod \"packageserver-d55dfcdfc-8xlmd\" (UID: \"929d7edd-8851-427e-a1bd-a8ddd6817e70\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8xlmd"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.425266 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/cf9e5479-7683-4748-b571-c7d6c64d149b-etcd-service-ca\") pod \"etcd-operator-b45778765-pj9q5\" (UID: \"cf9e5479-7683-4748-b571-c7d6c64d149b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pj9q5"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.425289 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d447069d-acf2-4316-ab5b-2d2692e8f1e6-stats-auth\") pod \"router-default-5444994796-6sw49\" (UID: \"d447069d-acf2-4316-ab5b-2d2692e8f1e6\") " pod="openshift-ingress/router-default-5444994796-6sw49"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.425322 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/dc8db928-4418-4963-892f-df5413ed2c76-ca-trust-extracted\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.425342 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff29d0d0-c1c4-4eb4-bf70-6210af819cb4-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-bmwhx\" (UID: \"ff29d0d0-c1c4-4eb4-bf70-6210af819cb4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bmwhx"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.425374 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/929d7edd-8851-427e-a1bd-a8ddd6817e70-apiservice-cert\") pod \"packageserver-d55dfcdfc-8xlmd\" (UID: \"929d7edd-8851-427e-a1bd-a8ddd6817e70\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8xlmd"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.425390 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef3abb69-4ee9-487f-a1d3-197fe16c5fb0-config\") pod \"kube-apiserver-operator-766d6c64bb-zt89j\" (UID: \"ef3abb69-4ee9-487f-a1d3-197fe16c5fb0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zt89j"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.425407 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qcgx\" (UniqueName: \"kubernetes.io/projected/ff29d0d0-c1c4-4eb4-bf70-6210af819cb4-kube-api-access-6qcgx\") pod \"package-server-manager-789f6589d5-bmwhx\" (UID: \"ff29d0d0-c1c4-4eb4-bf70-6210af819cb4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bmwhx"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.425424 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/118f2636-94f9-40a7-90e7-d48df737a551-srv-cert\") pod \"catalog-operator-68c6474976-wjt2l\" (UID: \"118f2636-94f9-40a7-90e7-d48df737a551\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wjt2l"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.425443 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-tsxn7\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.425500 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d447069d-acf2-4316-ab5b-2d2692e8f1e6-default-certificate\") pod \"router-default-5444994796-6sw49\" (UID: \"d447069d-acf2-4316-ab5b-2d2692e8f1e6\") " pod="openshift-ingress/router-default-5444994796-6sw49"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.425516 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr44s\" (UniqueName: \"kubernetes.io/projected/118f2636-94f9-40a7-90e7-d48df737a551-kube-api-access-hr44s\") pod \"catalog-operator-68c6474976-wjt2l\" (UID: \"118f2636-94f9-40a7-90e7-d48df737a551\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wjt2l"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.425534 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssvlg\" (UniqueName: \"kubernetes.io/projected/bc898e00-9c24-4c1e-974b-cfc7d2b48bc7-kube-api-access-ssvlg\") pod \"dns-default-65x22\" (UID: \"bc898e00-9c24-4c1e-974b-cfc7d2b48bc7\") " pod="openshift-dns/dns-default-65x22"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.425554 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a1665435-f91f-43ca-84ff-dba7fb1c1198-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-zmx6r\" (UID: \"a1665435-f91f-43ca-84ff-dba7fb1c1198\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zmx6r"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.425593 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8daa3e41-cbbf-4739-8a26-4a62f3e10636-auth-proxy-config\") pod \"machine-approver-56656f9798-8sgkj\" (UID: \"8daa3e41-cbbf-4739-8a26-4a62f3e10636\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8sgkj"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.425621 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01a13cb9-6ad7-4585-b49e-ae77af983e38-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-kvfzd\" (UID: \"01a13cb9-6ad7-4585-b49e-ae77af983e38\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvfzd"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.425666 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/dc8db928-4418-4963-892f-df5413ed2c76-registry-certificates\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.425685 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxh7b\" (UniqueName: \"kubernetes.io/projected/713da16e-91d1-4bba-af10-4e9a06ef7c81-kube-api-access-xxh7b\") pod \"control-plane-machine-set-operator-78cbb6b69f-2qp9z\" (UID: \"713da16e-91d1-4bba-af10-4e9a06ef7c81\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2qp9z"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.425705 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m555\" (UniqueName: \"kubernetes.io/projected/882b18e8-9ca2-48b2-9f94-8ac94b54d508-kube-api-access-8m555\") pod \"machine-config-server-glksr\" (UID: \"882b18e8-9ca2-48b2-9f94-8ac94b54d508\") " pod="openshift-machine-config-operator/machine-config-server-glksr"
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.425721 4901
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/118f2636-94f9-40a7-90e7-d48df737a551-profile-collector-cert\") pod \"catalog-operator-68c6474976-wjt2l\" (UID: \"118f2636-94f9-40a7-90e7-d48df737a551\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wjt2l" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.425748 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf9e5479-7683-4748-b571-c7d6c64d149b-config\") pod \"etcd-operator-b45778765-pj9q5\" (UID: \"cf9e5479-7683-4748-b571-c7d6c64d149b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pj9q5" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.425771 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm6c7\" (UniqueName: \"kubernetes.io/projected/4697c668-acff-4c8d-b562-e6491a9cbdd0-kube-api-access-hm6c7\") pod \"marketplace-operator-79b997595-xnv4q\" (UID: \"4697c668-acff-4c8d-b562-e6491a9cbdd0\") " pod="openshift-marketplace/marketplace-operator-79b997595-xnv4q" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.425793 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bc3c6078-4cbf-4ace-a91c-f48a6910f7c8-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-z5bfv\" (UID: \"bc3c6078-4cbf-4ace-a91c-f48a6910f7c8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-z5bfv" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.425817 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4dbv\" (UniqueName: \"kubernetes.io/projected/00cc40f1-d5d4-4fc0-ad0f-b7c6130af2f6-kube-api-access-m4dbv\") pod \"service-ca-9c57cc56f-7nhqx\" (UID: \"00cc40f1-d5d4-4fc0-ad0f-b7c6130af2f6\") " pod="openshift-service-ca/service-ca-9c57cc56f-7nhqx" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.425835 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a26bee8d-8a92-4246-80ce-0b1b420b869d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-s4wkl\" (UID: \"a26bee8d-8a92-4246-80ce-0b1b420b869d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s4wkl" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.425854 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-tsxn7\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.425888 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-tsxn7\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.425915 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dc8db928-4418-4963-892f-df5413ed2c76-bound-sa-token\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.425940 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a1665435-f91f-43ca-84ff-dba7fb1c1198-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-zmx6r\" (UID: \"a1665435-f91f-43ca-84ff-dba7fb1c1198\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zmx6r" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.425956 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-tsxn7\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.426010 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a686a95-f3bb-4edb-aae8-86995516c3ff-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-krwxc\" (UID: \"8a686a95-f3bb-4edb-aae8-86995516c3ff\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-krwxc" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.426041 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fbqm\" (UniqueName: \"kubernetes.io/projected/93a1ed7b-a791-4fb9-b02b-8280b107789a-kube-api-access-2fbqm\") pod \"oauth-openshift-558db77b4-tsxn7\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.426071 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4fwt\" (UniqueName: \"kubernetes.io/projected/a1665435-f91f-43ca-84ff-dba7fb1c1198-kube-api-access-p4fwt\") pod \"cluster-image-registry-operator-dc59b4c8b-zmx6r\" (UID: \"a1665435-f91f-43ca-84ff-dba7fb1c1198\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zmx6r" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.426104 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d447069d-acf2-4316-ab5b-2d2692e8f1e6-service-ca-bundle\") pod \"router-default-5444994796-6sw49\" (UID: \"d447069d-acf2-4316-ab5b-2d2692e8f1e6\") " pod="openshift-ingress/router-default-5444994796-6sw49" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.426127 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcf7g\" (UniqueName: \"kubernetes.io/projected/8a24390c-720d-4e6b-b3d7-a12eab3d72a6-kube-api-access-vcf7g\") pod \"collect-profiles-29500470-46wx7\" (UID: \"8a24390c-720d-4e6b-b3d7-a12eab3d72a6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-46wx7" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.426179 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4697c668-acff-4c8d-b562-e6491a9cbdd0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xnv4q\" (UID: \"4697c668-acff-4c8d-b562-e6491a9cbdd0\") " pod="openshift-marketplace/marketplace-operator-79b997595-xnv4q" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.426218 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq6zp\" (UniqueName: \"kubernetes.io/projected/a8271eaa-0795-430a-b3b3-bc8b3ffb53b7-kube-api-access-qq6zp\") pod \"cluster-samples-operator-665b6dd947-nv578\" (UID: \"a8271eaa-0795-430a-b3b3-bc8b3ffb53b7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nv578" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.426240 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3b78e424-eccd-4efa-9b7c-b59ca43bef39-mountpoint-dir\") pod \"csi-hostpathplugin-f4k9c\" (UID: \"3b78e424-eccd-4efa-9b7c-b59ca43bef39\") " pod="hostpath-provisioner/csi-hostpathplugin-f4k9c" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.426849 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/93a1ed7b-a791-4fb9-b02b-8280b107789a-audit-policies\") pod \"oauth-openshift-558db77b4-tsxn7\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" Feb 02 10:40:55 crc kubenswrapper[4901]: E0202 10:40:55.427148 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:40:55.927125346 +0000 UTC m=+142.945465442 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.427909 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fcd15108-7039-4f2c-a6be-209f7ffbbc30-client-ca\") pod \"route-controller-manager-6576b87f9c-mpwqj\" (UID: \"fcd15108-7039-4f2c-a6be-209f7ffbbc30\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpwqj" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.429312 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-tsxn7\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.430347 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-tsxn7\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.430712 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/93a1ed7b-a791-4fb9-b02b-8280b107789a-audit-dir\") pod \"oauth-openshift-558db77b4-tsxn7\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.430952 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/cf9e5479-7683-4748-b571-c7d6c64d149b-etcd-service-ca\") pod \"etcd-operator-b45778765-pj9q5\" (UID: \"cf9e5479-7683-4748-b571-c7d6c64d149b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pj9q5" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.431343 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a26bee8d-8a92-4246-80ce-0b1b420b869d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-s4wkl\" (UID: \"a26bee8d-8a92-4246-80ce-0b1b420b869d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s4wkl" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.432080 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/dc8db928-4418-4963-892f-df5413ed2c76-ca-trust-extracted\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.432100 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/8daa3e41-cbbf-4739-8a26-4a62f3e10636-auth-proxy-config\") pod \"machine-approver-56656f9798-8sgkj\" (UID: \"8daa3e41-cbbf-4739-8a26-4a62f3e10636\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8sgkj" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.433612 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8daa3e41-cbbf-4739-8a26-4a62f3e10636-config\") pod \"machine-approver-56656f9798-8sgkj\" (UID: \"8daa3e41-cbbf-4739-8a26-4a62f3e10636\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8sgkj" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.435782 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dc8db928-4418-4963-892f-df5413ed2c76-trusted-ca\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.435897 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a1665435-f91f-43ca-84ff-dba7fb1c1198-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-zmx6r\" (UID: \"a1665435-f91f-43ca-84ff-dba7fb1c1198\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zmx6r" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.437015 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcd15108-7039-4f2c-a6be-209f7ffbbc30-config\") pod \"route-controller-manager-6576b87f9c-mpwqj\" (UID: \"fcd15108-7039-4f2c-a6be-209f7ffbbc30\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpwqj" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.437972 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8daa3e41-cbbf-4739-8a26-4a62f3e10636-machine-approver-tls\") pod \"machine-approver-56656f9798-8sgkj\" (UID: \"8daa3e41-cbbf-4739-8a26-4a62f3e10636\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8sgkj" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.438966 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fcd15108-7039-4f2c-a6be-209f7ffbbc30-serving-cert\") pod \"route-controller-manager-6576b87f9c-mpwqj\" (UID: \"fcd15108-7039-4f2c-a6be-209f7ffbbc30\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpwqj" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.439101 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/dc8db928-4418-4963-892f-df5413ed2c76-registry-tls\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.439537 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf9e5479-7683-4748-b571-c7d6c64d149b-config\") pod \"etcd-operator-b45778765-pj9q5\" (UID: \"cf9e5479-7683-4748-b571-c7d6c64d149b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pj9q5" Feb 02 10:40:55 crc 
kubenswrapper[4901]: I0202 10:40:55.440012 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-tsxn7\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.440011 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-tsxn7\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.440917 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-tsxn7\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.441110 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-tsxn7\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.441327 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cf9e5479-7683-4748-b571-c7d6c64d149b-etcd-client\") pod \"etcd-operator-b45778765-pj9q5\" (UID: \"cf9e5479-7683-4748-b571-c7d6c64d149b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pj9q5" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.443396 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-tsxn7\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.443777 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a1665435-f91f-43ca-84ff-dba7fb1c1198-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-zmx6r\" (UID: \"a1665435-f91f-43ca-84ff-dba7fb1c1198\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zmx6r" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.444113 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-tsxn7\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.444157 4901 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-tsxn7\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.444990 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/dc8db928-4418-4963-892f-df5413ed2c76-registry-certificates\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.445523 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/dc8db928-4418-4963-892f-df5413ed2c76-installation-pull-secrets\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.445684 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/cf9e5479-7683-4748-b571-c7d6c64d149b-etcd-ca\") pod \"etcd-operator-b45778765-pj9q5\" (UID: \"cf9e5479-7683-4748-b571-c7d6c64d149b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pj9q5" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.445691 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/713da16e-91d1-4bba-af10-4e9a06ef7c81-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2qp9z\" (UID: \"713da16e-91d1-4bba-af10-4e9a06ef7c81\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2qp9z" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.445947 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-tsxn7\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.450092 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-tsxn7\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.450216 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a8271eaa-0795-430a-b3b3-bc8b3ffb53b7-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-nv578\" (UID: \"a8271eaa-0795-430a-b3b3-bc8b3ffb53b7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nv578" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.452906 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a26bee8d-8a92-4246-80ce-0b1b420b869d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-s4wkl\" (UID: \"a26bee8d-8a92-4246-80ce-0b1b420b869d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s4wkl" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.452910 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf9e5479-7683-4748-b571-c7d6c64d149b-serving-cert\") pod \"etcd-operator-b45778765-pj9q5\" (UID: \"cf9e5479-7683-4748-b571-c7d6c64d149b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pj9q5" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.458038 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqzrc\" (UniqueName: \"kubernetes.io/projected/dc8db928-4418-4963-892f-df5413ed2c76-kube-api-access-lqzrc\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.471179 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psjfj\" (UniqueName: \"kubernetes.io/projected/a26bee8d-8a92-4246-80ce-0b1b420b869d-kube-api-access-psjfj\") pod \"openshift-apiserver-operator-796bbdcf4f-s4wkl\" (UID: \"a26bee8d-8a92-4246-80ce-0b1b420b869d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s4wkl" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.527222 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9cc14a9d-2037-4a08-8640-ebedde61adfa-proxy-tls\") pod \"machine-config-controller-84d6567774-85dm9\" (UID: \"9cc14a9d-2037-4a08-8640-ebedde61adfa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-85dm9" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.527516 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3b78e424-eccd-4efa-9b7c-b59ca43bef39-plugins-dir\") pod \"csi-hostpathplugin-f4k9c\" (UID: \"3b78e424-eccd-4efa-9b7c-b59ca43bef39\") " pod="hostpath-provisioner/csi-hostpathplugin-f4k9c" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.527535 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a4ab49c1-659f-466e-b480-a641b964ab2a-srv-cert\") pod \"olm-operator-6b444d44fb-5pfqc\" (UID: \"a4ab49c1-659f-466e-b480-a641b964ab2a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5pfqc" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.527579 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a24390c-720d-4e6b-b3d7-a12eab3d72a6-config-volume\") pod \"collect-profiles-29500470-46wx7\" (UID: \"8a24390c-720d-4e6b-b3d7-a12eab3d72a6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-46wx7" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.527599 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz6l9\" (UniqueName: \"kubernetes.io/projected/bd8d2e50-5006-4cc8-bfaa-fdbdc245053b-kube-api-access-wz6l9\") pod \"ingress-canary-8glpd\" (UID: \"bd8d2e50-5006-4cc8-bfaa-fdbdc245053b\") " pod="openshift-ingress-canary/ingress-canary-8glpd" 
Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.527614 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bc898e00-9c24-4c1e-974b-cfc7d2b48bc7-metrics-tls\") pod \"dns-default-65x22\" (UID: \"bc898e00-9c24-4c1e-974b-cfc7d2b48bc7\") " pod="openshift-dns/dns-default-65x22" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.527631 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a4ab49c1-659f-466e-b480-a641b964ab2a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-5pfqc\" (UID: \"a4ab49c1-659f-466e-b480-a641b964ab2a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5pfqc" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.527659 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01a13cb9-6ad7-4585-b49e-ae77af983e38-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-kvfzd\" (UID: \"01a13cb9-6ad7-4585-b49e-ae77af983e38\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvfzd" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.527678 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/882b18e8-9ca2-48b2-9f94-8ac94b54d508-certs\") pod \"machine-config-server-glksr\" (UID: \"882b18e8-9ca2-48b2-9f94-8ac94b54d508\") " pod="openshift-machine-config-operator/machine-config-server-glksr" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.527694 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8a686a95-f3bb-4edb-aae8-86995516c3ff-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-krwxc\" (UID: \"8a686a95-f3bb-4edb-aae8-86995516c3ff\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-krwxc" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.527711 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnh7m\" (UniqueName: \"kubernetes.io/projected/bc3c6078-4cbf-4ace-a91c-f48a6910f7c8-kube-api-access-vnh7m\") pod \"multus-admission-controller-857f4d67dd-z5bfv\" (UID: \"bc3c6078-4cbf-4ace-a91c-f48a6910f7c8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-z5bfv" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.527730 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5524bd4-86b0-4d07-ae14-7d7fa7058955-serving-cert\") pod \"service-ca-operator-777779d784-79htn\" (UID: \"d5524bd4-86b0-4d07-ae14-7d7fa7058955\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-79htn" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.527747 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv6mr\" (UniqueName: \"kubernetes.io/projected/1db2f103-d3aa-45f5-acd6-d70543968d36-kube-api-access-bv6mr\") pod \"migrator-59844c95c7-wkpqq\" (UID: \"1db2f103-d3aa-45f5-acd6-d70543968d36\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wkpqq" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.527764 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ef3abb69-4ee9-487f-a1d3-197fe16c5fb0-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-zt89j\" (UID: \"ef3abb69-4ee9-487f-a1d3-197fe16c5fb0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zt89j" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.527793 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z722n\" (UniqueName: \"kubernetes.io/projected/81e78715-5b78-44d5-b225-df11f642c082-kube-api-access-z722n\") pod \"machine-config-operator-74547568cd-klpkc\" (UID: \"81e78715-5b78-44d5-b225-df11f642c082\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-klpkc" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.527819 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a24390c-720d-4e6b-b3d7-a12eab3d72a6-secret-volume\") pod \"collect-profiles-29500470-46wx7\" (UID: \"8a24390c-720d-4e6b-b3d7-a12eab3d72a6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-46wx7" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.527841 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.527858 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3b78e424-eccd-4efa-9b7c-b59ca43bef39-csi-data-dir\") pod \"csi-hostpathplugin-f4k9c\" (UID: \"3b78e424-eccd-4efa-9b7c-b59ca43bef39\") " pod="hostpath-provisioner/csi-hostpathplugin-f4k9c" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.527868 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3b78e424-eccd-4efa-9b7c-b59ca43bef39-plugins-dir\") pod \"csi-hostpathplugin-f4k9c\" (UID: \"3b78e424-eccd-4efa-9b7c-b59ca43bef39\") " pod="hostpath-provisioner/csi-hostpathplugin-f4k9c" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.528512 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a686a95-f3bb-4edb-aae8-86995516c3ff-config\") pod \"kube-controller-manager-operator-78b949d7b-krwxc\" (UID: \"8a686a95-f3bb-4edb-aae8-86995516c3ff\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-krwxc" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.527884 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a686a95-f3bb-4edb-aae8-86995516c3ff-config\") pod \"kube-controller-manager-operator-78b949d7b-krwxc\" (UID: \"8a686a95-f3bb-4edb-aae8-86995516c3ff\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-krwxc" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.528586 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cn2cv\" (UniqueName: \"kubernetes.io/projected/a4ab49c1-659f-466e-b480-a641b964ab2a-kube-api-access-cn2cv\") pod 
\"olm-operator-6b444d44fb-5pfqc\" (UID: \"a4ab49c1-659f-466e-b480-a641b964ab2a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5pfqc" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.528606 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/00cc40f1-d5d4-4fc0-ad0f-b7c6130af2f6-signing-cabundle\") pod \"service-ca-9c57cc56f-7nhqx\" (UID: \"00cc40f1-d5d4-4fc0-ad0f-b7c6130af2f6\") " pod="openshift-service-ca/service-ca-9c57cc56f-7nhqx" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.528630 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3b78e424-eccd-4efa-9b7c-b59ca43bef39-socket-dir\") pod \"csi-hostpathplugin-f4k9c\" (UID: \"3b78e424-eccd-4efa-9b7c-b59ca43bef39\") " pod="hostpath-provisioner/csi-hostpathplugin-f4k9c" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.528651 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/929d7edd-8851-427e-a1bd-a8ddd6817e70-tmpfs\") pod \"packageserver-d55dfcdfc-8xlmd\" (UID: \"929d7edd-8851-427e-a1bd-a8ddd6817e70\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8xlmd" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.528682 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9cc14a9d-2037-4a08-8640-ebedde61adfa-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-85dm9\" (UID: \"9cc14a9d-2037-4a08-8640-ebedde61adfa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-85dm9" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.528698 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3b78e424-eccd-4efa-9b7c-b59ca43bef39-registration-dir\") pod \"csi-hostpathplugin-f4k9c\" (UID: \"3b78e424-eccd-4efa-9b7c-b59ca43bef39\") " pod="hostpath-provisioner/csi-hostpathplugin-f4k9c" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.528717 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4697c668-acff-4c8d-b562-e6491a9cbdd0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xnv4q\" (UID: \"4697c668-acff-4c8d-b562-e6491a9cbdd0\") " pod="openshift-marketplace/marketplace-operator-79b997595-xnv4q" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.528736 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d447069d-acf2-4316-ab5b-2d2692e8f1e6-metrics-certs\") pod \"router-default-5444994796-6sw49\" (UID: \"d447069d-acf2-4316-ab5b-2d2692e8f1e6\") " pod="openshift-ingress/router-default-5444994796-6sw49" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.528755 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5524bd4-86b0-4d07-ae14-7d7fa7058955-config\") pod \"service-ca-operator-777779d784-79htn\" (UID: \"d5524bd4-86b0-4d07-ae14-7d7fa7058955\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-79htn" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.528777 4901 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc898e00-9c24-4c1e-974b-cfc7d2b48bc7-config-volume\") pod \"dns-default-65x22\" (UID: \"bc898e00-9c24-4c1e-974b-cfc7d2b48bc7\") " pod="openshift-dns/dns-default-65x22" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.528794 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef3abb69-4ee9-487f-a1d3-197fe16c5fb0-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-zt89j\" (UID: \"ef3abb69-4ee9-487f-a1d3-197fe16c5fb0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zt89j" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.528820 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/00cc40f1-d5d4-4fc0-ad0f-b7c6130af2f6-signing-key\") pod \"service-ca-9c57cc56f-7nhqx\" (UID: \"00cc40f1-d5d4-4fc0-ad0f-b7c6130af2f6\") " pod="openshift-service-ca/service-ca-9c57cc56f-7nhqx" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.528840 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24ssl\" (UniqueName: \"kubernetes.io/projected/d447069d-acf2-4316-ab5b-2d2692e8f1e6-kube-api-access-24ssl\") pod \"router-default-5444994796-6sw49\" (UID: \"d447069d-acf2-4316-ab5b-2d2692e8f1e6\") " pod="openshift-ingress/router-default-5444994796-6sw49" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.528875 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/81e78715-5b78-44d5-b225-df11f642c082-proxy-tls\") pod \"machine-config-operator-74547568cd-klpkc\" (UID: \"81e78715-5b78-44d5-b225-df11f642c082\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-klpkc" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.528918 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f46pq\" (UniqueName: \"kubernetes.io/projected/9cc14a9d-2037-4a08-8640-ebedde61adfa-kube-api-access-f46pq\") pod \"machine-config-controller-84d6567774-85dm9\" (UID: \"9cc14a9d-2037-4a08-8640-ebedde61adfa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-85dm9" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.528936 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5spzt\" (UniqueName: \"kubernetes.io/projected/3b78e424-eccd-4efa-9b7c-b59ca43bef39-kube-api-access-5spzt\") pod \"csi-hostpathplugin-f4k9c\" (UID: \"3b78e424-eccd-4efa-9b7c-b59ca43bef39\") " pod="hostpath-provisioner/csi-hostpathplugin-f4k9c" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.528953 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/882b18e8-9ca2-48b2-9f94-8ac94b54d508-node-bootstrap-token\") pod \"machine-config-server-glksr\" (UID: \"882b18e8-9ca2-48b2-9f94-8ac94b54d508\") " pod="openshift-machine-config-operator/machine-config-server-glksr" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.528989 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqxhv\" (UniqueName: \"kubernetes.io/projected/d5524bd4-86b0-4d07-ae14-7d7fa7058955-kube-api-access-qqxhv\") pod \"service-ca-operator-777779d784-79htn\" (UID: 
\"d5524bd4-86b0-4d07-ae14-7d7fa7058955\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-79htn" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.529006 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/929d7edd-8851-427e-a1bd-a8ddd6817e70-webhook-cert\") pod \"packageserver-d55dfcdfc-8xlmd\" (UID: \"929d7edd-8851-427e-a1bd-a8ddd6817e70\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8xlmd" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.529021 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/81e78715-5b78-44d5-b225-df11f642c082-images\") pod \"machine-config-operator-74547568cd-klpkc\" (UID: \"81e78715-5b78-44d5-b225-df11f642c082\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-klpkc" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.529027 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/929d7edd-8851-427e-a1bd-a8ddd6817e70-tmpfs\") pod \"packageserver-d55dfcdfc-8xlmd\" (UID: \"929d7edd-8851-427e-a1bd-a8ddd6817e70\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8xlmd" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.529041 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4khfc\" (UniqueName: \"kubernetes.io/projected/01a13cb9-6ad7-4585-b49e-ae77af983e38-kube-api-access-4khfc\") pod \"kube-storage-version-migrator-operator-b67b599dd-kvfzd\" (UID: \"01a13cb9-6ad7-4585-b49e-ae77af983e38\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvfzd" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.529072 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/81e78715-5b78-44d5-b225-df11f642c082-auth-proxy-config\") pod \"machine-config-operator-74547568cd-klpkc\" (UID: \"81e78715-5b78-44d5-b225-df11f642c082\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-klpkc" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.529096 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rdrd\" (UniqueName: \"kubernetes.io/projected/929d7edd-8851-427e-a1bd-a8ddd6817e70-kube-api-access-2rdrd\") pod \"packageserver-d55dfcdfc-8xlmd\" (UID: \"929d7edd-8851-427e-a1bd-a8ddd6817e70\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8xlmd" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.529116 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd8d2e50-5006-4cc8-bfaa-fdbdc245053b-cert\") pod \"ingress-canary-8glpd\" (UID: \"bd8d2e50-5006-4cc8-bfaa-fdbdc245053b\") " pod="openshift-ingress-canary/ingress-canary-8glpd" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.529135 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d447069d-acf2-4316-ab5b-2d2692e8f1e6-stats-auth\") pod \"router-default-5444994796-6sw49\" (UID: \"d447069d-acf2-4316-ab5b-2d2692e8f1e6\") " pod="openshift-ingress/router-default-5444994796-6sw49" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.529155 4901 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef3abb69-4ee9-487f-a1d3-197fe16c5fb0-config\") pod \"kube-apiserver-operator-766d6c64bb-zt89j\" (UID: \"ef3abb69-4ee9-487f-a1d3-197fe16c5fb0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zt89j" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.529172 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff29d0d0-c1c4-4eb4-bf70-6210af819cb4-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-bmwhx\" (UID: \"ff29d0d0-c1c4-4eb4-bf70-6210af819cb4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bmwhx" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.529189 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/929d7edd-8851-427e-a1bd-a8ddd6817e70-apiservice-cert\") pod \"packageserver-d55dfcdfc-8xlmd\" (UID: \"929d7edd-8851-427e-a1bd-a8ddd6817e70\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8xlmd" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.529206 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qcgx\" (UniqueName: \"kubernetes.io/projected/ff29d0d0-c1c4-4eb4-bf70-6210af819cb4-kube-api-access-6qcgx\") pod \"package-server-manager-789f6589d5-bmwhx\" (UID: \"ff29d0d0-c1c4-4eb4-bf70-6210af819cb4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bmwhx" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.529227 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/118f2636-94f9-40a7-90e7-d48df737a551-srv-cert\") pod \"catalog-operator-68c6474976-wjt2l\" (UID: \"118f2636-94f9-40a7-90e7-d48df737a551\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wjt2l" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.529255 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d447069d-acf2-4316-ab5b-2d2692e8f1e6-default-certificate\") pod \"router-default-5444994796-6sw49\" (UID: \"d447069d-acf2-4316-ab5b-2d2692e8f1e6\") " pod="openshift-ingress/router-default-5444994796-6sw49" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.529307 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr44s\" (UniqueName: \"kubernetes.io/projected/118f2636-94f9-40a7-90e7-d48df737a551-kube-api-access-hr44s\") pod \"catalog-operator-68c6474976-wjt2l\" (UID: \"118f2636-94f9-40a7-90e7-d48df737a551\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wjt2l" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.529331 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssvlg\" (UniqueName: \"kubernetes.io/projected/bc898e00-9c24-4c1e-974b-cfc7d2b48bc7-kube-api-access-ssvlg\") pod \"dns-default-65x22\" (UID: \"bc898e00-9c24-4c1e-974b-cfc7d2b48bc7\") " pod="openshift-dns/dns-default-65x22" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.529363 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/01a13cb9-6ad7-4585-b49e-ae77af983e38-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-kvfzd\" (UID: \"01a13cb9-6ad7-4585-b49e-ae77af983e38\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvfzd" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.529393 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8m555\" (UniqueName: \"kubernetes.io/projected/882b18e8-9ca2-48b2-9f94-8ac94b54d508-kube-api-access-8m555\") pod \"machine-config-server-glksr\" (UID: \"882b18e8-9ca2-48b2-9f94-8ac94b54d508\") " pod="openshift-machine-config-operator/machine-config-server-glksr" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.529416 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/118f2636-94f9-40a7-90e7-d48df737a551-profile-collector-cert\") pod \"catalog-operator-68c6474976-wjt2l\" (UID: \"118f2636-94f9-40a7-90e7-d48df737a551\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wjt2l" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.529433 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hm6c7\" (UniqueName: \"kubernetes.io/projected/4697c668-acff-4c8d-b562-e6491a9cbdd0-kube-api-access-hm6c7\") pod \"marketplace-operator-79b997595-xnv4q\" (UID: \"4697c668-acff-4c8d-b562-e6491a9cbdd0\") " pod="openshift-marketplace/marketplace-operator-79b997595-xnv4q" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.529455 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bc3c6078-4cbf-4ace-a91c-f48a6910f7c8-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-z5bfv\" (UID: \"bc3c6078-4cbf-4ace-a91c-f48a6910f7c8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-z5bfv" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.529472 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4dbv\" (UniqueName: \"kubernetes.io/projected/00cc40f1-d5d4-4fc0-ad0f-b7c6130af2f6-kube-api-access-m4dbv\") pod \"service-ca-9c57cc56f-7nhqx\" (UID: \"00cc40f1-d5d4-4fc0-ad0f-b7c6130af2f6\") " pod="openshift-service-ca/service-ca-9c57cc56f-7nhqx" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.529501 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a686a95-f3bb-4edb-aae8-86995516c3ff-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-krwxc\" (UID: \"8a686a95-f3bb-4edb-aae8-86995516c3ff\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-krwxc" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.529534 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d447069d-acf2-4316-ab5b-2d2692e8f1e6-service-ca-bundle\") pod \"router-default-5444994796-6sw49\" (UID: \"d447069d-acf2-4316-ab5b-2d2692e8f1e6\") " pod="openshift-ingress/router-default-5444994796-6sw49" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.529552 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcf7g\" (UniqueName: 
\"kubernetes.io/projected/8a24390c-720d-4e6b-b3d7-a12eab3d72a6-kube-api-access-vcf7g\") pod \"collect-profiles-29500470-46wx7\" (UID: \"8a24390c-720d-4e6b-b3d7-a12eab3d72a6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-46wx7" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.529600 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3b78e424-eccd-4efa-9b7c-b59ca43bef39-mountpoint-dir\") pod \"csi-hostpathplugin-f4k9c\" (UID: \"3b78e424-eccd-4efa-9b7c-b59ca43bef39\") " pod="hostpath-provisioner/csi-hostpathplugin-f4k9c" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.529618 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4697c668-acff-4c8d-b562-e6491a9cbdd0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xnv4q\" (UID: \"4697c668-acff-4c8d-b562-e6491a9cbdd0\") " pod="openshift-marketplace/marketplace-operator-79b997595-xnv4q" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.534272 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3b78e424-eccd-4efa-9b7c-b59ca43bef39-socket-dir\") pod \"csi-hostpathplugin-f4k9c\" (UID: \"3b78e424-eccd-4efa-9b7c-b59ca43bef39\") " pod="hostpath-provisioner/csi-hostpathplugin-f4k9c" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.534622 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3b78e424-eccd-4efa-9b7c-b59ca43bef39-registration-dir\") pod \"csi-hostpathplugin-f4k9c\" (UID: \"3b78e424-eccd-4efa-9b7c-b59ca43bef39\") " pod="hostpath-provisioner/csi-hostpathplugin-f4k9c" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.534845 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a24390c-720d-4e6b-b3d7-a12eab3d72a6-config-volume\") pod \"collect-profiles-29500470-46wx7\" (UID: \"8a24390c-720d-4e6b-b3d7-a12eab3d72a6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-46wx7" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.535272 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01a13cb9-6ad7-4585-b49e-ae77af983e38-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-kvfzd\" (UID: \"01a13cb9-6ad7-4585-b49e-ae77af983e38\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvfzd" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.535375 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a1665435-f91f-43ca-84ff-dba7fb1c1198-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-zmx6r\" (UID: \"a1665435-f91f-43ca-84ff-dba7fb1c1198\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zmx6r" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.535945 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4697c668-acff-4c8d-b562-e6491a9cbdd0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xnv4q\" (UID: \"4697c668-acff-4c8d-b562-e6491a9cbdd0\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-xnv4q" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.539858 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/00cc40f1-d5d4-4fc0-ad0f-b7c6130af2f6-signing-cabundle\") pod \"service-ca-9c57cc56f-7nhqx\" (UID: \"00cc40f1-d5d4-4fc0-ad0f-b7c6130af2f6\") " pod="openshift-service-ca/service-ca-9c57cc56f-7nhqx" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.545432 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9cc14a9d-2037-4a08-8640-ebedde61adfa-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-85dm9\" (UID: \"9cc14a9d-2037-4a08-8640-ebedde61adfa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-85dm9" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.545799 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wcjmj" event={"ID":"f6a4a849-2a99-4268-87ab-9fdcb9a7055c","Type":"ContainerStarted","Data":"0b4ee200a1c42e5c664a4404d310485aeb010f669bdd02ff648fd94d1f0d105c"} Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.548005 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fbqm\" (UniqueName: \"kubernetes.io/projected/93a1ed7b-a791-4fb9-b02b-8280b107789a-kube-api-access-2fbqm\") pod \"oauth-openshift-558db77b4-tsxn7\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" Feb 02 10:40:55 crc kubenswrapper[4901]: E0202 10:40:55.549484 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:40:56.049450494 +0000 UTC m=+143.067790590 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.550958 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/81e78715-5b78-44d5-b225-df11f642c082-auth-proxy-config\") pod \"machine-config-operator-74547568cd-klpkc\" (UID: \"81e78715-5b78-44d5-b225-df11f642c082\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-klpkc" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.553206 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9cc14a9d-2037-4a08-8640-ebedde61adfa-proxy-tls\") pod \"machine-config-controller-84d6567774-85dm9\" (UID: \"9cc14a9d-2037-4a08-8640-ebedde61adfa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-85dm9" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.553704 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bc898e00-9c24-4c1e-974b-cfc7d2b48bc7-config-volume\") pod \"dns-default-65x22\" (UID: \"bc898e00-9c24-4c1e-974b-cfc7d2b48bc7\") " pod="openshift-dns/dns-default-65x22" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.554445 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/929d7edd-8851-427e-a1bd-a8ddd6817e70-apiservice-cert\") pod \"packageserver-d55dfcdfc-8xlmd\" (UID: \"929d7edd-8851-427e-a1bd-a8ddd6817e70\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8xlmd" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.554487 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3b78e424-eccd-4efa-9b7c-b59ca43bef39-mountpoint-dir\") pod \"csi-hostpathplugin-f4k9c\" (UID: \"3b78e424-eccd-4efa-9b7c-b59ca43bef39\") " pod="hostpath-provisioner/csi-hostpathplugin-f4k9c" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.555018 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5524bd4-86b0-4d07-ae14-7d7fa7058955-config\") pod \"service-ca-operator-777779d784-79htn\" (UID: \"d5524bd4-86b0-4d07-ae14-7d7fa7058955\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-79htn" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.555724 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef3abb69-4ee9-487f-a1d3-197fe16c5fb0-config\") pod \"kube-apiserver-operator-766d6c64bb-zt89j\" (UID: \"ef3abb69-4ee9-487f-a1d3-197fe16c5fb0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zt89j" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.556055 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d447069d-acf2-4316-ab5b-2d2692e8f1e6-service-ca-bundle\") pod 
\"router-default-5444994796-6sw49\" (UID: \"d447069d-acf2-4316-ab5b-2d2692e8f1e6\") " pod="openshift-ingress/router-default-5444994796-6sw49" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.556297 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3b78e424-eccd-4efa-9b7c-b59ca43bef39-csi-data-dir\") pod \"csi-hostpathplugin-f4k9c\" (UID: \"3b78e424-eccd-4efa-9b7c-b59ca43bef39\") " pod="hostpath-provisioner/csi-hostpathplugin-f4k9c" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.556475 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef3abb69-4ee9-487f-a1d3-197fe16c5fb0-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-zt89j\" (UID: \"ef3abb69-4ee9-487f-a1d3-197fe16c5fb0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zt89j" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.557127 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/882b18e8-9ca2-48b2-9f94-8ac94b54d508-certs\") pod \"machine-config-server-glksr\" (UID: \"882b18e8-9ca2-48b2-9f94-8ac94b54d508\") " pod="openshift-machine-config-operator/machine-config-server-glksr" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.558705 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d447069d-acf2-4316-ab5b-2d2692e8f1e6-default-certificate\") pod \"router-default-5444994796-6sw49\" (UID: \"d447069d-acf2-4316-ab5b-2d2692e8f1e6\") " pod="openshift-ingress/router-default-5444994796-6sw49" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.559247 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd8d2e50-5006-4cc8-bfaa-fdbdc245053b-cert\") pod \"ingress-canary-8glpd\" (UID: \"bd8d2e50-5006-4cc8-bfaa-fdbdc245053b\") " pod="openshift-ingress-canary/ingress-canary-8glpd" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.559999 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/118f2636-94f9-40a7-90e7-d48df737a551-profile-collector-cert\") pod \"catalog-operator-68c6474976-wjt2l\" (UID: \"118f2636-94f9-40a7-90e7-d48df737a551\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wjt2l" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.560500 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/882b18e8-9ca2-48b2-9f94-8ac94b54d508-node-bootstrap-token\") pod \"machine-config-server-glksr\" (UID: \"882b18e8-9ca2-48b2-9f94-8ac94b54d508\") " pod="openshift-machine-config-operator/machine-config-server-glksr" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.561083 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-jc78c" event={"ID":"9077180d-c1ce-41d6-9569-a26bc79cce6c","Type":"ContainerStarted","Data":"907a77aeacb7e6eb8e90230fcdb3f763584335cd734b7ad415952043d5923b5f"} Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.561126 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-jc78c" 
event={"ID":"9077180d-c1ce-41d6-9569-a26bc79cce6c","Type":"ContainerStarted","Data":"556c079965e8277d1729033ffee716c6798535e706bc7b975bc3f5d357d5d636"} Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.562270 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a4ab49c1-659f-466e-b480-a641b964ab2a-srv-cert\") pod \"olm-operator-6b444d44fb-5pfqc\" (UID: \"a4ab49c1-659f-466e-b480-a641b964ab2a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5pfqc" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.566140 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/118f2636-94f9-40a7-90e7-d48df737a551-srv-cert\") pod \"catalog-operator-68c6474976-wjt2l\" (UID: \"118f2636-94f9-40a7-90e7-d48df737a551\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wjt2l" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.566602 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/929d7edd-8851-427e-a1bd-a8ddd6817e70-webhook-cert\") pod \"packageserver-d55dfcdfc-8xlmd\" (UID: \"929d7edd-8851-427e-a1bd-a8ddd6817e70\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8xlmd" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.567085 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/81e78715-5b78-44d5-b225-df11f642c082-images\") pod \"machine-config-operator-74547568cd-klpkc\" (UID: \"81e78715-5b78-44d5-b225-df11f642c082\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-klpkc" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.567186 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bc898e00-9c24-4c1e-974b-cfc7d2b48bc7-metrics-tls\") pod \"dns-default-65x22\" (UID: \"bc898e00-9c24-4c1e-974b-cfc7d2b48bc7\") " pod="openshift-dns/dns-default-65x22" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.567268 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4fwt\" (UniqueName: \"kubernetes.io/projected/a1665435-f91f-43ca-84ff-dba7fb1c1198-kube-api-access-p4fwt\") pod \"cluster-image-registry-operator-dc59b4c8b-zmx6r\" (UID: \"a1665435-f91f-43ca-84ff-dba7fb1c1198\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zmx6r" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.567638 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d447069d-acf2-4316-ab5b-2d2692e8f1e6-metrics-certs\") pod \"router-default-5444994796-6sw49\" (UID: \"d447069d-acf2-4316-ab5b-2d2692e8f1e6\") " pod="openshift-ingress/router-default-5444994796-6sw49" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.569263 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d447069d-acf2-4316-ab5b-2d2692e8f1e6-stats-auth\") pod \"router-default-5444994796-6sw49\" (UID: \"d447069d-acf2-4316-ab5b-2d2692e8f1e6\") " pod="openshift-ingress/router-default-5444994796-6sw49" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.572788 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/00cc40f1-d5d4-4fc0-ad0f-b7c6130af2f6-signing-key\") pod \"service-ca-9c57cc56f-7nhqx\" (UID: \"00cc40f1-d5d4-4fc0-ad0f-b7c6130af2f6\") " pod="openshift-service-ca/service-ca-9c57cc56f-7nhqx" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.573813 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s4wkl" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.577110 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4697c668-acff-4c8d-b562-e6491a9cbdd0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xnv4q\" (UID: \"4697c668-acff-4c8d-b562-e6491a9cbdd0\") " pod="openshift-marketplace/marketplace-operator-79b997595-xnv4q" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.577155 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a24390c-720d-4e6b-b3d7-a12eab3d72a6-secret-volume\") pod \"collect-profiles-29500470-46wx7\" (UID: \"8a24390c-720d-4e6b-b3d7-a12eab3d72a6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-46wx7" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.577406 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/81e78715-5b78-44d5-b225-df11f642c082-proxy-tls\") pod \"machine-config-operator-74547568cd-klpkc\" (UID: \"81e78715-5b78-44d5-b225-df11f642c082\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-klpkc" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.577495 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bc3c6078-4cbf-4ace-a91c-f48a6910f7c8-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-z5bfv\" (UID: \"bc3c6078-4cbf-4ace-a91c-f48a6910f7c8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-z5bfv" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.577845 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01a13cb9-6ad7-4585-b49e-ae77af983e38-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-kvfzd\" (UID: \"01a13cb9-6ad7-4585-b49e-ae77af983e38\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvfzd" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.581461 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5524bd4-86b0-4d07-ae14-7d7fa7058955-serving-cert\") pod \"service-ca-operator-777779d784-79htn\" (UID: \"d5524bd4-86b0-4d07-ae14-7d7fa7058955\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-79htn" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.582810 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2pdrw" event={"ID":"63cddbc9-0581-4c69-8ecb-7ddec3907b21","Type":"ContainerStarted","Data":"117ade1e2008fb2e53fe189d63d7c94832ee8341b491b605b98d9dda203006e2"} Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.586199 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8a686a95-f3bb-4edb-aae8-86995516c3ff-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-krwxc\" (UID: \"8a686a95-f3bb-4edb-aae8-86995516c3ff\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-krwxc" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.591301 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a4ab49c1-659f-466e-b480-a641b964ab2a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-5pfqc\" (UID: \"a4ab49c1-659f-466e-b480-a641b964ab2a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5pfqc" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.594823 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-9hvh2" event={"ID":"89727a9a-3041-4169-b3b1-0d2840c585ff","Type":"ContainerStarted","Data":"88ee8025b881ce4ebea9cdf346033ca4b4912d7a46c836eaac765c93d962bb49"} Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.594879 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-9hvh2" event={"ID":"89727a9a-3041-4169-b3b1-0d2840c585ff","Type":"ContainerStarted","Data":"f4dc0ac40938e4128e0ab2c2b34faa05bdf7a71d4da7c24202d59eb806eca02d"} Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.597025 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff29d0d0-c1c4-4eb4-bf70-6210af819cb4-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-bmwhx\" (UID: \"ff29d0d0-c1c4-4eb4-bf70-6210af819cb4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bmwhx" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.595603 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ghmf\" (UniqueName: \"kubernetes.io/projected/cf9e5479-7683-4748-b571-c7d6c64d149b-kube-api-access-8ghmf\") pod \"etcd-operator-b45778765-pj9q5\" (UID: \"cf9e5479-7683-4748-b571-c7d6c64d149b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pj9q5" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.601828 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq6zp\" (UniqueName: \"kubernetes.io/projected/a8271eaa-0795-430a-b3b3-bc8b3ffb53b7-kube-api-access-qq6zp\") pod \"cluster-samples-operator-665b6dd947-nv578\" (UID: \"a8271eaa-0795-430a-b3b3-bc8b3ffb53b7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nv578" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.603732 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-wb2m4"] Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.611971 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dc8db928-4418-4963-892f-df5413ed2c76-bound-sa-token\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.613609 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-b6s7r" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.632797 4901 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.638143 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxh7b\" (UniqueName: \"kubernetes.io/projected/713da16e-91d1-4bba-af10-4e9a06ef7c81-kube-api-access-xxh7b\") pod \"control-plane-machine-set-operator-78cbb6b69f-2qp9z\" (UID: \"713da16e-91d1-4bba-af10-4e9a06ef7c81\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2qp9z" Feb 02 10:40:55 crc kubenswrapper[4901]: E0202 10:40:55.640372 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:40:56.140336661 +0000 UTC m=+143.158676757 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.641066 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:40:55 crc kubenswrapper[4901]: E0202 10:40:55.641398 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:40:56.141388546 +0000 UTC m=+143.159728852 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.644846 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zmx6r" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.660114 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlh9p\" (UniqueName: \"kubernetes.io/projected/8daa3e41-cbbf-4739-8a26-4a62f3e10636-kube-api-access-qlh9p\") pod \"machine-approver-56656f9798-8sgkj\" (UID: \"8daa3e41-cbbf-4739-8a26-4a62f3e10636\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8sgkj" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.679814 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-pj9q5" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.680076 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjwws\" (UniqueName: \"kubernetes.io/projected/fcd15108-7039-4f2c-a6be-209f7ffbbc30-kube-api-access-xjwws\") pod \"route-controller-manager-6576b87f9c-mpwqj\" (UID: \"fcd15108-7039-4f2c-a6be-209f7ffbbc30\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpwqj" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.726316 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2qp9z" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.726322 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4khfc\" (UniqueName: \"kubernetes.io/projected/01a13cb9-6ad7-4585-b49e-ae77af983e38-kube-api-access-4khfc\") pod \"kube-storage-version-migrator-operator-b67b599dd-kvfzd\" (UID: \"01a13cb9-6ad7-4585-b49e-ae77af983e38\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvfzd" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.729223 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-w875r"] Feb 02 10:40:55 crc kubenswrapper[4901]: W0202 10:40:55.736036 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfc9f0c4_e3f4_4d41_8304_1bcbbb6b67e3.slice/crio-1bee808448ff11ee8f72d16e86c32ce759e9a0be8ce4047c300518b4cc334b09 WatchSource:0}: Error finding container 1bee808448ff11ee8f72d16e86c32ce759e9a0be8ce4047c300518b4cc334b09: Status 404 returned error can't find the container with id 1bee808448ff11ee8f72d16e86c32ce759e9a0be8ce4047c300518b4cc334b09 Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.743523 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:40:55 crc kubenswrapper[4901]: E0202 10:40:55.744815 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:40:56.244789638 +0000 UTC m=+143.263129734 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.766256 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nv578" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.767928 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpwqj" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.773528 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wl2tq"] Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.775359 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr44s\" (UniqueName: \"kubernetes.io/projected/118f2636-94f9-40a7-90e7-d48df737a551-kube-api-access-hr44s\") pod \"catalog-operator-68c6474976-wjt2l\" (UID: \"118f2636-94f9-40a7-90e7-d48df737a551\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wjt2l" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.777504 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm6c7\" (UniqueName: \"kubernetes.io/projected/4697c668-acff-4c8d-b562-e6491a9cbdd0-kube-api-access-hm6c7\") pod \"marketplace-operator-79b997595-xnv4q\" (UID: \"4697c668-acff-4c8d-b562-e6491a9cbdd0\") " pod="openshift-marketplace/marketplace-operator-79b997595-xnv4q" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.814107 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnh7m\" (UniqueName: \"kubernetes.io/projected/bc3c6078-4cbf-4ace-a91c-f48a6910f7c8-kube-api-access-vnh7m\") pod \"multus-admission-controller-857f4d67dd-z5bfv\" (UID: \"bc3c6078-4cbf-4ace-a91c-f48a6910f7c8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-z5bfv" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.825953 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz6l9\" (UniqueName: \"kubernetes.io/projected/bd8d2e50-5006-4cc8-bfaa-fdbdc245053b-kube-api-access-wz6l9\") pod \"ingress-canary-8glpd\" (UID: \"bd8d2e50-5006-4cc8-bfaa-fdbdc245053b\") " pod="openshift-ingress-canary/ingress-canary-8glpd" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.834667 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xnv4q" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.842795 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24ssl\" (UniqueName: \"kubernetes.io/projected/d447069d-acf2-4316-ab5b-2d2692e8f1e6-kube-api-access-24ssl\") pod \"router-default-5444994796-6sw49\" (UID: \"d447069d-acf2-4316-ab5b-2d2692e8f1e6\") " pod="openshift-ingress/router-default-5444994796-6sw49" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.846353 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:40:55 crc kubenswrapper[4901]: E0202 10:40:55.846745 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:40:56.346731075 +0000 UTC m=+143.365071171 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.850215 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssvlg\" (UniqueName: \"kubernetes.io/projected/bc898e00-9c24-4c1e-974b-cfc7d2b48bc7-kube-api-access-ssvlg\") pod \"dns-default-65x22\" (UID: \"bc898e00-9c24-4c1e-974b-cfc7d2b48bc7\") " pod="openshift-dns/dns-default-65x22" Feb 02 10:40:55 crc kubenswrapper[4901]: W0202 10:40:55.853138 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8a59035_5d91_4d4f_970a_68bd142370dc.slice/crio-1cacaac4a7f06461f7ee914d8f4a1aa0d5d7610691082986d9998a5903fcaeee WatchSource:0}: Error finding container 1cacaac4a7f06461f7ee914d8f4a1aa0d5d7610691082986d9998a5903fcaeee: Status 404 returned error can't find the container with id 1cacaac4a7f06461f7ee914d8f4a1aa0d5d7610691082986d9998a5903fcaeee Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.855900 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-gns9g"] Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.867956 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.869400 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqqtr"] Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.869525 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-kw755"] Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.891815 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn2cv\" (UniqueName: \"kubernetes.io/projected/a4ab49c1-659f-466e-b480-a641b964ab2a-kube-api-access-cn2cv\") pod \"olm-operator-6b444d44fb-5pfqc\" (UID: \"a4ab49c1-659f-466e-b480-a641b964ab2a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5pfqc" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.893087 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f46pq\" (UniqueName: \"kubernetes.io/projected/9cc14a9d-2037-4a08-8640-ebedde61adfa-kube-api-access-f46pq\") pod \"machine-config-controller-84d6567774-85dm9\" (UID: \"9cc14a9d-2037-4a08-8640-ebedde61adfa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-85dm9" Feb 02 10:40:55 crc kubenswrapper[4901]: W0202 10:40:55.895526 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod658007dc_27de_4c38_b415_eb8eaa96d752.slice/crio-6e0d84ad819c42309b9b758833de84f2a848859b45025460db4f06c0ed47d994 WatchSource:0}: Error finding container 6e0d84ad819c42309b9b758833de84f2a848859b45025460db4f06c0ed47d994: Status 404 returned error can't find the container with id 6e0d84ad819c42309b9b758833de84f2a848859b45025460db4f06c0ed47d994 Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.900081 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8glpd" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.914732 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rdrd\" (UniqueName: \"kubernetes.io/projected/929d7edd-8851-427e-a1bd-a8ddd6817e70-kube-api-access-2rdrd\") pod \"packageserver-d55dfcdfc-8xlmd\" (UID: \"929d7edd-8851-427e-a1bd-a8ddd6817e70\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8xlmd" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.923699 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qcgx\" (UniqueName: \"kubernetes.io/projected/ff29d0d0-c1c4-4eb4-bf70-6210af819cb4-kube-api-access-6qcgx\") pod \"package-server-manager-789f6589d5-bmwhx\" (UID: \"ff29d0d0-c1c4-4eb4-bf70-6210af819cb4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bmwhx" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.933523 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8sgkj" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.938990 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv6mr\" (UniqueName: \"kubernetes.io/projected/1db2f103-d3aa-45f5-acd6-d70543968d36-kube-api-access-bv6mr\") pod \"migrator-59844c95c7-wkpqq\" (UID: \"1db2f103-d3aa-45f5-acd6-d70543968d36\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wkpqq" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.947487 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:40:55 crc kubenswrapper[4901]: E0202 10:40:55.948013 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:40:56.447991586 +0000 UTC m=+143.466331682 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.963945 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z722n\" (UniqueName: \"kubernetes.io/projected/81e78715-5b78-44d5-b225-df11f642c082-kube-api-access-z722n\") pod \"machine-config-operator-74547568cd-klpkc\" (UID: \"81e78715-5b78-44d5-b225-df11f642c082\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-klpkc" Feb 02 10:40:55 crc kubenswrapper[4901]: I0202 10:40:55.994248 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-qsh9f"] Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.003125 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5spzt\" (UniqueName: \"kubernetes.io/projected/3b78e424-eccd-4efa-9b7c-b59ca43bef39-kube-api-access-5spzt\") pod \"csi-hostpathplugin-f4k9c\" (UID: \"3b78e424-eccd-4efa-9b7c-b59ca43bef39\") " pod="hostpath-provisioner/csi-hostpathplugin-f4k9c" Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.008163 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ef3abb69-4ee9-487f-a1d3-197fe16c5fb0-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-zt89j\" (UID: \"ef3abb69-4ee9-487f-a1d3-197fe16c5fb0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zt89j" Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.015886 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-85dm9" Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.018875 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvfzd" Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.023801 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcf7g\" (UniqueName: \"kubernetes.io/projected/8a24390c-720d-4e6b-b3d7-a12eab3d72a6-kube-api-access-vcf7g\") pod \"collect-profiles-29500470-46wx7\" (UID: \"8a24390c-720d-4e6b-b3d7-a12eab3d72a6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-46wx7" Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.026207 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zt89j" Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.030814 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s4wkl"] Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.033198 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wjt2l" Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.046180 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-klpkc" Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.049833 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-z5bfv" Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.050662 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:40:56 crc kubenswrapper[4901]: E0202 10:40:56.051027 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:40:56.551012519 +0000 UTC m=+143.569352615 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.054731 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8a686a95-f3bb-4edb-aae8-86995516c3ff-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-krwxc\" (UID: \"8a686a95-f3bb-4edb-aae8-86995516c3ff\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-krwxc" Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.066672 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wkpqq" Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.067335 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-6sw49" Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.071227 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqxhv\" (UniqueName: \"kubernetes.io/projected/d5524bd4-86b0-4d07-ae14-7d7fa7058955-kube-api-access-qqxhv\") pod \"service-ca-operator-777779d784-79htn\" (UID: \"d5524bd4-86b0-4d07-ae14-7d7fa7058955\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-79htn" Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.076079 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bmwhx" Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.083993 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m555\" (UniqueName: \"kubernetes.io/projected/882b18e8-9ca2-48b2-9f94-8ac94b54d508-kube-api-access-8m555\") pod \"machine-config-server-glksr\" (UID: \"882b18e8-9ca2-48b2-9f94-8ac94b54d508\") " pod="openshift-machine-config-operator/machine-config-server-glksr" Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.084859 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-46wx7" Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.097490 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5pfqc" Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.105202 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-79htn" Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.132026 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8xlmd" Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.137127 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4dbv\" (UniqueName: \"kubernetes.io/projected/00cc40f1-d5d4-4fc0-ad0f-b7c6130af2f6-kube-api-access-m4dbv\") pod \"service-ca-9c57cc56f-7nhqx\" (UID: \"00cc40f1-d5d4-4fc0-ad0f-b7c6130af2f6\") " pod="openshift-service-ca/service-ca-9c57cc56f-7nhqx" Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.141687 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-65x22" Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.152848 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-glksr" Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.153817 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:40:56 crc kubenswrapper[4901]: E0202 10:40:56.154635 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:40:56.654537604 +0000 UTC m=+143.672877700 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:56 crc kubenswrapper[4901]: W0202 10:40:56.177962 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda26bee8d_8a92_4246_80ce_0b1b420b869d.slice/crio-65f933997e18ecadffca1398ba27acc8a7063a89bdf720a5608d027807a17da4 WatchSource:0}: Error finding container 65f933997e18ecadffca1398ba27acc8a7063a89bdf720a5608d027807a17da4: Status 404 returned error can't find the container with id 65f933997e18ecadffca1398ba27acc8a7063a89bdf720a5608d027807a17da4 Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.192470 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-f4k9c" Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.263087 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:40:56 crc kubenswrapper[4901]: E0202 10:40:56.266902 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:40:56.763580649 +0000 UTC m=+143.781920745 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.296402 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-krwxc" Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.391845 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:40:56 crc kubenswrapper[4901]: E0202 10:40:56.392859 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:40:56.892836192 +0000 UTC m=+143.911176298 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.419247 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-7nhqx" Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.454697 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-b6s7r" podStartSLOduration=121.454679832 podStartE2EDuration="2m1.454679832s" podCreationTimestamp="2026-02-02 10:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:40:56.435092609 +0000 UTC m=+143.453432715" watchObservedRunningTime="2026-02-02 10:40:56.454679832 +0000 UTC m=+143.473019928" Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.495794 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:40:56 crc kubenswrapper[4901]: E0202 10:40:56.496693 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:40:56.996678034 +0000 UTC m=+144.015018130 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.544774 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpwqj"] Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.554874 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-pj9q5"] Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.570088 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zmx6r"] Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.602425 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:40:56 crc kubenswrapper[4901]: E0202 10:40:56.602936 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:40:57.102919323 +0000 UTC m=+144.121259419 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.661344 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-gns9g" event={"ID":"658007dc-27de-4c38-b415-eb8eaa96d752","Type":"ContainerStarted","Data":"6e0d84ad819c42309b9b758833de84f2a848859b45025460db4f06c0ed47d994"} Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.677116 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kw755" event={"ID":"84b9c922-cd12-451d-b4d2-9dbfcf4d422e","Type":"ContainerStarted","Data":"1946a19df8ed335034b1678cc3bbd19da295af2d736773fe06b0929fad89cb15"} Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.687389 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8sgkj" event={"ID":"8daa3e41-cbbf-4739-8a26-4a62f3e10636","Type":"ContainerStarted","Data":"80b017bfd0574e3b3375ffc8fe49cb1c3fa3bbe96a56151baf3e74df6d989c0a"} Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.706698 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wb2m4" event={"ID":"c4c3efa1-9114-4b9b-be8b-045c2c4d7928","Type":"ContainerStarted","Data":"f1975159172faeec6e743cde518adc1444c96cf146208514a3c4ed855977e73f"} Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.706799 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wb2m4" event={"ID":"c4c3efa1-9114-4b9b-be8b-045c2c4d7928","Type":"ContainerStarted","Data":"de6d26fbbac9720279ab508d5b64034d462c7d80911d741857e8efd3a2caf100"} Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.722921 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:40:56 crc kubenswrapper[4901]: E0202 10:40:56.735180 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:40:57.235143305 +0000 UTC m=+144.253483401 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.789538 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2qp9z"] Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.799615 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-wl2tq" event={"ID":"a8a59035-5d91-4d4f-970a-68bd142370dc","Type":"ContainerStarted","Data":"6522354836bf20a2cb98626bebf039115412d7bafc3e0c8e52dceac3a79402c3"} Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.799682 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-wl2tq" event={"ID":"a8a59035-5d91-4d4f-970a-68bd142370dc","Type":"ContainerStarted","Data":"1cacaac4a7f06461f7ee914d8f4a1aa0d5d7610691082986d9998a5903fcaeee"} Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.799706 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-wl2tq" Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.827490 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-w875r" event={"ID":"bfc9f0c4-e3f4-4d41-8304-1bcbbb6b67e3","Type":"ContainerStarted","Data":"357c2d6cb99de1751430d89943b9f97d19d901f847c5ac5b0b2ce2c3a7e52188"} Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.827535 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-w875r" event={"ID":"bfc9f0c4-e3f4-4d41-8304-1bcbbb6b67e3","Type":"ContainerStarted","Data":"1bee808448ff11ee8f72d16e86c32ce759e9a0be8ce4047c300518b4cc334b09"} Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.828123 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:40:56 crc kubenswrapper[4901]: E0202 10:40:56.829413 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:40:57.329392771 +0000 UTC m=+144.347732867 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.836704 4901 patch_prober.go:28] interesting pod/console-operator-58897d9998-wl2tq container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.836758 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-wl2tq" podUID="a8a59035-5d91-4d4f-970a-68bd142370dc" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.859366 4901 generic.go:334] "Generic (PLEG): container finished" podID="63cddbc9-0581-4c69-8ecb-7ddec3907b21" containerID="fa2e88f3157d03a38f05ab2f6379a13b81deadfbedd5365fd0eddbe305bce43e" exitCode=0 Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.859484 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2pdrw" event={"ID":"63cddbc9-0581-4c69-8ecb-7ddec3907b21","Type":"ContainerDied","Data":"fa2e88f3157d03a38f05ab2f6379a13b81deadfbedd5365fd0eddbe305bce43e"} Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.916839 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqqtr" event={"ID":"355febd0-19d6-472c-96a3-9f4a3eaa3bc5","Type":"ContainerStarted","Data":"bbde4931db4485bc8f7dba9e721af9c4c1ab60c68ee70ec6097e0e7bc100b174"} Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.932650 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wcjmj" event={"ID":"f6a4a849-2a99-4268-87ab-9fdcb9a7055c","Type":"ContainerStarted","Data":"3cb575c57c7d9e1174fe0ddd0e34276e50a7d4608f967d9a068bf09946a63777"} Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.934674 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:40:56 crc kubenswrapper[4901]: E0202 10:40:56.936882 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:40:57.436860548 +0000 UTC m=+144.455200844 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.944208 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s4wkl" event={"ID":"a26bee8d-8a92-4246-80ce-0b1b420b869d","Type":"ContainerStarted","Data":"65f933997e18ecadffca1398ba27acc8a7063a89bdf720a5608d027807a17da4"} Feb 02 10:40:56 crc kubenswrapper[4901]: I0202 10:40:56.948229 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qsh9f" event={"ID":"19791fe1-25f3-422d-9585-557e8d5a554c","Type":"ContainerStarted","Data":"b1cb4a498f93367078dfc06103de8ffd2803be3f869135ae8bd8de993fa38901"} Feb 02 10:40:57 crc kubenswrapper[4901]: E0202 10:40:57.039716 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:40:57.539690437 +0000 UTC m=+144.558030533 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:57 crc kubenswrapper[4901]: I0202 10:40:57.039744 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:40:57 crc kubenswrapper[4901]: I0202 10:40:57.040781 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:40:57 crc kubenswrapper[4901]: E0202 10:40:57.042017 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:40:57.542005731 +0000 UTC m=+144.560345897 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:57 crc kubenswrapper[4901]: I0202 10:40:57.146429 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:40:57 crc kubenswrapper[4901]: E0202 10:40:57.146732 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:40:57.646694453 +0000 UTC m=+144.665034549 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:57 crc kubenswrapper[4901]: I0202 10:40:57.147373 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:40:57 crc kubenswrapper[4901]: E0202 10:40:57.147776 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:40:57.647762939 +0000 UTC m=+144.666103035 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:57 crc kubenswrapper[4901]: I0202 10:40:57.248296 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:40:57 crc kubenswrapper[4901]: E0202 10:40:57.248656 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:40:57.748635401 +0000 UTC m=+144.766975497 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:57 crc kubenswrapper[4901]: I0202 10:40:57.350009 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:40:57 crc kubenswrapper[4901]: E0202 10:40:57.350394 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:40:57.850380174 +0000 UTC m=+144.868720270 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:57 crc kubenswrapper[4901]: I0202 10:40:57.451017 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:40:57 crc kubenswrapper[4901]: E0202 10:40:57.451648 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:40:57.951613164 +0000 UTC m=+144.969953260 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:57 crc kubenswrapper[4901]: I0202 10:40:57.451771 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:40:57 crc kubenswrapper[4901]: E0202 10:40:57.452263 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:40:57.952249279 +0000 UTC m=+144.970589375 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:57 crc kubenswrapper[4901]: I0202 10:40:57.559213 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:40:57 crc kubenswrapper[4901]: E0202 10:40:57.561600 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:40:58.061547541 +0000 UTC m=+145.079887637 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:57 crc kubenswrapper[4901]: I0202 10:40:57.655299 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-f28zg" podStartSLOduration=122.655259933 podStartE2EDuration="2m2.655259933s" podCreationTimestamp="2026-02-02 10:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:40:57.6521479 +0000 UTC m=+144.670487996" watchObservedRunningTime="2026-02-02 10:40:57.655259933 +0000 UTC m=+144.673600039" Feb 02 10:40:57 crc kubenswrapper[4901]: I0202 10:40:57.661398 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:40:57 crc kubenswrapper[4901]: E0202 10:40:57.668672 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:40:58.16864532 +0000 UTC m=+145.186985416 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:57 crc kubenswrapper[4901]: I0202 10:40:57.699471 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-wl2tq" podStartSLOduration=123.699450177 podStartE2EDuration="2m3.699450177s" podCreationTimestamp="2026-02-02 10:38:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:40:57.698665628 +0000 UTC m=+144.717005744" watchObservedRunningTime="2026-02-02 10:40:57.699450177 +0000 UTC m=+144.717790273" Feb 02 10:40:57 crc kubenswrapper[4901]: I0202 10:40:57.762994 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:40:57 crc kubenswrapper[4901]: E0202 10:40:57.764126 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:40:58.264109174 +0000 UTC m=+145.282449270 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:57 crc kubenswrapper[4901]: I0202 10:40:57.823688 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-w875r" podStartSLOduration=122.823664491 podStartE2EDuration="2m2.823664491s" podCreationTimestamp="2026-02-02 10:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:40:57.82109133 +0000 UTC m=+144.839431426" watchObservedRunningTime="2026-02-02 10:40:57.823664491 +0000 UTC m=+144.842004577" Feb 02 10:40:57 crc kubenswrapper[4901]: I0202 10:40:57.866124 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:40:57 crc kubenswrapper[4901]: E0202 10:40:57.866577 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:40:58.366546873 +0000 UTC m=+145.384886969 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:57 crc kubenswrapper[4901]: I0202 10:40:57.934844 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-jc78c" podStartSLOduration=123.934819065 podStartE2EDuration="2m3.934819065s" podCreationTimestamp="2026-02-02 10:38:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:40:57.930302339 +0000 UTC m=+144.948642435" watchObservedRunningTime="2026-02-02 10:40:57.934819065 +0000 UTC m=+144.953159161" Feb 02 10:40:57 crc kubenswrapper[4901]: I0202 10:40:57.966338 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zmx6r" event={"ID":"a1665435-f91f-43ca-84ff-dba7fb1c1198","Type":"ContainerStarted","Data":"561290c6c73c59ec4d9c2c77bef5acf082d5a7c299155684ccd18bfe1848fad2"} Feb 02 10:40:57 crc kubenswrapper[4901]: I0202 10:40:57.966390 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zmx6r" event={"ID":"a1665435-f91f-43ca-84ff-dba7fb1c1198","Type":"ContainerStarted","Data":"5dfee5003d2dc86d668be11fa18f529031d8c8b816b79b341f82becea5c1d53b"} Feb 02 10:40:57 crc kubenswrapper[4901]: I0202 10:40:57.967398 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:40:57 crc kubenswrapper[4901]: E0202 10:40:57.967840 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:40:58.467816185 +0000 UTC m=+145.486156291 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:57 crc kubenswrapper[4901]: I0202 10:40:57.977974 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-pj9q5" event={"ID":"cf9e5479-7683-4748-b571-c7d6c64d149b","Type":"ContainerStarted","Data":"c432c56deb9590ef2b89823c351b7cf00c0825d0d05eeab3e9890954ca894863"} Feb 02 10:40:57 crc kubenswrapper[4901]: I0202 10:40:57.981169 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s4wkl" event={"ID":"a26bee8d-8a92-4246-80ce-0b1b420b869d","Type":"ContainerStarted","Data":"8760479b3c03c5631a9ef29c0f11b55fa62ea15095597334bd0c49dce24ce5c9"} Feb 02 10:40:57 crc kubenswrapper[4901]: I0202 10:40:57.985933 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpwqj" event={"ID":"fcd15108-7039-4f2c-a6be-209f7ffbbc30","Type":"ContainerStarted","Data":"722220c0b79f83453ce5318b0af7043a1613d50183aaca36e4271667d55a2073"} Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.011142 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-9hvh2" podStartSLOduration=124.011120978 podStartE2EDuration="2m4.011120978s" podCreationTimestamp="2026-02-02 10:38:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:40:58.003985589 +0000 UTC m=+145.022325695" watchObservedRunningTime="2026-02-02 10:40:58.011120978 +0000 UTC m=+145.029461074" Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.012308 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-wb2m4" podStartSLOduration=123.012299695 podStartE2EDuration="2m3.012299695s" podCreationTimestamp="2026-02-02 10:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:40:57.958064404 +0000 UTC m=+144.976404500" watchObservedRunningTime="2026-02-02 10:40:58.012299695 +0000 UTC m=+145.030639781" Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.020594 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2qp9z" event={"ID":"713da16e-91d1-4bba-af10-4e9a06ef7c81","Type":"ContainerStarted","Data":"598bde8d09bfaee8ccbdfefb85d882e6fc2a54dcc6ca43bd04b5bf007f4e0383"} Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.027716 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kw755" event={"ID":"84b9c922-cd12-451d-b4d2-9dbfcf4d422e","Type":"ContainerStarted","Data":"1e9c6c7570f7ccf9edc8e9e2b37c4f68fba5dd9fe057ed4718e39397aed744df"} Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.041676 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wcjmj" 
podStartSLOduration=123.041656708 podStartE2EDuration="2m3.041656708s" podCreationTimestamp="2026-02-02 10:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:40:58.038542645 +0000 UTC m=+145.056882741" watchObservedRunningTime="2026-02-02 10:40:58.041656708 +0000 UTC m=+145.059996804" Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.045272 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8sgkj" event={"ID":"8daa3e41-cbbf-4739-8a26-4a62f3e10636","Type":"ContainerStarted","Data":"b44aed861a5eb46387ca547a5ad15a1e50a90975414873bf1736cddfa642522e"} Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.059152 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-glksr" event={"ID":"882b18e8-9ca2-48b2-9f94-8ac94b54d508","Type":"ContainerStarted","Data":"7c2bcdda7a44383cb3074eaaf189a41954e736243342c5c85fe899e722ecf142"} Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.073861 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zmx6r" podStartSLOduration=123.073837328 podStartE2EDuration="2m3.073837328s" podCreationTimestamp="2026-02-02 10:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:40:58.073651194 +0000 UTC m=+145.091991290" watchObservedRunningTime="2026-02-02 10:40:58.073837328 +0000 UTC m=+145.092177424" Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.086992 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:40:58 crc kubenswrapper[4901]: E0202 10:40:58.089340 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:40:58.589307913 +0000 UTC m=+145.607648009 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.092820 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqqtr" event={"ID":"355febd0-19d6-472c-96a3-9f4a3eaa3bc5","Type":"ContainerStarted","Data":"8eda623815c6ac3908028a9b5022e91b8b3f4f6a12068f5d202536cc27ddcdb7"} Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.094724 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-6sw49" event={"ID":"d447069d-acf2-4316-ab5b-2d2692e8f1e6","Type":"ContainerStarted","Data":"824f9b7e151b4c4864d94267f8414c9d724f907992993da52ae6ea30c364bfaa"} Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.094763 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-6sw49" event={"ID":"d447069d-acf2-4316-ab5b-2d2692e8f1e6","Type":"ContainerStarted","Data":"748c08ff23f51902f4aa9945f6c3e3ef70bdc2201c6c3781b68866f04e42ff40"} Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.096624 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-w875r" Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.097031 4901 patch_prober.go:28] interesting pod/console-operator-58897d9998-wl2tq container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.099478 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-wl2tq" podUID="a8a59035-5d91-4d4f-970a-68bd142370dc" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.141367 4901 patch_prober.go:28] interesting pod/downloads-7954f5f757-w875r container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.141444 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-w875r" podUID="bfc9f0c4-e3f4-4d41-8304-1bcbbb6b67e3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.145812 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s4wkl" podStartSLOduration=124.145773957 podStartE2EDuration="2m4.145773957s" podCreationTimestamp="2026-02-02 10:38:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-02 10:40:58.142304685 +0000 UTC m=+145.160644791" watchObservedRunningTime="2026-02-02 10:40:58.145773957 +0000 UTC m=+145.164114053" Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.190881 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:40:58 crc kubenswrapper[4901]: E0202 10:40:58.191713 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:40:58.691676051 +0000 UTC m=+145.710016157 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.192119 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:40:58 crc kubenswrapper[4901]: E0202 10:40:58.194200 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:40:58.694157799 +0000 UTC m=+145.712497895 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.199481 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nv578"] Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.228946 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tsxn7"] Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.229920 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jqqtr" podStartSLOduration=123.229895124 podStartE2EDuration="2m3.229895124s" podCreationTimestamp="2026-02-02 10:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:40:58.175450668 +0000 UTC m=+145.193790764" watchObservedRunningTime="2026-02-02 10:40:58.229895124 +0000 UTC m=+145.248235220" Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.233819 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-6sw49" podStartSLOduration=123.233807246 podStartE2EDuration="2m3.233807246s" podCreationTimestamp="2026-02-02 10:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:40:58.2187499 +0000 UTC m=+145.237089996" watchObservedRunningTime="2026-02-02 10:40:58.233807246 +0000 UTC m=+145.252147342" Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.293006 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:40:58 crc kubenswrapper[4901]: E0202 10:40:58.293458 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:40:58.793423333 +0000 UTC m=+145.811763429 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.294389 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:40:58 crc kubenswrapper[4901]: E0202 10:40:58.294730 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:40:58.794717314 +0000 UTC m=+145.813057410 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.361827 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-klpkc"] Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.395309 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:40:58 crc kubenswrapper[4901]: E0202 10:40:58.395717 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:40:58.895685668 +0000 UTC m=+145.914025764 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.396041 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:40:58 crc kubenswrapper[4901]: E0202 10:40:58.396388 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:40:58.896373555 +0000 UTC m=+145.914713641 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.407754 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-z5bfv"] Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.429757 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-85dm9"] Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.429836 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xnv4q"] Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.457488 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-65x22"] Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.457580 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8glpd"] Feb 02 10:40:58 crc kubenswrapper[4901]: W0202 10:40:58.468246 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc3c6078_4cbf_4ace_a91c_f48a6910f7c8.slice/crio-8f5e384e4463daf39dcbc091881a82731a95bc0f7578bdbdb43af6092b9e07d6 WatchSource:0}: Error finding container 8f5e384e4463daf39dcbc091881a82731a95bc0f7578bdbdb43af6092b9e07d6: Status 404 returned error can't find the container with id 8f5e384e4463daf39dcbc091881a82731a95bc0f7578bdbdb43af6092b9e07d6 Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.472324 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvfzd"] Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.494241 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8xlmd"] Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.496867 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:40:58 crc kubenswrapper[4901]: E0202 10:40:58.497317 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:40:58.997294178 +0000 UTC m=+146.015634274 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.513152 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zt89j"] Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.515287 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-79htn"] Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.517171 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-f4k9c"] Feb 02 10:40:58 crc kubenswrapper[4901]: W0202 10:40:58.545468 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01a13cb9_6ad7_4585_b49e_ae77af983e38.slice/crio-04ab8fe0e25506cedc9b057bb61b6bff77811b98c2b1480d1992625b23b5a96d WatchSource:0}: Error finding container 04ab8fe0e25506cedc9b057bb61b6bff77811b98c2b1480d1992625b23b5a96d: Status 404 returned error can't find the container with id 04ab8fe0e25506cedc9b057bb61b6bff77811b98c2b1480d1992625b23b5a96d Feb 02 10:40:58 crc kubenswrapper[4901]: W0202 10:40:58.568037 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5524bd4_86b0_4d07_ae14_7d7fa7058955.slice/crio-8c7b738c49910548948283738110de9c5d6ad62821a46f971f36d3a2386d8542 WatchSource:0}: Error finding container 8c7b738c49910548948283738110de9c5d6ad62821a46f971f36d3a2386d8542: Status 404 returned error can't find the container with id 8c7b738c49910548948283738110de9c5d6ad62821a46f971f36d3a2386d8542 Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.601174 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:40:58 crc kubenswrapper[4901]: E0202 10:40:58.602082 4901 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:40:59.102063243 +0000 UTC m=+146.120403339 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:58 crc kubenswrapper[4901]: W0202 10:40:58.605821 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod929d7edd_8851_427e_a1bd_a8ddd6817e70.slice/crio-f4115d762496ec65a958b34f2e7a26c89743d9973d35ffab4aab44650b2b139a WatchSource:0}: Error finding container f4115d762496ec65a958b34f2e7a26c89743d9973d35ffab4aab44650b2b139a: Status 404 returned error can't find the container with id f4115d762496ec65a958b34f2e7a26c89743d9973d35ffab4aab44650b2b139a Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.649020 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wkpqq"] Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.658238 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500470-46wx7"] Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.670745 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5pfqc"] Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.675438 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bmwhx"] Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.693859 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wjt2l"] Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.703240 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7nhqx"] Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.704213 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:40:58 crc kubenswrapper[4901]: E0202 10:40:58.705325 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:40:59.205127446 +0000 UTC m=+146.223467692 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:58 crc kubenswrapper[4901]: W0202 10:40:58.735181 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff29d0d0_c1c4_4eb4_bf70_6210af819cb4.slice/crio-6dc08345057f5cc8708f10f7be8a78c0fb9698cbfbec019b78800058b70418af WatchSource:0}: Error finding container 6dc08345057f5cc8708f10f7be8a78c0fb9698cbfbec019b78800058b70418af: Status 404 returned error can't find the container with id 6dc08345057f5cc8708f10f7be8a78c0fb9698cbfbec019b78800058b70418af Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.743610 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-krwxc"] Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.806052 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:40:58 crc kubenswrapper[4901]: E0202 10:40:58.806799 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:40:59.306784797 +0000 UTC m=+146.325124893 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.889599 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-9hvh2" Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.889665 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-9hvh2" Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.900484 4901 patch_prober.go:28] interesting pod/apiserver-76f77b778f-9hvh2 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Feb 02 10:40:58 crc kubenswrapper[4901]: [+]log ok Feb 02 10:40:58 crc kubenswrapper[4901]: [+]etcd ok Feb 02 10:40:58 crc kubenswrapper[4901]: [+]poststarthook/start-apiserver-admission-initializer ok Feb 02 10:40:58 crc kubenswrapper[4901]: [+]poststarthook/generic-apiserver-start-informers ok Feb 02 10:40:58 crc kubenswrapper[4901]: [+]poststarthook/max-in-flight-filter ok Feb 02 10:40:58 crc kubenswrapper[4901]: [+]poststarthook/storage-object-count-tracker-hook ok Feb 02 10:40:58 crc kubenswrapper[4901]: [+]poststarthook/image.openshift.io-apiserver-caches ok Feb 02 10:40:58 crc kubenswrapper[4901]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Feb 02 10:40:58 crc kubenswrapper[4901]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Feb 02 10:40:58 crc kubenswrapper[4901]: [+]poststarthook/project.openshift.io-projectcache ok Feb 02 10:40:58 crc kubenswrapper[4901]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Feb 02 10:40:58 crc kubenswrapper[4901]: [+]poststarthook/openshift.io-startinformers ok Feb 02 10:40:58 crc kubenswrapper[4901]: [+]poststarthook/openshift.io-restmapperupdater ok Feb 02 10:40:58 crc kubenswrapper[4901]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Feb 02 10:40:58 crc kubenswrapper[4901]: livez check failed Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.900598 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-9hvh2" podUID="89727a9a-3041-4169-b3b1-0d2840c585ff" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 10:40:58 crc kubenswrapper[4901]: I0202 10:40:58.908097 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:40:58 crc kubenswrapper[4901]: E0202 10:40:58.909129 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-02 10:40:59.409098463 +0000 UTC m=+146.427438559 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.023460 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:40:59 crc kubenswrapper[4901]: E0202 10:40:59.025486 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:40:59.525465411 +0000 UTC m=+146.543805507 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.072321 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-6sw49" Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.080022 4901 patch_prober.go:28] interesting pod/router-default-5444994796-6sw49 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 02 10:40:59 crc kubenswrapper[4901]: [-]has-synced failed: reason withheld Feb 02 10:40:59 crc kubenswrapper[4901]: [+]process-running ok Feb 02 10:40:59 crc kubenswrapper[4901]: healthz check failed Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.080086 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6sw49" podUID="d447069d-acf2-4316-ab5b-2d2692e8f1e6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.116795 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-79htn" event={"ID":"d5524bd4-86b0-4d07-ae14-7d7fa7058955","Type":"ContainerStarted","Data":"8c7b738c49910548948283738110de9c5d6ad62821a46f971f36d3a2386d8542"} Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.138286 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:40:59 crc kubenswrapper[4901]: E0202 10:40:59.138638 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:40:59.638618613 +0000 UTC m=+146.656958699 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.146838 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zt89j" event={"ID":"ef3abb69-4ee9-487f-a1d3-197fe16c5fb0","Type":"ContainerStarted","Data":"d620f94b28b3134de43abb1c36d674451b76eb897c2ef391fd52e59ad64ff386"} Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.157370 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kw755" event={"ID":"84b9c922-cd12-451d-b4d2-9dbfcf4d422e","Type":"ContainerStarted","Data":"cf94d5aba6d71eb769b6306d8cc34939387cba290187d84107ca85a584d9fd8c"} Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.181237 4901 generic.go:334] "Generic (PLEG): container finished" podID="19791fe1-25f3-422d-9585-557e8d5a554c" containerID="b044b860c7e174f608574e1a1e147310f5c642d994ffb4f6bd206d1dd0d15ef2" exitCode=0 Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.181380 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qsh9f" event={"ID":"19791fe1-25f3-422d-9585-557e8d5a554c","Type":"ContainerDied","Data":"b044b860c7e174f608574e1a1e147310f5c642d994ffb4f6bd206d1dd0d15ef2"} Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.205305 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kw755" podStartSLOduration=124.205263537 podStartE2EDuration="2m4.205263537s" podCreationTimestamp="2026-02-02 10:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:40:59.200003063 +0000 UTC m=+146.218343169" watchObservedRunningTime="2026-02-02 10:40:59.205263537 +0000 UTC m=+146.223603633" Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.216860 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wjt2l" event={"ID":"118f2636-94f9-40a7-90e7-d48df737a551","Type":"ContainerStarted","Data":"22f2cc0e33447c6f1a7fe8cf6a81978e021ad6950e5cbfb07c5aa57b487158be"} Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.224744 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-z5bfv" event={"ID":"bc3c6078-4cbf-4ace-a91c-f48a6910f7c8","Type":"ContainerStarted","Data":"8f5e384e4463daf39dcbc091881a82731a95bc0f7578bdbdb43af6092b9e07d6"} Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.241583 4901 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:40:59 crc kubenswrapper[4901]: E0202 10:40:59.242256 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:40:59.74223442 +0000 UTC m=+146.760574516 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.254599 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-klpkc" event={"ID":"81e78715-5b78-44d5-b225-df11f642c082","Type":"ContainerStarted","Data":"d48ec3aa2d9550cbd649722d96c4ad233ca6ef41e9a5196c200a502a888cbf0c"} Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.255071 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-klpkc" event={"ID":"81e78715-5b78-44d5-b225-df11f642c082","Type":"ContainerStarted","Data":"124356cb47a3984aa0e4894792affd0187ae45254cb0b67db3a328d8f6aa9b67"} Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.318839 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-krwxc" event={"ID":"8a686a95-f3bb-4edb-aae8-86995516c3ff","Type":"ContainerStarted","Data":"9fb86912ae660a69bda49bdbe512343e32c0440972e03bb160a1e41a16f9ecde"} Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.330842 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5pfqc" event={"ID":"a4ab49c1-659f-466e-b480-a641b964ab2a","Type":"ContainerStarted","Data":"189358ded859a465384284354740ffb55e23c9d6fc146f5d2ca6532764849f40"} Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.336537 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nv578" event={"ID":"a8271eaa-0795-430a-b3b3-bc8b3ffb53b7","Type":"ContainerStarted","Data":"e77436e99a91e52a7b4e24520649f82fab9da33e76de54fdea8cee6349d3dae8"} Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.336601 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nv578" event={"ID":"a8271eaa-0795-430a-b3b3-bc8b3ffb53b7","Type":"ContainerStarted","Data":"210caa31d2202c22641a7e639e72a57a75f780a3c2a549b7996299354dbcfe6c"} Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.340013 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-gns9g" 
event={"ID":"658007dc-27de-4c38-b415-eb8eaa96d752","Type":"ContainerStarted","Data":"80854c88594d2f9f3c0284c3ed6c543d3660cc302bd2cd31ceaebf708e4fb4b3"} Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.340062 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-gns9g" event={"ID":"658007dc-27de-4c38-b415-eb8eaa96d752","Type":"ContainerStarted","Data":"dbc232f1cbe8cbd6a91137f5bf31f6e458c119ee44f3541700178206748e7ca8"} Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.344501 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:40:59 crc kubenswrapper[4901]: E0202 10:40:59.346606 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:40:59.846579574 +0000 UTC m=+146.864919670 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.350971 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-46wx7" event={"ID":"8a24390c-720d-4e6b-b3d7-a12eab3d72a6","Type":"ContainerStarted","Data":"eac4354fd4bf58478502b9271d62ef87f6b907a82dac2c69198c233177577e05"} Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.366058 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-gns9g" podStartSLOduration=124.366018713 podStartE2EDuration="2m4.366018713s" podCreationTimestamp="2026-02-02 10:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:40:59.365765187 +0000 UTC m=+146.384105283" watchObservedRunningTime="2026-02-02 10:40:59.366018713 +0000 UTC m=+146.384358809" Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.379988 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8sgkj" event={"ID":"8daa3e41-cbbf-4739-8a26-4a62f3e10636","Type":"ContainerStarted","Data":"2bcde1c812c9fcf64daf4fcf0b815384a24a6982b0bf93be9695836fe3e824d9"} Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.387098 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-pj9q5" event={"ID":"cf9e5479-7683-4748-b571-c7d6c64d149b","Type":"ContainerStarted","Data":"4003b83d20be9c29cbb37cec3d51dae2acc5edea96359a446dca230347e0480d"} Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.401395 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-65x22" 
event={"ID":"bc898e00-9c24-4c1e-974b-cfc7d2b48bc7","Type":"ContainerStarted","Data":"58cfafc6283ca796a07c833aa5a989f653cfe29795f49a06ae166119edf5a537"} Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.418449 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8sgkj" podStartSLOduration=125.418426811 podStartE2EDuration="2m5.418426811s" podCreationTimestamp="2026-02-02 10:38:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:40:59.418159095 +0000 UTC m=+146.436499191" watchObservedRunningTime="2026-02-02 10:40:59.418426811 +0000 UTC m=+146.436766907" Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.428963 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8xlmd" event={"ID":"929d7edd-8851-427e-a1bd-a8ddd6817e70","Type":"ContainerStarted","Data":"f4115d762496ec65a958b34f2e7a26c89743d9973d35ffab4aab44650b2b139a"} Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.450687 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.451748 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-pj9q5" podStartSLOduration=124.451733968 podStartE2EDuration="2m4.451733968s" podCreationTimestamp="2026-02-02 10:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:40:59.449979366 +0000 UTC m=+146.468319462" watchObservedRunningTime="2026-02-02 10:40:59.451733968 +0000 UTC m=+146.470074064" Feb 02 10:40:59 crc kubenswrapper[4901]: E0202 10:40:59.451948 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:40:59.951933332 +0000 UTC m=+146.970273428 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.460159 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2pdrw" event={"ID":"63cddbc9-0581-4c69-8ecb-7ddec3907b21","Type":"ContainerStarted","Data":"0e682abfdda81e449de021627d5fe8fb49782a775e68ca74207d013570e5a93c"} Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.461233 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2pdrw" Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.498207 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpwqj" event={"ID":"fcd15108-7039-4f2c-a6be-209f7ffbbc30","Type":"ContainerStarted","Data":"0ddb8b045fab7427511e861c6bf84e55298ab3cf2035b4858b453f2862e29418"} Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.500033 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpwqj" Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.503328 4901 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-mpwqj container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body= Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.503381 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpwqj" podUID="fcd15108-7039-4f2c-a6be-209f7ffbbc30" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.523450 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpwqj" podStartSLOduration=124.52341491 podStartE2EDuration="2m4.52341491s" podCreationTimestamp="2026-02-02 10:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:40:59.519329374 +0000 UTC m=+146.537669490" watchObservedRunningTime="2026-02-02 10:40:59.52341491 +0000 UTC m=+146.541754996" Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.524706 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2pdrw" podStartSLOduration=125.524695711 podStartE2EDuration="2m5.524695711s" podCreationTimestamp="2026-02-02 10:38:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:40:59.490550184 +0000 UTC m=+146.508890280" watchObservedRunningTime="2026-02-02 10:40:59.524695711 
+0000 UTC m=+146.543035807" Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.524949 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wkpqq" event={"ID":"1db2f103-d3aa-45f5-acd6-d70543968d36","Type":"ContainerStarted","Data":"11977572fd750249c57d44f863bfd5f6b2b0148613b046fcecad216397586fdb"} Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.542281 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8glpd" event={"ID":"bd8d2e50-5006-4cc8-bfaa-fdbdc245053b","Type":"ContainerStarted","Data":"6b414d69f54417992347598fd7eb00d772ee8830904acdd967acb721731fbaf4"} Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.542856 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8glpd" event={"ID":"bd8d2e50-5006-4cc8-bfaa-fdbdc245053b","Type":"ContainerStarted","Data":"7d9e0d5f4303401a987c544c4f6d634e3b98cf6f71ca1ba2a8390317cacfb3f6"} Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.550305 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xnv4q" event={"ID":"4697c668-acff-4c8d-b562-e6491a9cbdd0","Type":"ContainerStarted","Data":"f4a90d2563fdd75501b02d64e77fce3d73a2c3f793f2de5d873bb0cf568d680a"} Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.550438 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xnv4q" event={"ID":"4697c668-acff-4c8d-b562-e6491a9cbdd0","Type":"ContainerStarted","Data":"bb6368ba91564b0943b7f7762b12f4c9b95cbce65424727e8fc6c6ae672ad326"} Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.551608 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-xnv4q" Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.552280 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:40:59 crc kubenswrapper[4901]: E0202 10:40:59.552490 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:41:00.052449345 +0000 UTC m=+147.070789441 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.553776 4901 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-xnv4q container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.553831 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-xnv4q" podUID="4697c668-acff-4c8d-b562-e6491a9cbdd0" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.555254 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:40:59 crc kubenswrapper[4901]: E0202 10:40:59.558610 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:41:00.058593151 +0000 UTC m=+147.076933247 (durationBeforeRetry 500ms). 
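[Editor's note] The readiness failures for route-controller-manager, marketplace-operator, and (below) oauth-openshift are all "connect: connection refused": PLEG has just reported ContainerStarted, but the server inside has not bound its port yet, so the kubelet's HTTP prober fails at dial time rather than on status code. The probe itself is essentially a GET with a short timeout (1s by default); a rough equivalent, with the IP taken from the marketplace-operator entry above:

    package main

    import (
        "fmt"
        "net/http"
        "time"
    )

    // probeOnce approximates kubelet's HTTP probe: a dial error or a status
    // outside the 2xx/3xx range counts as a failure.
    func probeOnce(url string) error {
        client := &http.Client{Timeout: 1 * time.Second}
        resp, err := client.Get(url)
        if err != nil {
            // e.g. "dial tcp 10.217.0.36:8080: connect: connection refused"
            return err
        }
        defer resp.Body.Close()
        if resp.StatusCode < 200 || resp.StatusCode >= 400 {
            return fmt.Errorf("HTTP probe failed with statuscode: %d", resp.StatusCode)
        }
        return nil
    }

    func main() {
        if err := probeOnce("http://10.217.0.36:8080/healthz"); err != nil {
            fmt.Println("Probe failed:", err)
        }
    }

Connection-refused readiness failures in the seconds after ContainerStarted are therefore expected churn during startup, unlike the statuscode-500 startup probes above, where the listener is up but checks are failing.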
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.560215 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bmwhx" event={"ID":"ff29d0d0-c1c4-4eb4-bf70-6210af819cb4","Type":"ContainerStarted","Data":"6dc08345057f5cc8708f10f7be8a78c0fb9698cbfbec019b78800058b70418af"} Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.566543 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-8glpd" podStartSLOduration=6.566515798 podStartE2EDuration="6.566515798s" podCreationTimestamp="2026-02-02 10:40:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:40:59.55854255 +0000 UTC m=+146.576882646" watchObservedRunningTime="2026-02-02 10:40:59.566515798 +0000 UTC m=+146.584855894" Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.597986 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-xnv4q" podStartSLOduration=124.59796285 podStartE2EDuration="2m4.59796285s" podCreationTimestamp="2026-02-02 10:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:40:59.597028798 +0000 UTC m=+146.615368894" watchObservedRunningTime="2026-02-02 10:40:59.59796285 +0000 UTC m=+146.616302946" Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.602326 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" event={"ID":"93a1ed7b-a791-4fb9-b02b-8280b107789a","Type":"ContainerStarted","Data":"e9e477df56be8955ab9401d0b9b4ac5cd805eb28e32330665f7031b3bf48b6c2"} Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.602409 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" event={"ID":"93a1ed7b-a791-4fb9-b02b-8280b107789a","Type":"ContainerStarted","Data":"8fe150db1f598b0222b74f861d86824cc693c5f750a4f9535f2f6b3aedcf951b"} Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.602766 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.609963 4901 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-tsxn7 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" start-of-body= Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.610032 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" podUID="93a1ed7b-a791-4fb9-b02b-8280b107789a" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: 
connection refused" Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.644899 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-glksr" event={"ID":"882b18e8-9ca2-48b2-9f94-8ac94b54d508","Type":"ContainerStarted","Data":"6ed3b175a68e151e5a387781d8dcee9d6d8d90ecf2c02d14918889c486db2df2"} Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.657556 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:40:59 crc kubenswrapper[4901]: E0202 10:40:59.659758 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:41:00.159732609 +0000 UTC m=+147.178072715 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.672629 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-f4k9c" event={"ID":"3b78e424-eccd-4efa-9b7c-b59ca43bef39","Type":"ContainerStarted","Data":"f570458350eeeb40c9962ef028f18c65979576030409b8c0f6c03cabff2fd92e"} Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.696996 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" podStartSLOduration=125.696965759 podStartE2EDuration="2m5.696965759s" podCreationTimestamp="2026-02-02 10:38:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:40:59.671919297 +0000 UTC m=+146.690259383" watchObservedRunningTime="2026-02-02 10:40:59.696965759 +0000 UTC m=+146.715305855" Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.742001 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-glksr" podStartSLOduration=7.741972971 podStartE2EDuration="7.741972971s" podCreationTimestamp="2026-02-02 10:40:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:40:59.714235846 +0000 UTC m=+146.732575982" watchObservedRunningTime="2026-02-02 10:40:59.741972971 +0000 UTC m=+146.760313067" Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.747712 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2qp9z" event={"ID":"713da16e-91d1-4bba-af10-4e9a06ef7c81","Type":"ContainerStarted","Data":"887152f93a74ce667567436c069bf9ea4d98d126433aa2aee1e4bba3a8cdfb88"} Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.750174 4901 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-7nhqx" event={"ID":"00cc40f1-d5d4-4fc0-ad0f-b7c6130af2f6","Type":"ContainerStarted","Data":"b06cfd6d22b1cbb6b0e5247d5fe4834b57ee5bca41da45773f6504b1f1bbdb41"} Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.753619 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvfzd" event={"ID":"01a13cb9-6ad7-4585-b49e-ae77af983e38","Type":"ContainerStarted","Data":"7555dc82790d017db430e82ebe2accbff7a10b3354752b7574563d26b16e3519"} Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.753651 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvfzd" event={"ID":"01a13cb9-6ad7-4585-b49e-ae77af983e38","Type":"ContainerStarted","Data":"04ab8fe0e25506cedc9b057bb61b6bff77811b98c2b1480d1992625b23b5a96d"} Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.760601 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-85dm9" event={"ID":"9cc14a9d-2037-4a08-8640-ebedde61adfa","Type":"ContainerStarted","Data":"e5f122b66945452349d4a0c5d99cd18f043c518c05fed5052df4e784a25e4b36"} Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.760653 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-85dm9" event={"ID":"9cc14a9d-2037-4a08-8640-ebedde61adfa","Type":"ContainerStarted","Data":"9387d54323a03586fa792f488eaecdc546392a5241e4fd3c89c5747fd2955343"} Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.762350 4901 patch_prober.go:28] interesting pod/downloads-7954f5f757-w875r container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.762417 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-w875r" podUID="bfc9f0c4-e3f4-4d41-8304-1bcbbb6b67e3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.762624 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:40:59 crc kubenswrapper[4901]: E0202 10:40:59.765899 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:41:00.265851625 +0000 UTC m=+147.284191721 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.816199 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2qp9z" podStartSLOduration=124.816177254 podStartE2EDuration="2m4.816177254s" podCreationTimestamp="2026-02-02 10:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:40:59.784893334 +0000 UTC m=+146.803233430" watchObservedRunningTime="2026-02-02 10:40:59.816177254 +0000 UTC m=+146.834517350" Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.816360 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kvfzd" podStartSLOduration=124.816355358 podStartE2EDuration="2m4.816355358s" podCreationTimestamp="2026-02-02 10:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:40:59.815282142 +0000 UTC m=+146.833622238" watchObservedRunningTime="2026-02-02 10:40:59.816355358 +0000 UTC m=+146.834695454" Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.864002 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:40:59 crc kubenswrapper[4901]: E0202 10:40:59.864729 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:41:00.364697429 +0000 UTC m=+147.383037515 (durationBeforeRetry 500ms). 
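[Editor's note] Each pod_startup_latency_tracker entry in this window is simple arithmetic: podStartSLOduration is the watch-observed running time minus podCreationTimestamp, excluding any image-pull window. With the zeroed firstStartedPulling/lastFinishedPulling timestamps here, no pull was observed, so the SLO and E2E durations coincide, e.g. 124.816177254s = 2m4.816177254s for control-plane-machine-set-operator above. The same computation, with timestamps copied from that entry:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
        // podCreationTimestamp and watchObservedRunningTime from the
        // control-plane-machine-set-operator entry above (errors ignored
        // for brevity; these literals parse cleanly).
        created, _ := time.Parse(layout, "2026-02-02 10:38:55 +0000 UTC")
        running, _ := time.Parse(layout, "2026-02-02 10:40:59.816177254 +0000 UTC")

        // With no image pull observed, SLO duration == end-to-end duration.
        fmt.Println(running.Sub(created)) // 2m4.816177254s
    }

The ~2m4s figures repeated across these operators measure time since pod creation at 10:38:55, most of which was spent waiting out the earlier startup phase, not in the containers themselves.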
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:40:59 crc kubenswrapper[4901]: I0202 10:40:59.966358 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:40:59 crc kubenswrapper[4901]: E0202 10:40:59.966726 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:41:00.466712178 +0000 UTC m=+147.485052274 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.074276 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:41:00 crc kubenswrapper[4901]: E0202 10:41:00.075120 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:41:00.575099948 +0000 UTC m=+147.593440044 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.087532 4901 patch_prober.go:28] interesting pod/router-default-5444994796-6sw49 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 02 10:41:00 crc kubenswrapper[4901]: [-]has-synced failed: reason withheld Feb 02 10:41:00 crc kubenswrapper[4901]: [+]process-running ok Feb 02 10:41:00 crc kubenswrapper[4901]: healthz check failed Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.087601 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6sw49" podUID="d447069d-acf2-4316-ab5b-2d2692e8f1e6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.176942 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:41:00 crc kubenswrapper[4901]: E0202 10:41:00.177442 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:41:00.677417854 +0000 UTC m=+147.695757950 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.278514 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:41:00 crc kubenswrapper[4901]: E0202 10:41:00.278741 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:41:00.778703206 +0000 UTC m=+147.797043322 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.279163 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:41:00 crc kubenswrapper[4901]: E0202 10:41:00.279455 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:41:00.779442614 +0000 UTC m=+147.797782710 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.379856 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:41:00 crc kubenswrapper[4901]: E0202 10:41:00.380029 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:41:00.879984638 +0000 UTC m=+147.898324734 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.380250 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:41:00 crc kubenswrapper[4901]: E0202 10:41:00.380754 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:41:00.880742686 +0000 UTC m=+147.899082772 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.482091 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:41:00 crc kubenswrapper[4901]: E0202 10:41:00.482332 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:41:00.982289374 +0000 UTC m=+148.000629470 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.584417 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:41:00 crc kubenswrapper[4901]: E0202 10:41:00.584948 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:41:01.084926448 +0000 UTC m=+148.103266544 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.598425 4901 csr.go:261] certificate signing request csr-2zdkr is approved, waiting to be issued Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.609410 4901 csr.go:257] certificate signing request csr-2zdkr is issued Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.686461 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:41:00 crc kubenswrapper[4901]: E0202 10:41:00.686835 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:41:01.186797364 +0000 UTC m=+148.205137620 (durationBeforeRetry 500ms). 
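[Editor's note] The two csr.go lines just above record the kubelet client-certificate flow in its two phases: csr-2zdkr first carries an Approved condition while status.certificate is still empty ("approved, waiting to be issued"), then the signer fills in the certificate ("issued") and the kubelet can rotate onto it. A sketch of that state check against the certificates/v1 API; it assumes k8s.io/api is available on the module path and is illustrative, not the kubelet's own code:

    package main

    import (
        "fmt"

        certv1 "k8s.io/api/certificates/v1"
    )

    // phase reports where a CSR sits in the approve-then-issue flow that the
    // csr.go log lines above trace for csr-2zdkr.
    func phase(csr *certv1.CertificateSigningRequest) string {
        approved := false
        for _, c := range csr.Status.Conditions {
            if c.Type == certv1.CertificateApproved {
                approved = true
            }
        }
        switch {
        case !approved:
            return "pending approval"
        case len(csr.Status.Certificate) == 0:
            return "approved, waiting to be issued"
        default:
            return "issued"
        }
    }

    func main() {
        csr := &certv1.CertificateSigningRequest{}
        csr.Status.Conditions = []certv1.CertificateSigningRequestCondition{
            {Type: certv1.CertificateApproved},
        }
        fmt.Println(phase(csr)) // approved, waiting to be issued
    }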
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.767645 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wjt2l" event={"ID":"118f2636-94f9-40a7-90e7-d48df737a551","Type":"ContainerStarted","Data":"b01a6f818c25a96a48f7d91c95f02d31f89c5a46644ab0159017671becc77818"} Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.767875 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wjt2l" Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.769768 4901 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-wjt2l container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.769833 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wjt2l" podUID="118f2636-94f9-40a7-90e7-d48df737a551" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.771137 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-79htn" event={"ID":"d5524bd4-86b0-4d07-ae14-7d7fa7058955","Type":"ContainerStarted","Data":"1f04d67007f40d7f1360ec7eb0b0a54a118e4b9640e20f1405d66d580b9f62a9"} Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.773072 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-46wx7" event={"ID":"8a24390c-720d-4e6b-b3d7-a12eab3d72a6","Type":"ContainerStarted","Data":"2b239683c821bfe191569f242fe16d844f67e4ac9864792d6f6ba4778e203ae7"} Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.775510 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-krwxc" event={"ID":"8a686a95-f3bb-4edb-aae8-86995516c3ff","Type":"ContainerStarted","Data":"e97445f4a7e4cd2e057ff380d3c304b5348416a1fe6d3cfdec5b5bbbb56b0793"} Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.776884 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8xlmd" event={"ID":"929d7edd-8851-427e-a1bd-a8ddd6817e70","Type":"ContainerStarted","Data":"c1c5f15f6150977ab28a7912aa182a2ad9db45a7b20056c6b663c87328be0690"} Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.777259 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8xlmd" Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.778499 4901 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-8xlmd container/packageserver 
namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:5443/healthz\": dial tcp 10.217.0.37:5443: connect: connection refused" start-of-body= Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.778542 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8xlmd" podUID="929d7edd-8851-427e-a1bd-a8ddd6817e70" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.37:5443/healthz\": dial tcp 10.217.0.37:5443: connect: connection refused" Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.781749 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5pfqc" event={"ID":"a4ab49c1-659f-466e-b480-a641b964ab2a","Type":"ContainerStarted","Data":"2ebae78a07fc12ef431ac3d3c0804199a861e5c3941a41120600da83cb66683e"} Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.781954 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5pfqc" Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.783456 4901 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-5pfqc container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body= Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.783498 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5pfqc" podUID="a4ab49c1-659f-466e-b480-a641b964ab2a" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.786920 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-f4k9c" event={"ID":"3b78e424-eccd-4efa-9b7c-b59ca43bef39","Type":"ContainerStarted","Data":"af0e646e97e1c995f8be406cec9b8943dac4c8938e3b97c8b2e6273937fbbc9f"} Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.787870 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:41:00 crc kubenswrapper[4901]: E0202 10:41:00.788252 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:41:01.288239229 +0000 UTC m=+148.306579325 (durationBeforeRetry 500ms). 
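[Editor's note] Given how much of this window is the same two volume records retrying on a 500ms cadence, it helps to tally failures per operation and volume before reading further; the count collapses to a single pvc/driver pair. A small filter over the journal text, fed by something like journalctl -u kubelet on stdin; the program and regex are illustrative tooling, not part of the log:

    package main

    import (
        "bufio"
        "fmt"
        "os"
        "regexp"
    )

    func main() {
        // Matches both MountVolume.MountDevice and UnmountVolume.TearDown
        // failures and captures the volume name, as in the entries above.
        re := regexp.MustCompile(`Error: (\S+) failed for volume "([^"]+)"`)
        counts := map[string]int{}

        sc := bufio.NewScanner(os.Stdin)
        // Journal lines here run long; enlarge the scanner buffer.
        sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024)
        for sc.Scan() {
            if m := re.FindStringSubmatch(sc.Text()); m != nil {
                counts[m[1]+" "+m[2]]++
            }
        }
        for k, v := range counts {
            fmt.Println(v, k)
        }
    }

For this excerpt the output is two lines, MountVolume.MountDevice and UnmountVolume.TearDown against pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8, confirming that everything else in the window is ordinary startup progress.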
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.791499 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qsh9f" event={"ID":"19791fe1-25f3-422d-9585-557e8d5a554c","Type":"ContainerStarted","Data":"7d1361b60e98c325cb36a9a58ce32eaca005044e8ba7bf87eca90dfda11e0b55"} Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.793451 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-7nhqx" event={"ID":"00cc40f1-d5d4-4fc0-ad0f-b7c6130af2f6","Type":"ContainerStarted","Data":"287c55e8e85b504ef100ba0e02ee99343c09b01c27d88835ea8b7e83afd516c8"} Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.795138 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-z5bfv" event={"ID":"bc3c6078-4cbf-4ace-a91c-f48a6910f7c8","Type":"ContainerStarted","Data":"6fe12333a75e8e9ccc23c762d360dee4c20e7c5367e46f9088b32d00175aca4a"} Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.795162 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-z5bfv" event={"ID":"bc3c6078-4cbf-4ace-a91c-f48a6910f7c8","Type":"ContainerStarted","Data":"3b0b7450845461efc0da98f8d23371f79952dd157d7cede9eb5ae368f6a4ab87"} Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.797130 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nv578" event={"ID":"a8271eaa-0795-430a-b3b3-bc8b3ffb53b7","Type":"ContainerStarted","Data":"5ee0bbd09d3409df03efbd2af93ae68778b4a643377562c69f94da6834a5932d"} Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.799047 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bmwhx" event={"ID":"ff29d0d0-c1c4-4eb4-bf70-6210af819cb4","Type":"ContainerStarted","Data":"29a4aaefcc3de3cfcd1c3b076c70ab16f0d694b54e222c15d5c2decf64bc61cb"} Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.799074 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bmwhx" event={"ID":"ff29d0d0-c1c4-4eb4-bf70-6210af819cb4","Type":"ContainerStarted","Data":"112858f1d0cef6a92b18357e38d68d353c4ea48e08dc30327fd79b93c83ec346"} Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.799455 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bmwhx" Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.800922 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wkpqq" event={"ID":"1db2f103-d3aa-45f5-acd6-d70543968d36","Type":"ContainerStarted","Data":"33974ee79ce75d215ca5b2c5e3552197efaa4e8a2ac3d871fabea5b92460a55c"} Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.800948 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wkpqq" event={"ID":"1db2f103-d3aa-45f5-acd6-d70543968d36","Type":"ContainerStarted","Data":"468d151528eb056fe98d8dbf1b791cb80e8045e30dcc4cdd73bbe4b1bd7c77a5"} Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.802837 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-85dm9" event={"ID":"9cc14a9d-2037-4a08-8640-ebedde61adfa","Type":"ContainerStarted","Data":"9292a75bca3b46007e1e158bd26ca4a8f061eea48508b1f57d1f4503931a96ab"} Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.805940 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-klpkc" event={"ID":"81e78715-5b78-44d5-b225-df11f642c082","Type":"ContainerStarted","Data":"734cc9ebde65268b8755cb018d1d9e94724973d8bf7002cba56f160a64fd9e93"} Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.808462 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zt89j" event={"ID":"ef3abb69-4ee9-487f-a1d3-197fe16c5fb0","Type":"ContainerStarted","Data":"b6ff8141877dbcf4314626af0a8c074d96a2e39b55cecdeb2b485ffba3f28b00"} Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.811822 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-65x22" event={"ID":"bc898e00-9c24-4c1e-974b-cfc7d2b48bc7","Type":"ContainerStarted","Data":"eb78354c09ca7385937aa6ecbd1fdfa7579ef2992598a8c98d9fffeb2d1deca2"} Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.811869 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-65x22" event={"ID":"bc898e00-9c24-4c1e-974b-cfc7d2b48bc7","Type":"ContainerStarted","Data":"916d47fb13009ef85cfa2b745d06588397375a544429030c36cf7efc3b623c77"} Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.812482 4901 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-xnv4q container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.812525 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-xnv4q" podUID="4697c668-acff-4c8d-b562-e6491a9cbdd0" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.812999 4901 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-tsxn7 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" start-of-body= Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.813060 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" podUID="93a1ed7b-a791-4fb9-b02b-8280b107789a" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.821151 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpwqj" Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.836242 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-z5bfv" podStartSLOduration=125.836218562 podStartE2EDuration="2m5.836218562s" podCreationTimestamp="2026-02-02 10:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:41:00.833550158 +0000 UTC m=+147.851890254" watchObservedRunningTime="2026-02-02 10:41:00.836218562 +0000 UTC m=+147.854558658" Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.836868 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wjt2l" podStartSLOduration=125.836859587 podStartE2EDuration="2m5.836859587s" podCreationTimestamp="2026-02-02 10:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:41:00.807019683 +0000 UTC m=+147.825359779" watchObservedRunningTime="2026-02-02 10:41:00.836859587 +0000 UTC m=+147.855199683" Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.873441 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bmwhx" podStartSLOduration=125.87342012 podStartE2EDuration="2m5.87342012s" podCreationTimestamp="2026-02-02 10:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:41:00.873328478 +0000 UTC m=+147.891668574" watchObservedRunningTime="2026-02-02 10:41:00.87342012 +0000 UTC m=+147.891760216" Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.890026 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:41:00 crc kubenswrapper[4901]: E0202 10:41:00.892536 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:41:01.39245458 +0000 UTC m=+148.410794696 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.963840 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-krwxc" podStartSLOduration=125.963812385 podStartE2EDuration="2m5.963812385s" podCreationTimestamp="2026-02-02 10:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:41:00.958144391 +0000 UTC m=+147.976484487" watchObservedRunningTime="2026-02-02 10:41:00.963812385 +0000 UTC m=+147.982152481" Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.963953 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wkpqq" podStartSLOduration=125.963949359 podStartE2EDuration="2m5.963949359s" podCreationTimestamp="2026-02-02 10:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:41:00.904556065 +0000 UTC m=+147.922896161" watchObservedRunningTime="2026-02-02 10:41:00.963949359 +0000 UTC m=+147.982289455" Feb 02 10:41:00 crc kubenswrapper[4901]: I0202 10:41:00.993632 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:41:00 crc kubenswrapper[4901]: E0202 10:41:00.995521 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:41:01.495498923 +0000 UTC m=+148.513839019 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:41:01 crc kubenswrapper[4901]: I0202 10:41:01.035421 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-klpkc" podStartSLOduration=126.035381906 podStartE2EDuration="2m6.035381906s" podCreationTimestamp="2026-02-02 10:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:41:00.992492232 +0000 UTC m=+148.010832328" watchObservedRunningTime="2026-02-02 10:41:01.035381906 +0000 UTC m=+148.053722002" Feb 02 10:41:01 crc kubenswrapper[4901]: I0202 10:41:01.036143 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-85dm9" podStartSLOduration=126.036139143 podStartE2EDuration="2m6.036139143s" podCreationTimestamp="2026-02-02 10:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:41:01.032777004 +0000 UTC m=+148.051117100" watchObservedRunningTime="2026-02-02 10:41:01.036139143 +0000 UTC m=+148.054479239" Feb 02 10:41:01 crc kubenswrapper[4901]: I0202 10:41:01.059020 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5pfqc" podStartSLOduration=126.058995022 podStartE2EDuration="2m6.058995022s" podCreationTimestamp="2026-02-02 10:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:41:01.058183754 +0000 UTC m=+148.076523850" watchObservedRunningTime="2026-02-02 10:41:01.058995022 +0000 UTC m=+148.077335118" Feb 02 10:41:01 crc kubenswrapper[4901]: I0202 10:41:01.077270 4901 patch_prober.go:28] interesting pod/router-default-5444994796-6sw49 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 02 10:41:01 crc kubenswrapper[4901]: [-]has-synced failed: reason withheld Feb 02 10:41:01 crc kubenswrapper[4901]: [+]process-running ok Feb 02 10:41:01 crc kubenswrapper[4901]: healthz check failed Feb 02 10:41:01 crc kubenswrapper[4901]: I0202 10:41:01.077375 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6sw49" podUID="d447069d-acf2-4316-ab5b-2d2692e8f1e6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 10:41:01 crc kubenswrapper[4901]: I0202 10:41:01.094740 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:41:01 crc kubenswrapper[4901]: E0202 
10:41:01.095156 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:41:01.595137996 +0000 UTC m=+148.613478092 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:41:01 crc kubenswrapper[4901]: I0202 10:41:01.107783 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-46wx7" podStartSLOduration=126.107760254 podStartE2EDuration="2m6.107760254s" podCreationTimestamp="2026-02-02 10:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:41:01.08472006 +0000 UTC m=+148.103060156" watchObservedRunningTime="2026-02-02 10:41:01.107760254 +0000 UTC m=+148.126100340" Feb 02 10:41:01 crc kubenswrapper[4901]: I0202 10:41:01.151599 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qsh9f" podStartSLOduration=126.15157462 podStartE2EDuration="2m6.15157462s" podCreationTimestamp="2026-02-02 10:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:41:01.150156136 +0000 UTC m=+148.168496232" watchObservedRunningTime="2026-02-02 10:41:01.15157462 +0000 UTC m=+148.169914716" Feb 02 10:41:01 crc kubenswrapper[4901]: I0202 10:41:01.153094 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-79htn" podStartSLOduration=126.153088045 podStartE2EDuration="2m6.153088045s" podCreationTimestamp="2026-02-02 10:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:41:01.108773378 +0000 UTC m=+148.127113484" watchObservedRunningTime="2026-02-02 10:41:01.153088045 +0000 UTC m=+148.171428141" Feb 02 10:41:01 crc kubenswrapper[4901]: I0202 10:41:01.174265 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8xlmd" podStartSLOduration=126.174243225 podStartE2EDuration="2m6.174243225s" podCreationTimestamp="2026-02-02 10:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:41:01.170716841 +0000 UTC m=+148.189056937" watchObservedRunningTime="2026-02-02 10:41:01.174243225 +0000 UTC m=+148.192583321" Feb 02 10:41:01 crc kubenswrapper[4901]: I0202 10:41:01.197463 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: 
\"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:41:01 crc kubenswrapper[4901]: E0202 10:41:01.197856 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:41:01.697840672 +0000 UTC m=+148.716180768 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:41:01 crc kubenswrapper[4901]: I0202 10:41:01.200499 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-nv578" podStartSLOduration=127.200484335 podStartE2EDuration="2m7.200484335s" podCreationTimestamp="2026-02-02 10:38:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:41:01.192435164 +0000 UTC m=+148.210775260" watchObservedRunningTime="2026-02-02 10:41:01.200484335 +0000 UTC m=+148.218824431" Feb 02 10:41:01 crc kubenswrapper[4901]: I0202 10:41:01.214608 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-7nhqx" podStartSLOduration=126.214590208 podStartE2EDuration="2m6.214590208s" podCreationTimestamp="2026-02-02 10:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:41:01.210402879 +0000 UTC m=+148.228742975" watchObservedRunningTime="2026-02-02 10:41:01.214590208 +0000 UTC m=+148.232930304" Feb 02 10:41:01 crc kubenswrapper[4901]: I0202 10:41:01.239914 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-zt89j" podStartSLOduration=126.239892015 podStartE2EDuration="2m6.239892015s" podCreationTimestamp="2026-02-02 10:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:41:01.2367484 +0000 UTC m=+148.255088506" watchObservedRunningTime="2026-02-02 10:41:01.239892015 +0000 UTC m=+148.258232111" Feb 02 10:41:01 crc kubenswrapper[4901]: I0202 10:41:01.298224 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:41:01 crc kubenswrapper[4901]: E0202 10:41:01.298435 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:41:01.798372675 +0000 UTC m=+148.816712771 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:41:01 crc kubenswrapper[4901]: I0202 10:41:01.298868 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:41:01 crc kubenswrapper[4901]: E0202 10:41:01.299474 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:41:01.799464471 +0000 UTC m=+148.817804567 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:41:01 crc kubenswrapper[4901]: I0202 10:41:01.328591 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-65x22" podStartSLOduration=9.328556758 podStartE2EDuration="9.328556758s" podCreationTimestamp="2026-02-02 10:40:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:41:01.327989576 +0000 UTC m=+148.346329672" watchObservedRunningTime="2026-02-02 10:41:01.328556758 +0000 UTC m=+148.346896854" Feb 02 10:41:01 crc kubenswrapper[4901]: I0202 10:41:01.400159 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:41:01 crc kubenswrapper[4901]: E0202 10:41:01.400636 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:41:01.900597999 +0000 UTC m=+148.918938265 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:41:01 crc kubenswrapper[4901]: I0202 10:41:01.502326 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:41:01 crc kubenswrapper[4901]: E0202 10:41:01.502927 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:41:02.002904446 +0000 UTC m=+149.021244542 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:41:01 crc kubenswrapper[4901]: I0202 10:41:01.603990 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:41:01 crc kubenswrapper[4901]: E0202 10:41:01.604096 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:41:02.104076095 +0000 UTC m=+149.122416191 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:41:01 crc kubenswrapper[4901]: I0202 10:41:01.604370 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:41:01 crc kubenswrapper[4901]: E0202 10:41:01.604681 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:41:02.104673499 +0000 UTC m=+149.123013595 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:41:01 crc kubenswrapper[4901]: I0202 10:41:01.611116 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-02 10:36:00 +0000 UTC, rotation deadline is 2026-12-19 10:31:34.380881086 +0000 UTC Feb 02 10:41:01 crc kubenswrapper[4901]: I0202 10:41:01.611155 4901 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7679h50m32.769728834s for next certificate rotation Feb 02 10:41:01 crc kubenswrapper[4901]: I0202 10:41:01.705485 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:41:01 crc kubenswrapper[4901]: I0202 10:41:01.705675 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:41:01 crc kubenswrapper[4901]: I0202 10:41:01.705711 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:41:01 crc kubenswrapper[4901]: I0202 10:41:01.705772 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:41:01 crc kubenswrapper[4901]: I0202 10:41:01.705795 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:41:01 crc kubenswrapper[4901]: E0202 10:41:01.706556 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:41:02.206526374 +0000 UTC m=+149.224866470 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:41:01 crc kubenswrapper[4901]: I0202 10:41:01.712586 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:41:01 crc kubenswrapper[4901]: I0202 10:41:01.713674 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:41:01 crc kubenswrapper[4901]: I0202 10:41:01.719047 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:41:01 crc kubenswrapper[4901]: I0202 10:41:01.734723 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:41:01 crc kubenswrapper[4901]: I0202 10:41:01.807579 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:41:01 crc kubenswrapper[4901]: E0202 10:41:01.808173 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:41:02.308153105 +0000 UTC m=+149.326493221 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:41:01 crc kubenswrapper[4901]: I0202 10:41:01.817027 4901 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-xnv4q container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Feb 02 10:41:01 crc kubenswrapper[4901]: I0202 10:41:01.817097 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-xnv4q" podUID="4697c668-acff-4c8d-b562-e6491a9cbdd0" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Feb 02 10:41:01 crc kubenswrapper[4901]: I0202 10:41:01.817554 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-65x22" Feb 02 10:41:01 crc kubenswrapper[4901]: I0202 10:41:01.839689 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2pdrw" Feb 02 10:41:01 crc kubenswrapper[4901]: I0202 10:41:01.841144 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wjt2l" Feb 02 10:41:01 crc kubenswrapper[4901]: I0202 10:41:01.848512 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5pfqc" Feb 02 10:41:01 crc kubenswrapper[4901]: I0202 10:41:01.909454 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:41:01 crc kubenswrapper[4901]: E0202 10:41:01.909596 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:41:02.409552219 +0000 UTC m=+149.427892315 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:41:01 crc kubenswrapper[4901]: I0202 10:41:01.910125 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:41:01 crc kubenswrapper[4901]: E0202 10:41:01.921307 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:41:02.421287026 +0000 UTC m=+149.439627122 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:41:01 crc kubenswrapper[4901]: I0202 10:41:01.994949 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:41:02 crc kubenswrapper[4901]: I0202 10:41:02.002279 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:41:02 crc kubenswrapper[4901]: I0202 10:41:02.016078 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:41:02 crc kubenswrapper[4901]: I0202 10:41:02.016746 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:41:02 crc kubenswrapper[4901]: E0202 10:41:02.017358 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:41:02.517331264 +0000 UTC m=+149.535671360 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:41:02 crc kubenswrapper[4901]: I0202 10:41:02.075717 4901 patch_prober.go:28] interesting pod/router-default-5444994796-6sw49 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 02 10:41:02 crc kubenswrapper[4901]: [-]has-synced failed: reason withheld Feb 02 10:41:02 crc kubenswrapper[4901]: [+]process-running ok Feb 02 10:41:02 crc kubenswrapper[4901]: healthz check failed Feb 02 10:41:02 crc kubenswrapper[4901]: I0202 10:41:02.075786 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6sw49" podUID="d447069d-acf2-4316-ab5b-2d2692e8f1e6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 10:41:02 crc kubenswrapper[4901]: I0202 10:41:02.123478 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:41:02 crc kubenswrapper[4901]: E0202 10:41:02.123836 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:41:02.623821719 +0000 UTC m=+149.642161815 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:41:02 crc kubenswrapper[4901]: I0202 10:41:02.224647 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:41:02 crc kubenswrapper[4901]: E0202 10:41:02.225257 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:41:02.725228163 +0000 UTC m=+149.743568259 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:41:02 crc kubenswrapper[4901]: I0202 10:41:02.225518 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:41:02 crc kubenswrapper[4901]: E0202 10:41:02.225984 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:41:02.725965231 +0000 UTC m=+149.744305327 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:41:02 crc kubenswrapper[4901]: I0202 10:41:02.326709 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:41:02 crc kubenswrapper[4901]: E0202 10:41:02.326906 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:41:02.826875334 +0000 UTC m=+149.845215430 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:41:02 crc kubenswrapper[4901]: I0202 10:41:02.326961 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:41:02 crc kubenswrapper[4901]: E0202 10:41:02.327465 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:41:02.827457217 +0000 UTC m=+149.845797313 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:41:02 crc kubenswrapper[4901]: I0202 10:41:02.435888 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:41:02 crc kubenswrapper[4901]: E0202 10:41:02.436073 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:41:02.936045642 +0000 UTC m=+149.954385738 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:41:02 crc kubenswrapper[4901]: I0202 10:41:02.436217 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:41:02 crc kubenswrapper[4901]: E0202 10:41:02.436577 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:41:02.936550093 +0000 UTC m=+149.954890189 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:41:02 crc kubenswrapper[4901]: I0202 10:41:02.537315 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:41:02 crc kubenswrapper[4901]: E0202 10:41:02.537644 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:41:03.037610311 +0000 UTC m=+150.055950407 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:41:02 crc kubenswrapper[4901]: I0202 10:41:02.538170 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq"
Feb 02 10:41:02 crc kubenswrapper[4901]: E0202 10:41:02.538536 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:41:03.038527792 +0000 UTC m=+150.056867888 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:41:02 crc kubenswrapper[4901]: I0202 10:41:02.639214 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 10:41:02 crc kubenswrapper[4901]: E0202 10:41:02.639370 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:41:03.139346753 +0000 UTC m=+150.157686849 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:41:02 crc kubenswrapper[4901]: I0202 10:41:02.639525 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq"
Feb 02 10:41:02 crc kubenswrapper[4901]: E0202 10:41:02.640012 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:41:03.140002809 +0000 UTC m=+150.158342895 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:41:02 crc kubenswrapper[4901]: I0202 10:41:02.740531 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 10:41:02 crc kubenswrapper[4901]: E0202 10:41:02.740899 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:41:03.240867271 +0000 UTC m=+150.259207367 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:41:02 crc kubenswrapper[4901]: I0202 10:41:02.741092 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq"
Feb 02 10:41:02 crc kubenswrapper[4901]: E0202 10:41:02.741494 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:41:03.241487175 +0000 UTC m=+150.259827271 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:41:02 crc kubenswrapper[4901]: I0202 10:41:02.804387 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8xlmd"
Feb 02 10:41:02 crc kubenswrapper[4901]: I0202 10:41:02.841940 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 10:41:02 crc kubenswrapper[4901]: E0202 10:41:02.842231 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:41:03.342212524 +0000 UTC m=+150.360552620 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:41:02 crc kubenswrapper[4901]: I0202 10:41:02.842918 4901 generic.go:334] "Generic (PLEG): container finished" podID="8a24390c-720d-4e6b-b3d7-a12eab3d72a6" containerID="2b239683c821bfe191569f242fe16d844f67e4ac9864792d6f6ba4778e203ae7" exitCode=0
Feb 02 10:41:02 crc kubenswrapper[4901]: I0202 10:41:02.843007 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-46wx7" event={"ID":"8a24390c-720d-4e6b-b3d7-a12eab3d72a6","Type":"ContainerDied","Data":"2b239683c821bfe191569f242fe16d844f67e4ac9864792d6f6ba4778e203ae7"}
Feb 02 10:41:02 crc kubenswrapper[4901]: I0202 10:41:02.862692 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-f4k9c" event={"ID":"3b78e424-eccd-4efa-9b7c-b59ca43bef39","Type":"ContainerStarted","Data":"f8957b0b88a3304cfb9cf6e5e0c26c3dce4a9f0bccfc3df544a1acc651558b35"}
Feb 02 10:41:02 crc kubenswrapper[4901]: I0202 10:41:02.928538 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b9p28"]
Feb 02 10:41:02 crc kubenswrapper[4901]: I0202 10:41:02.930226 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b9p28"
Feb 02 10:41:02 crc kubenswrapper[4901]: I0202 10:41:02.933728 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 02 10:41:02 crc kubenswrapper[4901]: I0202 10:41:02.945311 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq"
Feb 02 10:41:02 crc kubenswrapper[4901]: I0202 10:41:02.946295 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b9p28"]
Feb 02 10:41:02 crc kubenswrapper[4901]: E0202 10:41:02.947648 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:41:03.447633983 +0000 UTC m=+150.465974079 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.046512 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.046802 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56wgf\" (UniqueName: \"kubernetes.io/projected/da3848e5-a20f-4124-856b-d860bea45325-kube-api-access-56wgf\") pod \"community-operators-b9p28\" (UID: \"da3848e5-a20f-4124-856b-d860bea45325\") " pod="openshift-marketplace/community-operators-b9p28"
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.046838 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da3848e5-a20f-4124-856b-d860bea45325-catalog-content\") pod \"community-operators-b9p28\" (UID: \"da3848e5-a20f-4124-856b-d860bea45325\") " pod="openshift-marketplace/community-operators-b9p28"
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.046931 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da3848e5-a20f-4124-856b-d860bea45325-utilities\") pod \"community-operators-b9p28\" (UID: \"da3848e5-a20f-4124-856b-d860bea45325\") " pod="openshift-marketplace/community-operators-b9p28"
Feb 02 10:41:03 crc kubenswrapper[4901]: E0202 10:41:03.047088 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:41:03.547066921 +0000 UTC m=+150.565407017 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.079096 4901 patch_prober.go:28] interesting pod/router-default-5444994796-6sw49 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 02 10:41:03 crc kubenswrapper[4901]: [-]has-synced failed: reason withheld
Feb 02 10:41:03 crc kubenswrapper[4901]: [+]process-running ok
Feb 02 10:41:03 crc kubenswrapper[4901]: healthz check failed
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.079154 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6sw49" podUID="d447069d-acf2-4316-ab5b-2d2692e8f1e6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.119418 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-79vj8"]
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.120499 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-79vj8"
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.129046 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.133067 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-79vj8"]
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.148014 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da3848e5-a20f-4124-856b-d860bea45325-utilities\") pod \"community-operators-b9p28\" (UID: \"da3848e5-a20f-4124-856b-d860bea45325\") " pod="openshift-marketplace/community-operators-b9p28"
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.148072 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56wgf\" (UniqueName: \"kubernetes.io/projected/da3848e5-a20f-4124-856b-d860bea45325-kube-api-access-56wgf\") pod \"community-operators-b9p28\" (UID: \"da3848e5-a20f-4124-856b-d860bea45325\") " pod="openshift-marketplace/community-operators-b9p28"
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.148093 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da3848e5-a20f-4124-856b-d860bea45325-catalog-content\") pod \"community-operators-b9p28\" (UID: \"da3848e5-a20f-4124-856b-d860bea45325\") " pod="openshift-marketplace/community-operators-b9p28"
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.148130 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq"
Feb 02 10:41:03 crc kubenswrapper[4901]: E0202 10:41:03.148398 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:41:03.648385344 +0000 UTC m=+150.666725430 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.148518 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da3848e5-a20f-4124-856b-d860bea45325-utilities\") pod \"community-operators-b9p28\" (UID: \"da3848e5-a20f-4124-856b-d860bea45325\") " pod="openshift-marketplace/community-operators-b9p28"
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.149159 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da3848e5-a20f-4124-856b-d860bea45325-catalog-content\") pod \"community-operators-b9p28\" (UID: \"da3848e5-a20f-4124-856b-d860bea45325\") " pod="openshift-marketplace/community-operators-b9p28"
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.171238 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56wgf\" (UniqueName: \"kubernetes.io/projected/da3848e5-a20f-4124-856b-d860bea45325-kube-api-access-56wgf\") pod \"community-operators-b9p28\" (UID: \"da3848e5-a20f-4124-856b-d860bea45325\") " pod="openshift-marketplace/community-operators-b9p28"
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.239104 4901 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.249833 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.250107 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebacceb9-418b-4af4-9511-007595694dc2-utilities\") pod \"certified-operators-79vj8\" (UID: \"ebacceb9-418b-4af4-9511-007595694dc2\") " pod="openshift-marketplace/certified-operators-79vj8"
Feb 02 10:41:03 crc kubenswrapper[4901]: E0202 10:41:03.250152 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:41:03.750108706 +0000 UTC m=+150.768448802 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.250209 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcrt8\" (UniqueName: \"kubernetes.io/projected/ebacceb9-418b-4af4-9511-007595694dc2-kube-api-access-tcrt8\") pod \"certified-operators-79vj8\" (UID: \"ebacceb9-418b-4af4-9511-007595694dc2\") " pod="openshift-marketplace/certified-operators-79vj8"
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.250295 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebacceb9-418b-4af4-9511-007595694dc2-catalog-content\") pod \"certified-operators-79vj8\" (UID: \"ebacceb9-418b-4af4-9511-007595694dc2\") " pod="openshift-marketplace/certified-operators-79vj8"
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.250410 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq"
Feb 02 10:41:03 crc kubenswrapper[4901]: E0202 10:41:03.250850 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:41:03.750831423 +0000 UTC m=+150.769171519 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.276594 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b9p28"
Feb 02 10:41:03 crc kubenswrapper[4901]: W0202 10:41:03.281557 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-df0129dc2827805c11f844aa7f3ec260012d4d7d574e9e1542a466701cad2555 WatchSource:0}: Error finding container df0129dc2827805c11f844aa7f3ec260012d4d7d574e9e1542a466701cad2555: Status 404 returned error can't find the container with id df0129dc2827805c11f844aa7f3ec260012d4d7d574e9e1542a466701cad2555
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.321400 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-825gs"]
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.322946 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-825gs"
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.344161 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-825gs"]
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.353612 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.354242 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebacceb9-418b-4af4-9511-007595694dc2-utilities\") pod \"certified-operators-79vj8\" (UID: \"ebacceb9-418b-4af4-9511-007595694dc2\") " pod="openshift-marketplace/certified-operators-79vj8"
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.354306 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcrt8\" (UniqueName: \"kubernetes.io/projected/ebacceb9-418b-4af4-9511-007595694dc2-kube-api-access-tcrt8\") pod \"certified-operators-79vj8\" (UID: \"ebacceb9-418b-4af4-9511-007595694dc2\") " pod="openshift-marketplace/certified-operators-79vj8"
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.354350 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebacceb9-418b-4af4-9511-007595694dc2-catalog-content\") pod \"certified-operators-79vj8\" (UID: \"ebacceb9-418b-4af4-9511-007595694dc2\") " pod="openshift-marketplace/certified-operators-79vj8"
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.354918 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebacceb9-418b-4af4-9511-007595694dc2-catalog-content\") pod \"certified-operators-79vj8\" (UID: \"ebacceb9-418b-4af4-9511-007595694dc2\") " pod="openshift-marketplace/certified-operators-79vj8"
Feb 02 10:41:03 crc kubenswrapper[4901]: E0202 10:41:03.355032 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:41:03.855008023 +0000 UTC m=+150.873348119 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.355292 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebacceb9-418b-4af4-9511-007595694dc2-utilities\") pod \"certified-operators-79vj8\" (UID: \"ebacceb9-418b-4af4-9511-007595694dc2\") " pod="openshift-marketplace/certified-operators-79vj8"
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.382304 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcrt8\" (UniqueName: \"kubernetes.io/projected/ebacceb9-418b-4af4-9511-007595694dc2-kube-api-access-tcrt8\") pod \"certified-operators-79vj8\" (UID: \"ebacceb9-418b-4af4-9511-007595694dc2\") " pod="openshift-marketplace/certified-operators-79vj8"
Feb 02 10:41:03 crc kubenswrapper[4901]: W0202 10:41:03.418483 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-a771f9fb6d6ccbd3e78d296baf45748195f1db67beecfca77b97a21275dfa03b WatchSource:0}: Error finding container a771f9fb6d6ccbd3e78d296baf45748195f1db67beecfca77b97a21275dfa03b: Status 404 returned error can't find the container with id a771f9fb6d6ccbd3e78d296baf45748195f1db67beecfca77b97a21275dfa03b
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.450942 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-79vj8"
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.455933 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq"
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.456000 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wctjb\" (UniqueName: \"kubernetes.io/projected/2327a290-e69a-4a8b-b3af-2b1f02819202-kube-api-access-wctjb\") pod \"community-operators-825gs\" (UID: \"2327a290-e69a-4a8b-b3af-2b1f02819202\") " pod="openshift-marketplace/community-operators-825gs"
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.456026 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2327a290-e69a-4a8b-b3af-2b1f02819202-catalog-content\") pod \"community-operators-825gs\" (UID: \"2327a290-e69a-4a8b-b3af-2b1f02819202\") " pod="openshift-marketplace/community-operators-825gs"
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.456078 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2327a290-e69a-4a8b-b3af-2b1f02819202-utilities\") pod \"community-operators-825gs\" (UID: \"2327a290-e69a-4a8b-b3af-2b1f02819202\") " pod="openshift-marketplace/community-operators-825gs"
Feb 02 10:41:03 crc kubenswrapper[4901]: E0202 10:41:03.456348 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:41:03.956326206 +0000 UTC m=+150.974666482 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.514462 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s2t7g"]
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.515496 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s2t7g"
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.523929 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b9p28"]
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.530826 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s2t7g"]
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.557259 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 10:41:03 crc kubenswrapper[4901]: E0202 10:41:03.557435 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:41:04.057392913 +0000 UTC m=+151.075733009 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.557477 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wctjb\" (UniqueName: \"kubernetes.io/projected/2327a290-e69a-4a8b-b3af-2b1f02819202-kube-api-access-wctjb\") pod \"community-operators-825gs\" (UID: \"2327a290-e69a-4a8b-b3af-2b1f02819202\") " pod="openshift-marketplace/community-operators-825gs"
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.557540 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2327a290-e69a-4a8b-b3af-2b1f02819202-catalog-content\") pod \"community-operators-825gs\" (UID: \"2327a290-e69a-4a8b-b3af-2b1f02819202\") " pod="openshift-marketplace/community-operators-825gs"
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.557677 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2327a290-e69a-4a8b-b3af-2b1f02819202-utilities\") pod \"community-operators-825gs\" (UID: \"2327a290-e69a-4a8b-b3af-2b1f02819202\") " pod="openshift-marketplace/community-operators-825gs"
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.557777 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq"
Feb 02 10:41:03 crc kubenswrapper[4901]: E0202 10:41:03.558079 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:41:04.058072218 +0000 UTC m=+151.076412314 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.560477 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2327a290-e69a-4a8b-b3af-2b1f02819202-catalog-content\") pod \"community-operators-825gs\" (UID: \"2327a290-e69a-4a8b-b3af-2b1f02819202\") " pod="openshift-marketplace/community-operators-825gs"
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.560936 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2327a290-e69a-4a8b-b3af-2b1f02819202-utilities\") pod \"community-operators-825gs\" (UID: \"2327a290-e69a-4a8b-b3af-2b1f02819202\") " pod="openshift-marketplace/community-operators-825gs"
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.585040 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wctjb\" (UniqueName: \"kubernetes.io/projected/2327a290-e69a-4a8b-b3af-2b1f02819202-kube-api-access-wctjb\") pod \"community-operators-825gs\" (UID: \"2327a290-e69a-4a8b-b3af-2b1f02819202\") " pod="openshift-marketplace/community-operators-825gs"
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.642574 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-825gs"
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.659198 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.659542 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81767c78-c6a0-4a68-ab07-98eeaf3e9483-catalog-content\") pod \"certified-operators-s2t7g\" (UID: \"81767c78-c6a0-4a68-ab07-98eeaf3e9483\") " pod="openshift-marketplace/certified-operators-s2t7g"
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.659622 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81767c78-c6a0-4a68-ab07-98eeaf3e9483-utilities\") pod \"certified-operators-s2t7g\" (UID: \"81767c78-c6a0-4a68-ab07-98eeaf3e9483\") " pod="openshift-marketplace/certified-operators-s2t7g"
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.659644 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdxrf\" (UniqueName: \"kubernetes.io/projected/81767c78-c6a0-4a68-ab07-98eeaf3e9483-kube-api-access-hdxrf\") pod \"certified-operators-s2t7g\" (UID: \"81767c78-c6a0-4a68-ab07-98eeaf3e9483\") " pod="openshift-marketplace/certified-operators-s2t7g"
Feb 02 10:41:03 crc kubenswrapper[4901]: E0202 10:41:03.659790 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:41:04.15977248 +0000 UTC m=+151.178112576 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.767896 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81767c78-c6a0-4a68-ab07-98eeaf3e9483-catalog-content\") pod \"certified-operators-s2t7g\" (UID: \"81767c78-c6a0-4a68-ab07-98eeaf3e9483\") " pod="openshift-marketplace/certified-operators-s2t7g"
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.767957 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq"
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.767992 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81767c78-c6a0-4a68-ab07-98eeaf3e9483-utilities\") pod \"certified-operators-s2t7g\" (UID: \"81767c78-c6a0-4a68-ab07-98eeaf3e9483\") " pod="openshift-marketplace/certified-operators-s2t7g"
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.768017 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdxrf\" (UniqueName: \"kubernetes.io/projected/81767c78-c6a0-4a68-ab07-98eeaf3e9483-kube-api-access-hdxrf\") pod \"certified-operators-s2t7g\" (UID: \"81767c78-c6a0-4a68-ab07-98eeaf3e9483\") " pod="openshift-marketplace/certified-operators-s2t7g"
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.768867 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81767c78-c6a0-4a68-ab07-98eeaf3e9483-catalog-content\") pod \"certified-operators-s2t7g\" (UID: \"81767c78-c6a0-4a68-ab07-98eeaf3e9483\") " pod="openshift-marketplace/certified-operators-s2t7g"
Feb 02 10:41:03 crc kubenswrapper[4901]: E0202 10:41:03.769129 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:41:04.269113292 +0000 UTC m=+151.287453388 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.769159 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81767c78-c6a0-4a68-ab07-98eeaf3e9483-utilities\") pod \"certified-operators-s2t7g\" (UID: \"81767c78-c6a0-4a68-ab07-98eeaf3e9483\") " pod="openshift-marketplace/certified-operators-s2t7g"
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.792448 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdxrf\" (UniqueName: \"kubernetes.io/projected/81767c78-c6a0-4a68-ab07-98eeaf3e9483-kube-api-access-hdxrf\") pod \"certified-operators-s2t7g\" (UID: \"81767c78-c6a0-4a68-ab07-98eeaf3e9483\") " pod="openshift-marketplace/certified-operators-s2t7g"
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.808323 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-79vj8"]
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.840298 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s2t7g"
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.870267 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 10:41:03 crc kubenswrapper[4901]: E0202 10:41:03.870489 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:41:04.370454615 +0000 UTC m=+151.388794711 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.870555 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq"
Feb 02 10:41:03 crc kubenswrapper[4901]: E0202 10:41:03.870957 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:41:04.370941227 +0000 UTC m=+151.389281493 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xvfbq" (UID: "dc8db928-4418-4963-892f-df5413ed2c76") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.872812 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-79vj8" event={"ID":"ebacceb9-418b-4af4-9511-007595694dc2","Type":"ContainerStarted","Data":"53be6de38fff7f905a38af64cf46a8058746290eab9a2fe65039ef8d7dcab75c"}
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.877391 4901 generic.go:334] "Generic (PLEG): container finished" podID="da3848e5-a20f-4124-856b-d860bea45325" containerID="d536775a9b671cb7b2ed51f77b81b3b401bf6a1cb9da261acc1efcdae8d6a3e6" exitCode=0
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.877444 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9p28" event={"ID":"da3848e5-a20f-4124-856b-d860bea45325","Type":"ContainerDied","Data":"d536775a9b671cb7b2ed51f77b81b3b401bf6a1cb9da261acc1efcdae8d6a3e6"}
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.877465 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9p28" event={"ID":"da3848e5-a20f-4124-856b-d860bea45325","Type":"ContainerStarted","Data":"163212c7f0c81016eef0d0f8d18f9e60e6794b4afe27eb9308ccf440596f2df9"}
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.879941 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"f5dd1cc3f82add80ce900dd40052ad6464caec3aa0be76f32329a973e1738ab9"}
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.879991 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"5d70f29c7463f4db8cd8a7b177c4c6d53727504c67e9dc89db56151f8ab671ec"}
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.882370 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"7f163bcbd000eebcd314e6887f746417089c5afb4a1350f37200629a1fc6854e"}
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.882418 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"a771f9fb6d6ccbd3e78d296baf45748195f1db67beecfca77b97a21275dfa03b"}
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.882882 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.884016 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"2cb4907aab9a477cb26bea5ff549a2d5b296e3fbbcf885d58de8c8fbc8803bc5"}
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.884037 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"df0129dc2827805c11f844aa7f3ec260012d4d7d574e9e1542a466701cad2555"}
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.885216 4901 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.888126 4901 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-02T10:41:03.239485305Z","Handler":null,"Name":""}
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.901479 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-9hvh2"
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.907741 4901 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.907777 4901 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.914172 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-9hvh2"
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.916837 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-f4k9c" event={"ID":"3b78e424-eccd-4efa-9b7c-b59ca43bef39","Type":"ContainerStarted","Data":"f5df5d3f3b7de8f56c9458248f468e23b519e00c23bbd1247e8c5fc03b50668c"}
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.916862 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-f4k9c" event={"ID":"3b78e424-eccd-4efa-9b7c-b59ca43bef39","Type":"ContainerStarted","Data":"1bbfa5baafabfbfd2f1efebdca6b874e8cc0a090cb99fbb4bf8936127bc1fee4"}
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.971778 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.979336 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-825gs"]
Feb 02 10:41:03 crc kubenswrapper[4901]: I0202 10:41:03.981516 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.073310 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq"
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.082877 4901 patch_prober.go:28] interesting pod/router-default-5444994796-6sw49 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 02 10:41:04 crc kubenswrapper[4901]: [-]has-synced failed: reason withheld
Feb 02 10:41:04 crc kubenswrapper[4901]: [+]process-running ok
Feb 02 10:41:04 crc kubenswrapper[4901]: healthz check failed
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.082963 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6sw49" podUID="d447069d-acf2-4316-ab5b-2d2692e8f1e6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.127110 4901 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.127175 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq"
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.144625 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-f4k9c" podStartSLOduration=11.144600659 podStartE2EDuration="11.144600659s" podCreationTimestamp="2026-02-02 10:40:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:41:04.139018797 +0000 UTC m=+151.157358893" watchObservedRunningTime="2026-02-02 10:41:04.144600659 +0000 UTC m=+151.162940755"
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.193821 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xvfbq\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq"
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.285593 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s2t7g"]
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.286956 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-46wx7"
Feb 02 10:41:04 crc kubenswrapper[4901]: W0202 10:41:04.290488 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81767c78_c6a0_4a68_ab07_98eeaf3e9483.slice/crio-35582ec7704707298bfe5ff107b9deb2630b61845b89ca0f850439384b825fe8 WatchSource:0}: Error finding container 35582ec7704707298bfe5ff107b9deb2630b61845b89ca0f850439384b825fe8: Status 404 returned error can't find the container with id 35582ec7704707298bfe5ff107b9deb2630b61845b89ca0f850439384b825fe8
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.360849 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq"
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.385225 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcf7g\" (UniqueName: \"kubernetes.io/projected/8a24390c-720d-4e6b-b3d7-a12eab3d72a6-kube-api-access-vcf7g\") pod \"8a24390c-720d-4e6b-b3d7-a12eab3d72a6\" (UID: \"8a24390c-720d-4e6b-b3d7-a12eab3d72a6\") "
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.385854 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a24390c-720d-4e6b-b3d7-a12eab3d72a6-secret-volume\") pod \"8a24390c-720d-4e6b-b3d7-a12eab3d72a6\" (UID: \"8a24390c-720d-4e6b-b3d7-a12eab3d72a6\") "
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.385888 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a24390c-720d-4e6b-b3d7-a12eab3d72a6-config-volume\") pod \"8a24390c-720d-4e6b-b3d7-a12eab3d72a6\" (UID: \"8a24390c-720d-4e6b-b3d7-a12eab3d72a6\") "
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.387225 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a24390c-720d-4e6b-b3d7-a12eab3d72a6-config-volume" (OuterVolumeSpecName: "config-volume") pod "8a24390c-720d-4e6b-b3d7-a12eab3d72a6" (UID: "8a24390c-720d-4e6b-b3d7-a12eab3d72a6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.391256 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a24390c-720d-4e6b-b3d7-a12eab3d72a6-kube-api-access-vcf7g" (OuterVolumeSpecName: "kube-api-access-vcf7g") pod "8a24390c-720d-4e6b-b3d7-a12eab3d72a6" (UID: "8a24390c-720d-4e6b-b3d7-a12eab3d72a6"). InnerVolumeSpecName "kube-api-access-vcf7g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.393837 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a24390c-720d-4e6b-b3d7-a12eab3d72a6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8a24390c-720d-4e6b-b3d7-a12eab3d72a6" (UID: "8a24390c-720d-4e6b-b3d7-a12eab3d72a6"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.490665 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcf7g\" (UniqueName: \"kubernetes.io/projected/8a24390c-720d-4e6b-b3d7-a12eab3d72a6-kube-api-access-vcf7g\") on node \"crc\" DevicePath \"\""
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.490704 4901 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a24390c-720d-4e6b-b3d7-a12eab3d72a6-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.490717 4901 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a24390c-720d-4e6b-b3d7-a12eab3d72a6-config-volume\") on node \"crc\" DevicePath \"\""
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.628275 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xvfbq"]
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.917295 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f96jl"]
Feb 02 10:41:04 crc kubenswrapper[4901]: E0202 10:41:04.917881 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a24390c-720d-4e6b-b3d7-a12eab3d72a6" containerName="collect-profiles"
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.917927 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a24390c-720d-4e6b-b3d7-a12eab3d72a6" containerName="collect-profiles"
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.918137 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a24390c-720d-4e6b-b3d7-a12eab3d72a6" containerName="collect-profiles"
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.919588 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f96jl"
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.921299 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.926782 4901 generic.go:334] "Generic (PLEG): container finished" podID="ebacceb9-418b-4af4-9511-007595694dc2" containerID="e58405f3c5f037c020c15bdf8125eeb272a81339343153720f17b3c34ec4a4e4" exitCode=0
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.926881 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-79vj8" event={"ID":"ebacceb9-418b-4af4-9511-007595694dc2","Type":"ContainerDied","Data":"e58405f3c5f037c020c15bdf8125eeb272a81339343153720f17b3c34ec4a4e4"}
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.927090 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f96jl"]
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.928960 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-46wx7"
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.928958 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-46wx7" event={"ID":"8a24390c-720d-4e6b-b3d7-a12eab3d72a6","Type":"ContainerDied","Data":"eac4354fd4bf58478502b9271d62ef87f6b907a82dac2c69198c233177577e05"}
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.928993 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eac4354fd4bf58478502b9271d62ef87f6b907a82dac2c69198c233177577e05"
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.931071 4901 generic.go:334] "Generic (PLEG): container finished" podID="2327a290-e69a-4a8b-b3af-2b1f02819202" containerID="49bb46b9a34f7b014db97c50d9c33660947609a2fefe4de013ad9c9ef02e27e6" exitCode=0
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.931146 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-825gs" event={"ID":"2327a290-e69a-4a8b-b3af-2b1f02819202","Type":"ContainerDied","Data":"49bb46b9a34f7b014db97c50d9c33660947609a2fefe4de013ad9c9ef02e27e6"}
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.931178 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-825gs" event={"ID":"2327a290-e69a-4a8b-b3af-2b1f02819202","Type":"ContainerStarted","Data":"8511e6faae208bae23cd354c7cba2f9747f2bf6d9f0ee657f5506537281c2caf"}
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.932441 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" event={"ID":"dc8db928-4418-4963-892f-df5413ed2c76","Type":"ContainerStarted","Data":"53d2fc29aac69e45fc296a780716fa5bb9d138c810d11baa32aca32e822b9b5d"}
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.932470 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" event={"ID":"dc8db928-4418-4963-892f-df5413ed2c76","Type":"ContainerStarted","Data":"1574dbcf9e692cc765c3c2f83b2b4d9650fd9cd1394c09bd56a58ae679a1f9a6"}
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.933377 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq"
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.937196 4901 generic.go:334] "Generic (PLEG): container finished" podID="81767c78-c6a0-4a68-ab07-98eeaf3e9483" containerID="c9b34084ad7bfd075a26d0a051be928393e8eca35f98b0888ad4e9ca82441a2f" exitCode=0
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.940628 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2t7g" event={"ID":"81767c78-c6a0-4a68-ab07-98eeaf3e9483","Type":"ContainerDied","Data":"c9b34084ad7bfd075a26d0a051be928393e8eca35f98b0888ad4e9ca82441a2f"}
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.940695 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2t7g" event={"ID":"81767c78-c6a0-4a68-ab07-98eeaf3e9483","Type":"ContainerStarted","Data":"35582ec7704707298bfe5ff107b9deb2630b61845b89ca0f850439384b825fe8"}
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.989359 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" podStartSLOduration=129.989333558 podStartE2EDuration="2m9.989333558s" podCreationTimestamp="2026-02-02 10:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:41:04.983468479 +0000 UTC m=+152.001808575" watchObservedRunningTime="2026-02-02 10:41:04.989333558 +0000 UTC m=+152.007673654"
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.996688 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e26efe11-5a79-428e-9f34-ac7e0af2b5df-catalog-content\") pod \"redhat-marketplace-f96jl\" (UID: \"e26efe11-5a79-428e-9f34-ac7e0af2b5df\") " pod="openshift-marketplace/redhat-marketplace-f96jl"
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.996825 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e26efe11-5a79-428e-9f34-ac7e0af2b5df-utilities\") pod \"redhat-marketplace-f96jl\" (UID: \"e26efe11-5a79-428e-9f34-ac7e0af2b5df\") " pod="openshift-marketplace/redhat-marketplace-f96jl"
Feb 02 10:41:04 crc kubenswrapper[4901]: I0202 10:41:04.996907 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbts4\" (UniqueName: \"kubernetes.io/projected/e26efe11-5a79-428e-9f34-ac7e0af2b5df-kube-api-access-tbts4\") pod \"redhat-marketplace-f96jl\" (UID: \"e26efe11-5a79-428e-9f34-ac7e0af2b5df\") " pod="openshift-marketplace/redhat-marketplace-f96jl"
Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.023778 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-wb2m4"
Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.028444 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-wb2m4"
Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.034827 4901 patch_prober.go:28] interesting pod/console-f9d7485db-wb2m4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body=
Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.035045 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-wb2m4" podUID="c4c3efa1-9114-4b9b-be8b-045c2c4d7928" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused"
Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.038963 4901 patch_prober.go:28] interesting pod/downloads-7954f5f757-w875r container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body=
Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.039084 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-w875r" podUID="bfc9f0c4-e3f4-4d41-8304-1bcbbb6b67e3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused"
Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.038980 4901 patch_prober.go:28] interesting pod/downloads-7954f5f757-w875r container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body=
Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.039295 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-w875r" podUID="bfc9f0c4-e3f4-4d41-8304-1bcbbb6b67e3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused"
Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.071451 4901 patch_prober.go:28] interesting pod/router-default-5444994796-6sw49 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 02 10:41:05 crc kubenswrapper[4901]: [+]has-synced ok
Feb 02 10:41:05 crc kubenswrapper[4901]: [+]process-running ok
Feb 02 10:41:05 crc kubenswrapper[4901]: healthz check failed
Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.071511 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6sw49" podUID="d447069d-acf2-4316-ab5b-2d2692e8f1e6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.098427 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbts4\" (UniqueName: \"kubernetes.io/projected/e26efe11-5a79-428e-9f34-ac7e0af2b5df-kube-api-access-tbts4\") pod \"redhat-marketplace-f96jl\" (UID: \"e26efe11-5a79-428e-9f34-ac7e0af2b5df\") " pod="openshift-marketplace/redhat-marketplace-f96jl"
Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.098859 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e26efe11-5a79-428e-9f34-ac7e0af2b5df-catalog-content\") pod \"redhat-marketplace-f96jl\" (UID: \"e26efe11-5a79-428e-9f34-ac7e0af2b5df\") " pod="openshift-marketplace/redhat-marketplace-f96jl"
Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.099040 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e26efe11-5a79-428e-9f34-ac7e0af2b5df-utilities\") pod \"redhat-marketplace-f96jl\" (UID: \"e26efe11-5a79-428e-9f34-ac7e0af2b5df\") " pod="openshift-marketplace/redhat-marketplace-f96jl"
Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.100136 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e26efe11-5a79-428e-9f34-ac7e0af2b5df-utilities\") pod \"redhat-marketplace-f96jl\" (UID: \"e26efe11-5a79-428e-9f34-ac7e0af2b5df\") " pod="openshift-marketplace/redhat-marketplace-f96jl"
Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.100999 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e26efe11-5a79-428e-9f34-ac7e0af2b5df-catalog-content\") pod \"redhat-marketplace-f96jl\" (UID: \"e26efe11-5a79-428e-9f34-ac7e0af2b5df\") " pod="openshift-marketplace/redhat-marketplace-f96jl"
Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.118911 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbts4\" (UniqueName:
\"kubernetes.io/projected/e26efe11-5a79-428e-9f34-ac7e0af2b5df-kube-api-access-tbts4\") pod \"redhat-marketplace-f96jl\" (UID: \"e26efe11-5a79-428e-9f34-ac7e0af2b5df\") " pod="openshift-marketplace/redhat-marketplace-f96jl" Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.235220 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qsh9f" Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.235465 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qsh9f" Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.243330 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qsh9f" Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.248696 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f96jl" Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.260987 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-wl2tq" Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.315669 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mvpxf"] Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.317324 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mvpxf" Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.330752 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mvpxf"] Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.403035 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c3abe02-a443-49e5-9c6c-b2f2655c91e7-catalog-content\") pod \"redhat-marketplace-mvpxf\" (UID: \"5c3abe02-a443-49e5-9c6c-b2f2655c91e7\") " pod="openshift-marketplace/redhat-marketplace-mvpxf" Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.404874 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb5xq\" (UniqueName: \"kubernetes.io/projected/5c3abe02-a443-49e5-9c6c-b2f2655c91e7-kube-api-access-zb5xq\") pod \"redhat-marketplace-mvpxf\" (UID: \"5c3abe02-a443-49e5-9c6c-b2f2655c91e7\") " pod="openshift-marketplace/redhat-marketplace-mvpxf" Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.405357 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c3abe02-a443-49e5-9c6c-b2f2655c91e7-utilities\") pod \"redhat-marketplace-mvpxf\" (UID: \"5c3abe02-a443-49e5-9c6c-b2f2655c91e7\") " pod="openshift-marketplace/redhat-marketplace-mvpxf" Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.498900 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f96jl"] Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.507246 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c3abe02-a443-49e5-9c6c-b2f2655c91e7-catalog-content\") pod \"redhat-marketplace-mvpxf\" (UID: \"5c3abe02-a443-49e5-9c6c-b2f2655c91e7\") " pod="openshift-marketplace/redhat-marketplace-mvpxf" Feb 02 
10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.507326 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb5xq\" (UniqueName: \"kubernetes.io/projected/5c3abe02-a443-49e5-9c6c-b2f2655c91e7-kube-api-access-zb5xq\") pod \"redhat-marketplace-mvpxf\" (UID: \"5c3abe02-a443-49e5-9c6c-b2f2655c91e7\") " pod="openshift-marketplace/redhat-marketplace-mvpxf" Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.507415 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c3abe02-a443-49e5-9c6c-b2f2655c91e7-utilities\") pod \"redhat-marketplace-mvpxf\" (UID: \"5c3abe02-a443-49e5-9c6c-b2f2655c91e7\") " pod="openshift-marketplace/redhat-marketplace-mvpxf" Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.508034 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c3abe02-a443-49e5-9c6c-b2f2655c91e7-catalog-content\") pod \"redhat-marketplace-mvpxf\" (UID: \"5c3abe02-a443-49e5-9c6c-b2f2655c91e7\") " pod="openshift-marketplace/redhat-marketplace-mvpxf" Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.508094 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c3abe02-a443-49e5-9c6c-b2f2655c91e7-utilities\") pod \"redhat-marketplace-mvpxf\" (UID: \"5c3abe02-a443-49e5-9c6c-b2f2655c91e7\") " pod="openshift-marketplace/redhat-marketplace-mvpxf" Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.541731 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb5xq\" (UniqueName: \"kubernetes.io/projected/5c3abe02-a443-49e5-9c6c-b2f2655c91e7-kube-api-access-zb5xq\") pod \"redhat-marketplace-mvpxf\" (UID: \"5c3abe02-a443-49e5-9c6c-b2f2655c91e7\") " pod="openshift-marketplace/redhat-marketplace-mvpxf" Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.676786 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mvpxf" Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.687767 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.840088 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-xnv4q" Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.880183 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.905068 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.905883 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.909782 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.909983 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.931197 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.950238 4901 generic.go:334] "Generic (PLEG): container finished" podID="e26efe11-5a79-428e-9f34-ac7e0af2b5df" containerID="633038a05b0461e44462303b109638286fe6328751042ea97137bedb4da38a79" exitCode=0 Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.950302 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f96jl" event={"ID":"e26efe11-5a79-428e-9f34-ac7e0af2b5df","Type":"ContainerDied","Data":"633038a05b0461e44462303b109638286fe6328751042ea97137bedb4da38a79"} Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.950378 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f96jl" event={"ID":"e26efe11-5a79-428e-9f34-ac7e0af2b5df","Type":"ContainerStarted","Data":"db0687830c52170fc811cba0bc1beac7e759b262bf33aef219c8b9d470f0a4ef"} Feb 02 10:41:05 crc kubenswrapper[4901]: I0202 10:41:05.970246 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qsh9f" Feb 02 10:41:06 crc kubenswrapper[4901]: I0202 10:41:06.018923 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/409abb18-6198-483b-aab5-f9ae4469d1bb-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"409abb18-6198-483b-aab5-f9ae4469d1bb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 02 10:41:06 crc kubenswrapper[4901]: I0202 10:41:06.024007 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/409abb18-6198-483b-aab5-f9ae4469d1bb-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"409abb18-6198-483b-aab5-f9ae4469d1bb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 02 10:41:06 crc kubenswrapper[4901]: I0202 10:41:06.070889 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-6sw49" Feb 02 10:41:06 crc kubenswrapper[4901]: I0202 10:41:06.079961 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-6sw49" Feb 02 10:41:06 crc kubenswrapper[4901]: I0202 10:41:06.126456 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/409abb18-6198-483b-aab5-f9ae4469d1bb-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"409abb18-6198-483b-aab5-f9ae4469d1bb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 02 10:41:06 crc kubenswrapper[4901]: I0202 10:41:06.126517 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/409abb18-6198-483b-aab5-f9ae4469d1bb-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"409abb18-6198-483b-aab5-f9ae4469d1bb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 02 10:41:06 crc kubenswrapper[4901]: I0202 10:41:06.135529 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/409abb18-6198-483b-aab5-f9ae4469d1bb-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"409abb18-6198-483b-aab5-f9ae4469d1bb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 02 10:41:06 crc kubenswrapper[4901]: I0202 10:41:06.172327 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/409abb18-6198-483b-aab5-f9ae4469d1bb-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"409abb18-6198-483b-aab5-f9ae4469d1bb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 02 10:41:06 crc kubenswrapper[4901]: I0202 10:41:06.246927 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 02 10:41:06 crc kubenswrapper[4901]: I0202 10:41:06.326257 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t4pph"] Feb 02 10:41:06 crc kubenswrapper[4901]: I0202 10:41:06.327267 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t4pph" Feb 02 10:41:06 crc kubenswrapper[4901]: I0202 10:41:06.328975 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 02 10:41:06 crc kubenswrapper[4901]: I0202 10:41:06.348635 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t4pph"] Feb 02 10:41:06 crc kubenswrapper[4901]: I0202 10:41:06.358872 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mvpxf"] Feb 02 10:41:06 crc kubenswrapper[4901]: I0202 10:41:06.431514 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgw8p\" (UniqueName: \"kubernetes.io/projected/f9e94b03-bd47-46b7-8ae7-addbb2b58bb7-kube-api-access-cgw8p\") pod \"redhat-operators-t4pph\" (UID: \"f9e94b03-bd47-46b7-8ae7-addbb2b58bb7\") " pod="openshift-marketplace/redhat-operators-t4pph" Feb 02 10:41:06 crc kubenswrapper[4901]: I0202 10:41:06.431557 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9e94b03-bd47-46b7-8ae7-addbb2b58bb7-utilities\") pod \"redhat-operators-t4pph\" (UID: \"f9e94b03-bd47-46b7-8ae7-addbb2b58bb7\") " pod="openshift-marketplace/redhat-operators-t4pph" Feb 02 10:41:06 crc kubenswrapper[4901]: I0202 10:41:06.431610 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9e94b03-bd47-46b7-8ae7-addbb2b58bb7-catalog-content\") pod \"redhat-operators-t4pph\" (UID: \"f9e94b03-bd47-46b7-8ae7-addbb2b58bb7\") " pod="openshift-marketplace/redhat-operators-t4pph" Feb 02 10:41:06 crc kubenswrapper[4901]: I0202 10:41:06.535289 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgw8p\" (UniqueName: 
\"kubernetes.io/projected/f9e94b03-bd47-46b7-8ae7-addbb2b58bb7-kube-api-access-cgw8p\") pod \"redhat-operators-t4pph\" (UID: \"f9e94b03-bd47-46b7-8ae7-addbb2b58bb7\") " pod="openshift-marketplace/redhat-operators-t4pph" Feb 02 10:41:06 crc kubenswrapper[4901]: I0202 10:41:06.535606 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9e94b03-bd47-46b7-8ae7-addbb2b58bb7-utilities\") pod \"redhat-operators-t4pph\" (UID: \"f9e94b03-bd47-46b7-8ae7-addbb2b58bb7\") " pod="openshift-marketplace/redhat-operators-t4pph" Feb 02 10:41:06 crc kubenswrapper[4901]: I0202 10:41:06.535652 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9e94b03-bd47-46b7-8ae7-addbb2b58bb7-catalog-content\") pod \"redhat-operators-t4pph\" (UID: \"f9e94b03-bd47-46b7-8ae7-addbb2b58bb7\") " pod="openshift-marketplace/redhat-operators-t4pph" Feb 02 10:41:06 crc kubenswrapper[4901]: I0202 10:41:06.536114 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9e94b03-bd47-46b7-8ae7-addbb2b58bb7-catalog-content\") pod \"redhat-operators-t4pph\" (UID: \"f9e94b03-bd47-46b7-8ae7-addbb2b58bb7\") " pod="openshift-marketplace/redhat-operators-t4pph" Feb 02 10:41:06 crc kubenswrapper[4901]: I0202 10:41:06.539869 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9e94b03-bd47-46b7-8ae7-addbb2b58bb7-utilities\") pod \"redhat-operators-t4pph\" (UID: \"f9e94b03-bd47-46b7-8ae7-addbb2b58bb7\") " pod="openshift-marketplace/redhat-operators-t4pph" Feb 02 10:41:06 crc kubenswrapper[4901]: I0202 10:41:06.568587 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgw8p\" (UniqueName: \"kubernetes.io/projected/f9e94b03-bd47-46b7-8ae7-addbb2b58bb7-kube-api-access-cgw8p\") pod \"redhat-operators-t4pph\" (UID: \"f9e94b03-bd47-46b7-8ae7-addbb2b58bb7\") " pod="openshift-marketplace/redhat-operators-t4pph" Feb 02 10:41:06 crc kubenswrapper[4901]: I0202 10:41:06.656086 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t4pph" Feb 02 10:41:06 crc kubenswrapper[4901]: I0202 10:41:06.716152 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9xg9m"] Feb 02 10:41:06 crc kubenswrapper[4901]: I0202 10:41:06.717198 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9xg9m" Feb 02 10:41:06 crc kubenswrapper[4901]: I0202 10:41:06.730430 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9xg9m"] Feb 02 10:41:06 crc kubenswrapper[4901]: I0202 10:41:06.839428 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67734bbd-4400-4478-b63f-5eff579a1f3d-utilities\") pod \"redhat-operators-9xg9m\" (UID: \"67734bbd-4400-4478-b63f-5eff579a1f3d\") " pod="openshift-marketplace/redhat-operators-9xg9m" Feb 02 10:41:06 crc kubenswrapper[4901]: I0202 10:41:06.839770 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67734bbd-4400-4478-b63f-5eff579a1f3d-catalog-content\") pod \"redhat-operators-9xg9m\" (UID: \"67734bbd-4400-4478-b63f-5eff579a1f3d\") " pod="openshift-marketplace/redhat-operators-9xg9m" Feb 02 10:41:06 crc kubenswrapper[4901]: I0202 10:41:06.839831 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx2gr\" (UniqueName: \"kubernetes.io/projected/67734bbd-4400-4478-b63f-5eff579a1f3d-kube-api-access-wx2gr\") pod \"redhat-operators-9xg9m\" (UID: \"67734bbd-4400-4478-b63f-5eff579a1f3d\") " pod="openshift-marketplace/redhat-operators-9xg9m" Feb 02 10:41:06 crc kubenswrapper[4901]: I0202 10:41:06.943783 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wx2gr\" (UniqueName: \"kubernetes.io/projected/67734bbd-4400-4478-b63f-5eff579a1f3d-kube-api-access-wx2gr\") pod \"redhat-operators-9xg9m\" (UID: \"67734bbd-4400-4478-b63f-5eff579a1f3d\") " pod="openshift-marketplace/redhat-operators-9xg9m" Feb 02 10:41:06 crc kubenswrapper[4901]: I0202 10:41:06.943847 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67734bbd-4400-4478-b63f-5eff579a1f3d-utilities\") pod \"redhat-operators-9xg9m\" (UID: \"67734bbd-4400-4478-b63f-5eff579a1f3d\") " pod="openshift-marketplace/redhat-operators-9xg9m" Feb 02 10:41:06 crc kubenswrapper[4901]: I0202 10:41:06.943934 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67734bbd-4400-4478-b63f-5eff579a1f3d-catalog-content\") pod \"redhat-operators-9xg9m\" (UID: \"67734bbd-4400-4478-b63f-5eff579a1f3d\") " pod="openshift-marketplace/redhat-operators-9xg9m" Feb 02 10:41:06 crc kubenswrapper[4901]: I0202 10:41:06.944597 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67734bbd-4400-4478-b63f-5eff579a1f3d-catalog-content\") pod \"redhat-operators-9xg9m\" (UID: \"67734bbd-4400-4478-b63f-5eff579a1f3d\") " pod="openshift-marketplace/redhat-operators-9xg9m" Feb 02 10:41:06 crc kubenswrapper[4901]: I0202 10:41:06.945162 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67734bbd-4400-4478-b63f-5eff579a1f3d-utilities\") pod \"redhat-operators-9xg9m\" (UID: \"67734bbd-4400-4478-b63f-5eff579a1f3d\") " pod="openshift-marketplace/redhat-operators-9xg9m" Feb 02 10:41:06 crc kubenswrapper[4901]: I0202 10:41:06.953589 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 02 10:41:06 crc kubenswrapper[4901]: I0202 10:41:06.961921 4901 generic.go:334] "Generic (PLEG): container finished" podID="5c3abe02-a443-49e5-9c6c-b2f2655c91e7" containerID="c3cb4e5cd906ab802c5e942acea66417a8d10eacb215bf3c0d958821e8b74937" exitCode=0 Feb 02 10:41:06 crc kubenswrapper[4901]: I0202 10:41:06.962020 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mvpxf" event={"ID":"5c3abe02-a443-49e5-9c6c-b2f2655c91e7","Type":"ContainerDied","Data":"c3cb4e5cd906ab802c5e942acea66417a8d10eacb215bf3c0d958821e8b74937"} Feb 02 10:41:06 crc kubenswrapper[4901]: I0202 10:41:06.962098 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mvpxf" event={"ID":"5c3abe02-a443-49e5-9c6c-b2f2655c91e7","Type":"ContainerStarted","Data":"abf2121825e7cc7cca6512732a558efcc1cf3c3bb8662f4103d5dd3cbfc68273"} Feb 02 10:41:06 crc kubenswrapper[4901]: I0202 10:41:06.969074 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-6sw49" Feb 02 10:41:06 crc kubenswrapper[4901]: I0202 10:41:06.969607 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx2gr\" (UniqueName: \"kubernetes.io/projected/67734bbd-4400-4478-b63f-5eff579a1f3d-kube-api-access-wx2gr\") pod \"redhat-operators-9xg9m\" (UID: \"67734bbd-4400-4478-b63f-5eff579a1f3d\") " pod="openshift-marketplace/redhat-operators-9xg9m" Feb 02 10:41:07 crc kubenswrapper[4901]: I0202 10:41:07.120663 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t4pph"] Feb 02 10:41:07 crc kubenswrapper[4901]: I0202 10:41:07.157591 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9xg9m" Feb 02 10:41:07 crc kubenswrapper[4901]: W0202 10:41:07.197652 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9e94b03_bd47_46b7_8ae7_addbb2b58bb7.slice/crio-bdf3274d91f67f210c00c115822cb120623cb2173322896c248f74791fefe929 WatchSource:0}: Error finding container bdf3274d91f67f210c00c115822cb120623cb2173322896c248f74791fefe929: Status 404 returned error can't find the container with id bdf3274d91f67f210c00c115822cb120623cb2173322896c248f74791fefe929 Feb 02 10:41:07 crc kubenswrapper[4901]: I0202 10:41:07.491180 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9xg9m"] Feb 02 10:41:07 crc kubenswrapper[4901]: W0202 10:41:07.564943 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67734bbd_4400_4478_b63f_5eff579a1f3d.slice/crio-3f68598f91627a9c717067192b86b60006632452c911d295c819f0d90d6599d8 WatchSource:0}: Error finding container 3f68598f91627a9c717067192b86b60006632452c911d295c819f0d90d6599d8: Status 404 returned error can't find the container with id 3f68598f91627a9c717067192b86b60006632452c911d295c819f0d90d6599d8 Feb 02 10:41:07 crc kubenswrapper[4901]: I0202 10:41:07.578924 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:41:07 crc kubenswrapper[4901]: I0202 10:41:07.837740 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:41:07 crc kubenswrapper[4901]: I0202 10:41:07.838063 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:41:08 crc kubenswrapper[4901]: I0202 10:41:08.008124 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9xg9m" event={"ID":"67734bbd-4400-4478-b63f-5eff579a1f3d","Type":"ContainerStarted","Data":"719039f9edd6835371d0e4ec689c6f74c2e817d29482587f1999eac44d8233da"} Feb 02 10:41:08 crc kubenswrapper[4901]: I0202 10:41:08.008187 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9xg9m" event={"ID":"67734bbd-4400-4478-b63f-5eff579a1f3d","Type":"ContainerStarted","Data":"3f68598f91627a9c717067192b86b60006632452c911d295c819f0d90d6599d8"} Feb 02 10:41:08 crc kubenswrapper[4901]: I0202 10:41:08.029874 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"409abb18-6198-483b-aab5-f9ae4469d1bb","Type":"ContainerStarted","Data":"a65385cb49e74fc588cac76b4f0785c0bf0532160b730d95c3b91d34c05598df"} Feb 02 10:41:08 crc kubenswrapper[4901]: I0202 10:41:08.029928 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"409abb18-6198-483b-aab5-f9ae4469d1bb","Type":"ContainerStarted","Data":"9583fa60c717e2f837085a24509be4f91ce884cef661787d33b5427fee67fa84"} Feb 02 10:41:08 crc kubenswrapper[4901]: I0202 10:41:08.041049 4901 generic.go:334] "Generic (PLEG): container finished" podID="f9e94b03-bd47-46b7-8ae7-addbb2b58bb7" containerID="69bda9554f9e491869fe2711ed8f7fb16351840c5507b1b56c310f574973d7cf" exitCode=0 Feb 02 10:41:08 crc kubenswrapper[4901]: I0202 10:41:08.042425 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t4pph" event={"ID":"f9e94b03-bd47-46b7-8ae7-addbb2b58bb7","Type":"ContainerDied","Data":"69bda9554f9e491869fe2711ed8f7fb16351840c5507b1b56c310f574973d7cf"} Feb 02 10:41:08 crc kubenswrapper[4901]: I0202 10:41:08.042461 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t4pph" event={"ID":"f9e94b03-bd47-46b7-8ae7-addbb2b58bb7","Type":"ContainerStarted","Data":"bdf3274d91f67f210c00c115822cb120623cb2173322896c248f74791fefe929"} Feb 02 10:41:08 crc kubenswrapper[4901]: I0202 10:41:08.823750 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 02 10:41:08 crc kubenswrapper[4901]: I0202 10:41:08.824451 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 02 10:41:08 crc kubenswrapper[4901]: I0202 10:41:08.830013 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 02 10:41:08 crc kubenswrapper[4901]: I0202 10:41:08.837230 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 02 10:41:08 crc kubenswrapper[4901]: I0202 10:41:08.840303 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 02 10:41:08 crc kubenswrapper[4901]: I0202 10:41:08.992157 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/590eb5f0-878f-4cd1-9acd-cd1eb115a6e9-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"590eb5f0-878f-4cd1-9acd-cd1eb115a6e9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 02 10:41:08 crc kubenswrapper[4901]: I0202 10:41:08.992587 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/590eb5f0-878f-4cd1-9acd-cd1eb115a6e9-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"590eb5f0-878f-4cd1-9acd-cd1eb115a6e9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 02 10:41:09 crc kubenswrapper[4901]: I0202 10:41:09.051105 4901 generic.go:334] "Generic (PLEG): container finished" podID="67734bbd-4400-4478-b63f-5eff579a1f3d" containerID="719039f9edd6835371d0e4ec689c6f74c2e817d29482587f1999eac44d8233da" exitCode=0 Feb 02 10:41:09 crc kubenswrapper[4901]: I0202 10:41:09.051172 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9xg9m" event={"ID":"67734bbd-4400-4478-b63f-5eff579a1f3d","Type":"ContainerDied","Data":"719039f9edd6835371d0e4ec689c6f74c2e817d29482587f1999eac44d8233da"} Feb 02 10:41:09 crc kubenswrapper[4901]: I0202 10:41:09.064839 4901 generic.go:334] "Generic (PLEG): container finished" podID="409abb18-6198-483b-aab5-f9ae4469d1bb" 
containerID="a65385cb49e74fc588cac76b4f0785c0bf0532160b730d95c3b91d34c05598df" exitCode=0 Feb 02 10:41:09 crc kubenswrapper[4901]: I0202 10:41:09.064886 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"409abb18-6198-483b-aab5-f9ae4469d1bb","Type":"ContainerDied","Data":"a65385cb49e74fc588cac76b4f0785c0bf0532160b730d95c3b91d34c05598df"} Feb 02 10:41:09 crc kubenswrapper[4901]: I0202 10:41:09.094081 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/590eb5f0-878f-4cd1-9acd-cd1eb115a6e9-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"590eb5f0-878f-4cd1-9acd-cd1eb115a6e9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 02 10:41:09 crc kubenswrapper[4901]: I0202 10:41:09.094148 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/590eb5f0-878f-4cd1-9acd-cd1eb115a6e9-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"590eb5f0-878f-4cd1-9acd-cd1eb115a6e9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 02 10:41:09 crc kubenswrapper[4901]: I0202 10:41:09.094205 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/590eb5f0-878f-4cd1-9acd-cd1eb115a6e9-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"590eb5f0-878f-4cd1-9acd-cd1eb115a6e9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 02 10:41:09 crc kubenswrapper[4901]: I0202 10:41:09.118123 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/590eb5f0-878f-4cd1-9acd-cd1eb115a6e9-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"590eb5f0-878f-4cd1-9acd-cd1eb115a6e9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 02 10:41:09 crc kubenswrapper[4901]: I0202 10:41:09.184925 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 02 10:41:09 crc kubenswrapper[4901]: I0202 10:41:09.731551 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 02 10:41:10 crc kubenswrapper[4901]: I0202 10:41:10.111303 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"590eb5f0-878f-4cd1-9acd-cd1eb115a6e9","Type":"ContainerStarted","Data":"34559cc434cdfcd342e26c8f33df8d5f42336519bf6485418c91d46fae9fd55e"} Feb 02 10:41:10 crc kubenswrapper[4901]: I0202 10:41:10.524972 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 02 10:41:10 crc kubenswrapper[4901]: I0202 10:41:10.626085 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/409abb18-6198-483b-aab5-f9ae4469d1bb-kubelet-dir\") pod \"409abb18-6198-483b-aab5-f9ae4469d1bb\" (UID: \"409abb18-6198-483b-aab5-f9ae4469d1bb\") " Feb 02 10:41:10 crc kubenswrapper[4901]: I0202 10:41:10.626200 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/409abb18-6198-483b-aab5-f9ae4469d1bb-kube-api-access\") pod \"409abb18-6198-483b-aab5-f9ae4469d1bb\" (UID: \"409abb18-6198-483b-aab5-f9ae4469d1bb\") " Feb 02 10:41:10 crc kubenswrapper[4901]: I0202 10:41:10.629225 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/409abb18-6198-483b-aab5-f9ae4469d1bb-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "409abb18-6198-483b-aab5-f9ae4469d1bb" (UID: "409abb18-6198-483b-aab5-f9ae4469d1bb"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:41:10 crc kubenswrapper[4901]: I0202 10:41:10.634957 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/409abb18-6198-483b-aab5-f9ae4469d1bb-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "409abb18-6198-483b-aab5-f9ae4469d1bb" (UID: "409abb18-6198-483b-aab5-f9ae4469d1bb"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:41:10 crc kubenswrapper[4901]: I0202 10:41:10.728897 4901 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/409abb18-6198-483b-aab5-f9ae4469d1bb-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:10 crc kubenswrapper[4901]: I0202 10:41:10.728931 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/409abb18-6198-483b-aab5-f9ae4469d1bb-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:11 crc kubenswrapper[4901]: I0202 10:41:11.122425 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"590eb5f0-878f-4cd1-9acd-cd1eb115a6e9","Type":"ContainerStarted","Data":"ccb1a6c9f7707f3092a237af016a0ce0fd7d470c5d5d45c77b776c58d73248ce"} Feb 02 10:41:11 crc kubenswrapper[4901]: I0202 10:41:11.132021 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"409abb18-6198-483b-aab5-f9ae4469d1bb","Type":"ContainerDied","Data":"9583fa60c717e2f837085a24509be4f91ce884cef661787d33b5427fee67fa84"} Feb 02 10:41:11 crc kubenswrapper[4901]: I0202 10:41:11.132092 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 02 10:41:11 crc kubenswrapper[4901]: I0202 10:41:11.132119 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9583fa60c717e2f837085a24509be4f91ce884cef661787d33b5427fee67fa84" Feb 02 10:41:11 crc kubenswrapper[4901]: I0202 10:41:11.139383 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.139361771 podStartE2EDuration="3.139361771s" podCreationTimestamp="2026-02-02 10:41:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:41:11.135457889 +0000 UTC m=+158.153797985" watchObservedRunningTime="2026-02-02 10:41:11.139361771 +0000 UTC m=+158.157701867" Feb 02 10:41:11 crc kubenswrapper[4901]: I0202 10:41:11.146766 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-65x22" Feb 02 10:41:12 crc kubenswrapper[4901]: I0202 10:41:12.160687 4901 generic.go:334] "Generic (PLEG): container finished" podID="590eb5f0-878f-4cd1-9acd-cd1eb115a6e9" containerID="ccb1a6c9f7707f3092a237af016a0ce0fd7d470c5d5d45c77b776c58d73248ce" exitCode=0 Feb 02 10:41:12 crc kubenswrapper[4901]: I0202 10:41:12.160797 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"590eb5f0-878f-4cd1-9acd-cd1eb115a6e9","Type":"ContainerDied","Data":"ccb1a6c9f7707f3092a237af016a0ce0fd7d470c5d5d45c77b776c58d73248ce"} Feb 02 10:41:15 crc kubenswrapper[4901]: I0202 10:41:15.023009 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-wb2m4" Feb 02 10:41:15 crc kubenswrapper[4901]: I0202 10:41:15.030401 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-wb2m4" Feb 02 10:41:15 crc kubenswrapper[4901]: I0202 10:41:15.063544 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-w875r" Feb 02 10:41:18 crc kubenswrapper[4901]: I0202 10:41:18.069283 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b96d903e-a64c-4321-8963-482d4b579e30-metrics-certs\") pod \"network-metrics-daemon-fmjwg\" (UID: \"b96d903e-a64c-4321-8963-482d4b579e30\") " pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:41:18 crc kubenswrapper[4901]: I0202 10:41:18.080395 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b96d903e-a64c-4321-8963-482d4b579e30-metrics-certs\") pod \"network-metrics-daemon-fmjwg\" (UID: \"b96d903e-a64c-4321-8963-482d4b579e30\") " pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:41:18 crc kubenswrapper[4901]: I0202 10:41:18.207981 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fmjwg" Feb 02 10:41:19 crc kubenswrapper[4901]: I0202 10:41:19.834236 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-b6s7r"] Feb 02 10:41:19 crc kubenswrapper[4901]: I0202 10:41:19.834517 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-b6s7r" podUID="d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc" containerName="controller-manager" containerID="cri-o://ef8420ea52ebc18dc7827ffa006f795a9b255185a1eb89a7606f2cab38d3fb00" gracePeriod=30 Feb 02 10:41:19 crc kubenswrapper[4901]: I0202 10:41:19.840744 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpwqj"] Feb 02 10:41:19 crc kubenswrapper[4901]: I0202 10:41:19.840995 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpwqj" podUID="fcd15108-7039-4f2c-a6be-209f7ffbbc30" containerName="route-controller-manager" containerID="cri-o://0ddb8b045fab7427511e861c6bf84e55298ab3cf2035b4858b453f2862e29418" gracePeriod=30 Feb 02 10:41:20 crc kubenswrapper[4901]: I0202 10:41:20.281967 4901 generic.go:334] "Generic (PLEG): container finished" podID="d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc" containerID="ef8420ea52ebc18dc7827ffa006f795a9b255185a1eb89a7606f2cab38d3fb00" exitCode=0 Feb 02 10:41:20 crc kubenswrapper[4901]: I0202 10:41:20.282057 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-b6s7r" event={"ID":"d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc","Type":"ContainerDied","Data":"ef8420ea52ebc18dc7827ffa006f795a9b255185a1eb89a7606f2cab38d3fb00"} Feb 02 10:41:22 crc kubenswrapper[4901]: I0202 10:41:22.297768 4901 generic.go:334] "Generic (PLEG): container finished" podID="fcd15108-7039-4f2c-a6be-209f7ffbbc30" containerID="0ddb8b045fab7427511e861c6bf84e55298ab3cf2035b4858b453f2862e29418" exitCode=0 Feb 02 10:41:22 crc kubenswrapper[4901]: I0202 10:41:22.297833 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpwqj" event={"ID":"fcd15108-7039-4f2c-a6be-209f7ffbbc30","Type":"ContainerDied","Data":"0ddb8b045fab7427511e861c6bf84e55298ab3cf2035b4858b453f2862e29418"} Feb 02 10:41:24 crc kubenswrapper[4901]: I0202 10:41:24.368596 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:41:24 crc kubenswrapper[4901]: I0202 10:41:24.906317 4901 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-b6s7r container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 02 10:41:24 crc kubenswrapper[4901]: I0202 10:41:24.906422 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-b6s7r" podUID="d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 02 10:41:25 crc 
kubenswrapper[4901]: I0202 10:41:25.769385 4901 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-mpwqj container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body= Feb 02 10:41:25 crc kubenswrapper[4901]: I0202 10:41:25.769751 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpwqj" podUID="fcd15108-7039-4f2c-a6be-209f7ffbbc30" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" Feb 02 10:41:26 crc kubenswrapper[4901]: I0202 10:41:26.715801 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 02 10:41:26 crc kubenswrapper[4901]: I0202 10:41:26.719078 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-b6s7r" Feb 02 10:41:26 crc kubenswrapper[4901]: I0202 10:41:26.910274 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc-config\") pod \"d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc\" (UID: \"d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc\") " Feb 02 10:41:26 crc kubenswrapper[4901]: I0202 10:41:26.910328 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc-proxy-ca-bundles\") pod \"d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc\" (UID: \"d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc\") " Feb 02 10:41:26 crc kubenswrapper[4901]: I0202 10:41:26.910374 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/590eb5f0-878f-4cd1-9acd-cd1eb115a6e9-kubelet-dir\") pod \"590eb5f0-878f-4cd1-9acd-cd1eb115a6e9\" (UID: \"590eb5f0-878f-4cd1-9acd-cd1eb115a6e9\") " Feb 02 10:41:26 crc kubenswrapper[4901]: I0202 10:41:26.910394 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bt7vc\" (UniqueName: \"kubernetes.io/projected/d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc-kube-api-access-bt7vc\") pod \"d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc\" (UID: \"d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc\") " Feb 02 10:41:26 crc kubenswrapper[4901]: I0202 10:41:26.910423 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/590eb5f0-878f-4cd1-9acd-cd1eb115a6e9-kube-api-access\") pod \"590eb5f0-878f-4cd1-9acd-cd1eb115a6e9\" (UID: \"590eb5f0-878f-4cd1-9acd-cd1eb115a6e9\") " Feb 02 10:41:26 crc kubenswrapper[4901]: I0202 10:41:26.910500 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc-serving-cert\") pod \"d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc\" (UID: \"d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc\") " Feb 02 10:41:26 crc kubenswrapper[4901]: I0202 10:41:26.910493 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/590eb5f0-878f-4cd1-9acd-cd1eb115a6e9-kubelet-dir" (OuterVolumeSpecName: 
"kubelet-dir") pod "590eb5f0-878f-4cd1-9acd-cd1eb115a6e9" (UID: "590eb5f0-878f-4cd1-9acd-cd1eb115a6e9"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:41:26 crc kubenswrapper[4901]: I0202 10:41:26.910542 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc-client-ca\") pod \"d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc\" (UID: \"d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc\") " Feb 02 10:41:26 crc kubenswrapper[4901]: I0202 10:41:26.910760 4901 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/590eb5f0-878f-4cd1-9acd-cd1eb115a6e9-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:26 crc kubenswrapper[4901]: I0202 10:41:26.911315 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc-client-ca" (OuterVolumeSpecName: "client-ca") pod "d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc" (UID: "d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:41:26 crc kubenswrapper[4901]: I0202 10:41:26.911345 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc" (UID: "d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:41:26 crc kubenswrapper[4901]: I0202 10:41:26.911376 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc-config" (OuterVolumeSpecName: "config") pod "d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc" (UID: "d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:41:26 crc kubenswrapper[4901]: I0202 10:41:26.916679 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/590eb5f0-878f-4cd1-9acd-cd1eb115a6e9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "590eb5f0-878f-4cd1-9acd-cd1eb115a6e9" (UID: "590eb5f0-878f-4cd1-9acd-cd1eb115a6e9"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:41:26 crc kubenswrapper[4901]: I0202 10:41:26.919369 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc-kube-api-access-bt7vc" (OuterVolumeSpecName: "kube-api-access-bt7vc") pod "d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc" (UID: "d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc"). InnerVolumeSpecName "kube-api-access-bt7vc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:41:26 crc kubenswrapper[4901]: I0202 10:41:26.919954 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc" (UID: "d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:41:27 crc kubenswrapper[4901]: I0202 10:41:27.012685 4901 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:27 crc kubenswrapper[4901]: I0202 10:41:27.012720 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bt7vc\" (UniqueName: \"kubernetes.io/projected/d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc-kube-api-access-bt7vc\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:27 crc kubenswrapper[4901]: I0202 10:41:27.012732 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/590eb5f0-878f-4cd1-9acd-cd1eb115a6e9-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:27 crc kubenswrapper[4901]: I0202 10:41:27.012740 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:27 crc kubenswrapper[4901]: I0202 10:41:27.012748 4901 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc-client-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:27 crc kubenswrapper[4901]: I0202 10:41:27.012756 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:27 crc kubenswrapper[4901]: I0202 10:41:27.338867 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"590eb5f0-878f-4cd1-9acd-cd1eb115a6e9","Type":"ContainerDied","Data":"34559cc434cdfcd342e26c8f33df8d5f42336519bf6485418c91d46fae9fd55e"} Feb 02 10:41:27 crc kubenswrapper[4901]: I0202 10:41:27.338914 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34559cc434cdfcd342e26c8f33df8d5f42336519bf6485418c91d46fae9fd55e" Feb 02 10:41:27 crc kubenswrapper[4901]: I0202 10:41:27.338949 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 02 10:41:27 crc kubenswrapper[4901]: I0202 10:41:27.340284 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-b6s7r" event={"ID":"d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc","Type":"ContainerDied","Data":"9eac68ee16a6efbf3891c4330bd1776997a669b9e2b095b1bb6e7f60962ffc41"} Feb 02 10:41:27 crc kubenswrapper[4901]: I0202 10:41:27.340357 4901 scope.go:117] "RemoveContainer" containerID="ef8420ea52ebc18dc7827ffa006f795a9b255185a1eb89a7606f2cab38d3fb00" Feb 02 10:41:27 crc kubenswrapper[4901]: I0202 10:41:27.340488 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-b6s7r" Feb 02 10:41:27 crc kubenswrapper[4901]: I0202 10:41:27.374180 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-b6s7r"] Feb 02 10:41:27 crc kubenswrapper[4901]: I0202 10:41:27.379672 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-b6s7r"] Feb 02 10:41:27 crc kubenswrapper[4901]: I0202 10:41:27.683607 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc" path="/var/lib/kubelet/pods/d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc/volumes" Feb 02 10:41:28 crc kubenswrapper[4901]: I0202 10:41:28.915001 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7cc78794d4-mrpsv"] Feb 02 10:41:28 crc kubenswrapper[4901]: E0202 10:41:28.915981 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc" containerName="controller-manager" Feb 02 10:41:28 crc kubenswrapper[4901]: I0202 10:41:28.915996 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc" containerName="controller-manager" Feb 02 10:41:28 crc kubenswrapper[4901]: E0202 10:41:28.916009 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="590eb5f0-878f-4cd1-9acd-cd1eb115a6e9" containerName="pruner" Feb 02 10:41:28 crc kubenswrapper[4901]: I0202 10:41:28.916016 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="590eb5f0-878f-4cd1-9acd-cd1eb115a6e9" containerName="pruner" Feb 02 10:41:28 crc kubenswrapper[4901]: E0202 10:41:28.916036 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="409abb18-6198-483b-aab5-f9ae4469d1bb" containerName="pruner" Feb 02 10:41:28 crc kubenswrapper[4901]: I0202 10:41:28.916042 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="409abb18-6198-483b-aab5-f9ae4469d1bb" containerName="pruner" Feb 02 10:41:28 crc kubenswrapper[4901]: I0202 10:41:28.916127 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4ce91ed-4f91-4a92-a83b-f9c6d45a81dc" containerName="controller-manager" Feb 02 10:41:28 crc kubenswrapper[4901]: I0202 10:41:28.916139 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="409abb18-6198-483b-aab5-f9ae4469d1bb" containerName="pruner" Feb 02 10:41:28 crc kubenswrapper[4901]: I0202 10:41:28.916148 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="590eb5f0-878f-4cd1-9acd-cd1eb115a6e9" containerName="pruner" Feb 02 10:41:28 crc kubenswrapper[4901]: I0202 10:41:28.916502 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7cc78794d4-mrpsv" Feb 02 10:41:28 crc kubenswrapper[4901]: I0202 10:41:28.923551 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 02 10:41:28 crc kubenswrapper[4901]: I0202 10:41:28.923881 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 02 10:41:28 crc kubenswrapper[4901]: I0202 10:41:28.924043 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 02 10:41:28 crc kubenswrapper[4901]: I0202 10:41:28.924168 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 02 10:41:28 crc kubenswrapper[4901]: I0202 10:41:28.926276 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 02 10:41:28 crc kubenswrapper[4901]: I0202 10:41:28.926732 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 02 10:41:28 crc kubenswrapper[4901]: I0202 10:41:28.926882 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7cc78794d4-mrpsv"] Feb 02 10:41:28 crc kubenswrapper[4901]: I0202 10:41:28.928868 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 02 10:41:28 crc kubenswrapper[4901]: I0202 10:41:28.950027 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a7eda50c-0532-40e5-bcb7-717f52f6d735-proxy-ca-bundles\") pod \"controller-manager-7cc78794d4-mrpsv\" (UID: \"a7eda50c-0532-40e5-bcb7-717f52f6d735\") " pod="openshift-controller-manager/controller-manager-7cc78794d4-mrpsv" Feb 02 10:41:28 crc kubenswrapper[4901]: I0202 10:41:28.950124 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a7eda50c-0532-40e5-bcb7-717f52f6d735-client-ca\") pod \"controller-manager-7cc78794d4-mrpsv\" (UID: \"a7eda50c-0532-40e5-bcb7-717f52f6d735\") " pod="openshift-controller-manager/controller-manager-7cc78794d4-mrpsv" Feb 02 10:41:28 crc kubenswrapper[4901]: I0202 10:41:28.950149 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7eda50c-0532-40e5-bcb7-717f52f6d735-serving-cert\") pod \"controller-manager-7cc78794d4-mrpsv\" (UID: \"a7eda50c-0532-40e5-bcb7-717f52f6d735\") " pod="openshift-controller-manager/controller-manager-7cc78794d4-mrpsv" Feb 02 10:41:28 crc kubenswrapper[4901]: I0202 10:41:28.950194 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh6xf\" (UniqueName: \"kubernetes.io/projected/a7eda50c-0532-40e5-bcb7-717f52f6d735-kube-api-access-mh6xf\") pod \"controller-manager-7cc78794d4-mrpsv\" (UID: \"a7eda50c-0532-40e5-bcb7-717f52f6d735\") " pod="openshift-controller-manager/controller-manager-7cc78794d4-mrpsv" Feb 02 10:41:28 crc kubenswrapper[4901]: I0202 10:41:28.950351 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a7eda50c-0532-40e5-bcb7-717f52f6d735-config\") pod \"controller-manager-7cc78794d4-mrpsv\" (UID: \"a7eda50c-0532-40e5-bcb7-717f52f6d735\") " pod="openshift-controller-manager/controller-manager-7cc78794d4-mrpsv" Feb 02 10:41:29 crc kubenswrapper[4901]: I0202 10:41:29.051342 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a7eda50c-0532-40e5-bcb7-717f52f6d735-proxy-ca-bundles\") pod \"controller-manager-7cc78794d4-mrpsv\" (UID: \"a7eda50c-0532-40e5-bcb7-717f52f6d735\") " pod="openshift-controller-manager/controller-manager-7cc78794d4-mrpsv" Feb 02 10:41:29 crc kubenswrapper[4901]: I0202 10:41:29.051835 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a7eda50c-0532-40e5-bcb7-717f52f6d735-client-ca\") pod \"controller-manager-7cc78794d4-mrpsv\" (UID: \"a7eda50c-0532-40e5-bcb7-717f52f6d735\") " pod="openshift-controller-manager/controller-manager-7cc78794d4-mrpsv" Feb 02 10:41:29 crc kubenswrapper[4901]: I0202 10:41:29.051857 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7eda50c-0532-40e5-bcb7-717f52f6d735-serving-cert\") pod \"controller-manager-7cc78794d4-mrpsv\" (UID: \"a7eda50c-0532-40e5-bcb7-717f52f6d735\") " pod="openshift-controller-manager/controller-manager-7cc78794d4-mrpsv" Feb 02 10:41:29 crc kubenswrapper[4901]: I0202 10:41:29.051897 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh6xf\" (UniqueName: \"kubernetes.io/projected/a7eda50c-0532-40e5-bcb7-717f52f6d735-kube-api-access-mh6xf\") pod \"controller-manager-7cc78794d4-mrpsv\" (UID: \"a7eda50c-0532-40e5-bcb7-717f52f6d735\") " pod="openshift-controller-manager/controller-manager-7cc78794d4-mrpsv" Feb 02 10:41:29 crc kubenswrapper[4901]: I0202 10:41:29.051930 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7eda50c-0532-40e5-bcb7-717f52f6d735-config\") pod \"controller-manager-7cc78794d4-mrpsv\" (UID: \"a7eda50c-0532-40e5-bcb7-717f52f6d735\") " pod="openshift-controller-manager/controller-manager-7cc78794d4-mrpsv" Feb 02 10:41:29 crc kubenswrapper[4901]: I0202 10:41:29.053386 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a7eda50c-0532-40e5-bcb7-717f52f6d735-client-ca\") pod \"controller-manager-7cc78794d4-mrpsv\" (UID: \"a7eda50c-0532-40e5-bcb7-717f52f6d735\") " pod="openshift-controller-manager/controller-manager-7cc78794d4-mrpsv" Feb 02 10:41:29 crc kubenswrapper[4901]: I0202 10:41:29.053703 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a7eda50c-0532-40e5-bcb7-717f52f6d735-proxy-ca-bundles\") pod \"controller-manager-7cc78794d4-mrpsv\" (UID: \"a7eda50c-0532-40e5-bcb7-717f52f6d735\") " pod="openshift-controller-manager/controller-manager-7cc78794d4-mrpsv" Feb 02 10:41:29 crc kubenswrapper[4901]: I0202 10:41:29.054253 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7eda50c-0532-40e5-bcb7-717f52f6d735-config\") pod \"controller-manager-7cc78794d4-mrpsv\" (UID: \"a7eda50c-0532-40e5-bcb7-717f52f6d735\") " pod="openshift-controller-manager/controller-manager-7cc78794d4-mrpsv" 
Feb 02 10:41:29 crc kubenswrapper[4901]: I0202 10:41:29.060518 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7eda50c-0532-40e5-bcb7-717f52f6d735-serving-cert\") pod \"controller-manager-7cc78794d4-mrpsv\" (UID: \"a7eda50c-0532-40e5-bcb7-717f52f6d735\") " pod="openshift-controller-manager/controller-manager-7cc78794d4-mrpsv" Feb 02 10:41:29 crc kubenswrapper[4901]: I0202 10:41:29.070160 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh6xf\" (UniqueName: \"kubernetes.io/projected/a7eda50c-0532-40e5-bcb7-717f52f6d735-kube-api-access-mh6xf\") pod \"controller-manager-7cc78794d4-mrpsv\" (UID: \"a7eda50c-0532-40e5-bcb7-717f52f6d735\") " pod="openshift-controller-manager/controller-manager-7cc78794d4-mrpsv" Feb 02 10:41:29 crc kubenswrapper[4901]: I0202 10:41:29.254436 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7cc78794d4-mrpsv" Feb 02 10:41:34 crc kubenswrapper[4901]: E0202 10:41:34.941360 4901 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 02 10:41:34 crc kubenswrapper[4901]: I0202 10:41:34.942210 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpwqj" Feb 02 10:41:34 crc kubenswrapper[4901]: E0202 10:41:34.942513 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-56wgf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-b9p28_openshift-marketplace(da3848e5-a20f-4124-856b-d860bea45325): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 02 10:41:34 crc kubenswrapper[4901]: E0202 10:41:34.943770 
4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-b9p28" podUID="da3848e5-a20f-4124-856b-d860bea45325" Feb 02 10:41:34 crc kubenswrapper[4901]: I0202 10:41:34.979021 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c58478884-cs749"] Feb 02 10:41:34 crc kubenswrapper[4901]: E0202 10:41:34.979443 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcd15108-7039-4f2c-a6be-209f7ffbbc30" containerName="route-controller-manager" Feb 02 10:41:34 crc kubenswrapper[4901]: I0202 10:41:34.979457 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcd15108-7039-4f2c-a6be-209f7ffbbc30" containerName="route-controller-manager" Feb 02 10:41:34 crc kubenswrapper[4901]: I0202 10:41:34.979693 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcd15108-7039-4f2c-a6be-209f7ffbbc30" containerName="route-controller-manager" Feb 02 10:41:34 crc kubenswrapper[4901]: I0202 10:41:34.980266 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c58478884-cs749" Feb 02 10:41:34 crc kubenswrapper[4901]: E0202 10:41:34.988773 4901 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 02 10:41:34 crc kubenswrapper[4901]: E0202 10:41:34.989007 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wctjb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-825gs_openshift-marketplace(2327a290-e69a-4a8b-b3af-2b1f02819202): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: 
copying config: context canceled" logger="UnhandledError" Feb 02 10:41:34 crc kubenswrapper[4901]: E0202 10:41:34.990770 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-825gs" podUID="2327a290-e69a-4a8b-b3af-2b1f02819202" Feb 02 10:41:35 crc kubenswrapper[4901]: I0202 10:41:35.001676 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c58478884-cs749"] Feb 02 10:41:35 crc kubenswrapper[4901]: E0202 10:41:35.010165 4901 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 02 10:41:35 crc kubenswrapper[4901]: E0202 10:41:35.010353 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tbts4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-f96jl_openshift-marketplace(e26efe11-5a79-428e-9f34-ac7e0af2b5df): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 02 10:41:35 crc kubenswrapper[4901]: E0202 10:41:35.012140 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-f96jl" podUID="e26efe11-5a79-428e-9f34-ac7e0af2b5df" Feb 02 10:41:35 crc kubenswrapper[4901]: E0202 10:41:35.013755 4901 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 02 10:41:35 crc kubenswrapper[4901]: E0202 10:41:35.026344 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cgw8p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-t4pph_openshift-marketplace(f9e94b03-bd47-46b7-8ae7-addbb2b58bb7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 02 10:41:35 crc kubenswrapper[4901]: E0202 10:41:35.028646 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-t4pph" podUID="f9e94b03-bd47-46b7-8ae7-addbb2b58bb7" Feb 02 10:41:35 crc kubenswrapper[4901]: I0202 10:41:35.102720 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fcd15108-7039-4f2c-a6be-209f7ffbbc30-client-ca\") pod \"fcd15108-7039-4f2c-a6be-209f7ffbbc30\" (UID: \"fcd15108-7039-4f2c-a6be-209f7ffbbc30\") " Feb 02 10:41:35 crc kubenswrapper[4901]: I0202 10:41:35.102798 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcd15108-7039-4f2c-a6be-209f7ffbbc30-config\") pod \"fcd15108-7039-4f2c-a6be-209f7ffbbc30\" (UID: \"fcd15108-7039-4f2c-a6be-209f7ffbbc30\") " Feb 02 10:41:35 crc kubenswrapper[4901]: I0202 10:41:35.102915 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjwws\" (UniqueName: \"kubernetes.io/projected/fcd15108-7039-4f2c-a6be-209f7ffbbc30-kube-api-access-xjwws\") pod \"fcd15108-7039-4f2c-a6be-209f7ffbbc30\" (UID: \"fcd15108-7039-4f2c-a6be-209f7ffbbc30\") " Feb 02 10:41:35 crc kubenswrapper[4901]: I0202 10:41:35.102942 4901 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fcd15108-7039-4f2c-a6be-209f7ffbbc30-serving-cert\") pod \"fcd15108-7039-4f2c-a6be-209f7ffbbc30\" (UID: \"fcd15108-7039-4f2c-a6be-209f7ffbbc30\") " Feb 02 10:41:35 crc kubenswrapper[4901]: I0202 10:41:35.103198 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84ab97df-dc16-4a11-abab-c79a2a91f2c0-serving-cert\") pod \"route-controller-manager-5c58478884-cs749\" (UID: \"84ab97df-dc16-4a11-abab-c79a2a91f2c0\") " pod="openshift-route-controller-manager/route-controller-manager-5c58478884-cs749" Feb 02 10:41:35 crc kubenswrapper[4901]: I0202 10:41:35.103234 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/84ab97df-dc16-4a11-abab-c79a2a91f2c0-client-ca\") pod \"route-controller-manager-5c58478884-cs749\" (UID: \"84ab97df-dc16-4a11-abab-c79a2a91f2c0\") " pod="openshift-route-controller-manager/route-controller-manager-5c58478884-cs749" Feb 02 10:41:35 crc kubenswrapper[4901]: I0202 10:41:35.103258 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84ab97df-dc16-4a11-abab-c79a2a91f2c0-config\") pod \"route-controller-manager-5c58478884-cs749\" (UID: \"84ab97df-dc16-4a11-abab-c79a2a91f2c0\") " pod="openshift-route-controller-manager/route-controller-manager-5c58478884-cs749" Feb 02 10:41:35 crc kubenswrapper[4901]: I0202 10:41:35.103287 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5f4j\" (UniqueName: \"kubernetes.io/projected/84ab97df-dc16-4a11-abab-c79a2a91f2c0-kube-api-access-w5f4j\") pod \"route-controller-manager-5c58478884-cs749\" (UID: \"84ab97df-dc16-4a11-abab-c79a2a91f2c0\") " pod="openshift-route-controller-manager/route-controller-manager-5c58478884-cs749" Feb 02 10:41:35 crc kubenswrapper[4901]: I0202 10:41:35.103770 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcd15108-7039-4f2c-a6be-209f7ffbbc30-client-ca" (OuterVolumeSpecName: "client-ca") pod "fcd15108-7039-4f2c-a6be-209f7ffbbc30" (UID: "fcd15108-7039-4f2c-a6be-209f7ffbbc30"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:41:35 crc kubenswrapper[4901]: I0202 10:41:35.107263 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcd15108-7039-4f2c-a6be-209f7ffbbc30-config" (OuterVolumeSpecName: "config") pod "fcd15108-7039-4f2c-a6be-209f7ffbbc30" (UID: "fcd15108-7039-4f2c-a6be-209f7ffbbc30"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:41:35 crc kubenswrapper[4901]: I0202 10:41:35.119795 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcd15108-7039-4f2c-a6be-209f7ffbbc30-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fcd15108-7039-4f2c-a6be-209f7ffbbc30" (UID: "fcd15108-7039-4f2c-a6be-209f7ffbbc30"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:41:35 crc kubenswrapper[4901]: I0202 10:41:35.119958 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcd15108-7039-4f2c-a6be-209f7ffbbc30-kube-api-access-xjwws" (OuterVolumeSpecName: "kube-api-access-xjwws") pod "fcd15108-7039-4f2c-a6be-209f7ffbbc30" (UID: "fcd15108-7039-4f2c-a6be-209f7ffbbc30"). InnerVolumeSpecName "kube-api-access-xjwws". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:41:35 crc kubenswrapper[4901]: I0202 10:41:35.204777 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84ab97df-dc16-4a11-abab-c79a2a91f2c0-config\") pod \"route-controller-manager-5c58478884-cs749\" (UID: \"84ab97df-dc16-4a11-abab-c79a2a91f2c0\") " pod="openshift-route-controller-manager/route-controller-manager-5c58478884-cs749" Feb 02 10:41:35 crc kubenswrapper[4901]: I0202 10:41:35.208033 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84ab97df-dc16-4a11-abab-c79a2a91f2c0-config\") pod \"route-controller-manager-5c58478884-cs749\" (UID: \"84ab97df-dc16-4a11-abab-c79a2a91f2c0\") " pod="openshift-route-controller-manager/route-controller-manager-5c58478884-cs749" Feb 02 10:41:35 crc kubenswrapper[4901]: I0202 10:41:35.208421 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5f4j\" (UniqueName: \"kubernetes.io/projected/84ab97df-dc16-4a11-abab-c79a2a91f2c0-kube-api-access-w5f4j\") pod \"route-controller-manager-5c58478884-cs749\" (UID: \"84ab97df-dc16-4a11-abab-c79a2a91f2c0\") " pod="openshift-route-controller-manager/route-controller-manager-5c58478884-cs749" Feb 02 10:41:35 crc kubenswrapper[4901]: I0202 10:41:35.208612 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84ab97df-dc16-4a11-abab-c79a2a91f2c0-serving-cert\") pod \"route-controller-manager-5c58478884-cs749\" (UID: \"84ab97df-dc16-4a11-abab-c79a2a91f2c0\") " pod="openshift-route-controller-manager/route-controller-manager-5c58478884-cs749" Feb 02 10:41:35 crc kubenswrapper[4901]: I0202 10:41:35.208633 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/84ab97df-dc16-4a11-abab-c79a2a91f2c0-client-ca\") pod \"route-controller-manager-5c58478884-cs749\" (UID: \"84ab97df-dc16-4a11-abab-c79a2a91f2c0\") " pod="openshift-route-controller-manager/route-controller-manager-5c58478884-cs749" Feb 02 10:41:35 crc kubenswrapper[4901]: I0202 10:41:35.208691 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjwws\" (UniqueName: \"kubernetes.io/projected/fcd15108-7039-4f2c-a6be-209f7ffbbc30-kube-api-access-xjwws\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:35 crc kubenswrapper[4901]: I0202 10:41:35.208702 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fcd15108-7039-4f2c-a6be-209f7ffbbc30-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:35 crc kubenswrapper[4901]: I0202 10:41:35.208713 4901 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fcd15108-7039-4f2c-a6be-209f7ffbbc30-client-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:35 crc kubenswrapper[4901]: I0202 10:41:35.208732 4901 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcd15108-7039-4f2c-a6be-209f7ffbbc30-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:35 crc kubenswrapper[4901]: I0202 10:41:35.209272 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/84ab97df-dc16-4a11-abab-c79a2a91f2c0-client-ca\") pod \"route-controller-manager-5c58478884-cs749\" (UID: \"84ab97df-dc16-4a11-abab-c79a2a91f2c0\") " pod="openshift-route-controller-manager/route-controller-manager-5c58478884-cs749" Feb 02 10:41:35 crc kubenswrapper[4901]: I0202 10:41:35.215523 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84ab97df-dc16-4a11-abab-c79a2a91f2c0-serving-cert\") pod \"route-controller-manager-5c58478884-cs749\" (UID: \"84ab97df-dc16-4a11-abab-c79a2a91f2c0\") " pod="openshift-route-controller-manager/route-controller-manager-5c58478884-cs749" Feb 02 10:41:35 crc kubenswrapper[4901]: I0202 10:41:35.216339 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fmjwg"] Feb 02 10:41:35 crc kubenswrapper[4901]: W0202 10:41:35.225508 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb96d903e_a64c_4321_8963_482d4b579e30.slice/crio-1ce58a10075f1f2baeab575a1291b6ff5339b336a0550ec13e8cd8c341a8114e WatchSource:0}: Error finding container 1ce58a10075f1f2baeab575a1291b6ff5339b336a0550ec13e8cd8c341a8114e: Status 404 returned error can't find the container with id 1ce58a10075f1f2baeab575a1291b6ff5339b336a0550ec13e8cd8c341a8114e Feb 02 10:41:35 crc kubenswrapper[4901]: I0202 10:41:35.227714 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5f4j\" (UniqueName: \"kubernetes.io/projected/84ab97df-dc16-4a11-abab-c79a2a91f2c0-kube-api-access-w5f4j\") pod \"route-controller-manager-5c58478884-cs749\" (UID: \"84ab97df-dc16-4a11-abab-c79a2a91f2c0\") " pod="openshift-route-controller-manager/route-controller-manager-5c58478884-cs749" Feb 02 10:41:35 crc kubenswrapper[4901]: E0202 10:41:35.251171 4901 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 02 10:41:35 crc kubenswrapper[4901]: E0202 10:41:35.251323 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tcrt8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-79vj8_openshift-marketplace(ebacceb9-418b-4af4-9511-007595694dc2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 02 10:41:35 crc kubenswrapper[4901]: E0202 10:41:35.254359 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-79vj8" podUID="ebacceb9-418b-4af4-9511-007595694dc2" Feb 02 10:41:35 crc kubenswrapper[4901]: I0202 10:41:35.306684 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c58478884-cs749" Feb 02 10:41:35 crc kubenswrapper[4901]: I0202 10:41:35.357776 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7cc78794d4-mrpsv"] Feb 02 10:41:35 crc kubenswrapper[4901]: I0202 10:41:35.394996 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cc78794d4-mrpsv" event={"ID":"a7eda50c-0532-40e5-bcb7-717f52f6d735","Type":"ContainerStarted","Data":"1b528b563ae8a19f57ebb4f647faff052e27bdb2ece0979d6bc1dd7f6f1a1d63"} Feb 02 10:41:35 crc kubenswrapper[4901]: I0202 10:41:35.400443 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mvpxf" event={"ID":"5c3abe02-a443-49e5-9c6c-b2f2655c91e7","Type":"ContainerStarted","Data":"ba00179a417926765ff8a0df962d718b65ed6fb3026bdb750fce966e4609e18d"} Feb 02 10:41:35 crc kubenswrapper[4901]: I0202 10:41:35.409601 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2t7g" event={"ID":"81767c78-c6a0-4a68-ab07-98eeaf3e9483","Type":"ContainerStarted","Data":"e2b9b7f9aa2c7a04864678ecf7fc8acb97d34947628a672cd8bd1306162d694c"} Feb 02 10:41:35 crc kubenswrapper[4901]: I0202 10:41:35.415792 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9xg9m" event={"ID":"67734bbd-4400-4478-b63f-5eff579a1f3d","Type":"ContainerStarted","Data":"320d9294b82c7bc3b999e91f341b6ca532165960a7e6b0e297b4b56f274ec11e"} Feb 02 10:41:35 crc kubenswrapper[4901]: I0202 10:41:35.431373 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpwqj" event={"ID":"fcd15108-7039-4f2c-a6be-209f7ffbbc30","Type":"ContainerDied","Data":"722220c0b79f83453ce5318b0af7043a1613d50183aaca36e4271667d55a2073"} Feb 02 10:41:35 crc kubenswrapper[4901]: I0202 10:41:35.431459 4901 scope.go:117] "RemoveContainer" containerID="0ddb8b045fab7427511e861c6bf84e55298ab3cf2035b4858b453f2862e29418" Feb 02 10:41:35 crc kubenswrapper[4901]: I0202 10:41:35.431451 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpwqj" Feb 02 10:41:35 crc kubenswrapper[4901]: I0202 10:41:35.443113 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fmjwg" event={"ID":"b96d903e-a64c-4321-8963-482d4b579e30","Type":"ContainerStarted","Data":"1ce58a10075f1f2baeab575a1291b6ff5339b336a0550ec13e8cd8c341a8114e"} Feb 02 10:41:35 crc kubenswrapper[4901]: E0202 10:41:35.453524 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-b9p28" podUID="da3848e5-a20f-4124-856b-d860bea45325" Feb 02 10:41:35 crc kubenswrapper[4901]: E0202 10:41:35.454074 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-79vj8" podUID="ebacceb9-418b-4af4-9511-007595694dc2" Feb 02 10:41:35 crc kubenswrapper[4901]: E0202 10:41:35.454208 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-825gs" podUID="2327a290-e69a-4a8b-b3af-2b1f02819202" Feb 02 10:41:35 crc kubenswrapper[4901]: E0202 10:41:35.459198 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-f96jl" podUID="e26efe11-5a79-428e-9f34-ac7e0af2b5df" Feb 02 10:41:35 crc kubenswrapper[4901]: E0202 10:41:35.459607 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-t4pph" podUID="f9e94b03-bd47-46b7-8ae7-addbb2b58bb7" Feb 02 10:41:35 crc kubenswrapper[4901]: I0202 10:41:35.607794 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpwqj"] Feb 02 10:41:35 crc kubenswrapper[4901]: I0202 10:41:35.611966 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpwqj"] Feb 02 10:41:35 crc kubenswrapper[4901]: I0202 10:41:35.682792 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcd15108-7039-4f2c-a6be-209f7ffbbc30" path="/var/lib/kubelet/pods/fcd15108-7039-4f2c-a6be-209f7ffbbc30/volumes" Feb 02 10:41:35 crc kubenswrapper[4901]: I0202 10:41:35.782081 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c58478884-cs749"] Feb 02 10:41:35 crc kubenswrapper[4901]: W0202 10:41:35.788874 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84ab97df_dc16_4a11_abab_c79a2a91f2c0.slice/crio-b55e0e330d798bb620708a5654427a5953cad454fe328920824f5873782c7f39 
WatchSource:0}: Error finding container b55e0e330d798bb620708a5654427a5953cad454fe328920824f5873782c7f39: Status 404 returned error can't find the container with id b55e0e330d798bb620708a5654427a5953cad454fe328920824f5873782c7f39 Feb 02 10:41:36 crc kubenswrapper[4901]: I0202 10:41:36.085769 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bmwhx" Feb 02 10:41:36 crc kubenswrapper[4901]: I0202 10:41:36.449357 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c58478884-cs749" event={"ID":"84ab97df-dc16-4a11-abab-c79a2a91f2c0","Type":"ContainerStarted","Data":"3b559ce8598fa2ea04fbbe1fc3edf91b997e1ec028238a65304625746cc1e048"} Feb 02 10:41:36 crc kubenswrapper[4901]: I0202 10:41:36.449436 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c58478884-cs749" event={"ID":"84ab97df-dc16-4a11-abab-c79a2a91f2c0","Type":"ContainerStarted","Data":"b55e0e330d798bb620708a5654427a5953cad454fe328920824f5873782c7f39"} Feb 02 10:41:36 crc kubenswrapper[4901]: I0202 10:41:36.450989 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cc78794d4-mrpsv" event={"ID":"a7eda50c-0532-40e5-bcb7-717f52f6d735","Type":"ContainerStarted","Data":"4084b1793cce25dd576e3f6b6d6a29b56866ef44a94647ece70b6d02169b38a7"} Feb 02 10:41:36 crc kubenswrapper[4901]: I0202 10:41:36.451293 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7cc78794d4-mrpsv" Feb 02 10:41:36 crc kubenswrapper[4901]: I0202 10:41:36.453450 4901 generic.go:334] "Generic (PLEG): container finished" podID="5c3abe02-a443-49e5-9c6c-b2f2655c91e7" containerID="ba00179a417926765ff8a0df962d718b65ed6fb3026bdb750fce966e4609e18d" exitCode=0 Feb 02 10:41:36 crc kubenswrapper[4901]: I0202 10:41:36.453489 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mvpxf" event={"ID":"5c3abe02-a443-49e5-9c6c-b2f2655c91e7","Type":"ContainerDied","Data":"ba00179a417926765ff8a0df962d718b65ed6fb3026bdb750fce966e4609e18d"} Feb 02 10:41:36 crc kubenswrapper[4901]: I0202 10:41:36.456133 4901 generic.go:334] "Generic (PLEG): container finished" podID="81767c78-c6a0-4a68-ab07-98eeaf3e9483" containerID="e2b9b7f9aa2c7a04864678ecf7fc8acb97d34947628a672cd8bd1306162d694c" exitCode=0 Feb 02 10:41:36 crc kubenswrapper[4901]: I0202 10:41:36.456195 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2t7g" event={"ID":"81767c78-c6a0-4a68-ab07-98eeaf3e9483","Type":"ContainerDied","Data":"e2b9b7f9aa2c7a04864678ecf7fc8acb97d34947628a672cd8bd1306162d694c"} Feb 02 10:41:36 crc kubenswrapper[4901]: I0202 10:41:36.457516 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7cc78794d4-mrpsv" Feb 02 10:41:36 crc kubenswrapper[4901]: I0202 10:41:36.461076 4901 generic.go:334] "Generic (PLEG): container finished" podID="67734bbd-4400-4478-b63f-5eff579a1f3d" containerID="320d9294b82c7bc3b999e91f341b6ca532165960a7e6b0e297b4b56f274ec11e" exitCode=0 Feb 02 10:41:36 crc kubenswrapper[4901]: I0202 10:41:36.461162 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9xg9m" 
event={"ID":"67734bbd-4400-4478-b63f-5eff579a1f3d","Type":"ContainerDied","Data":"320d9294b82c7bc3b999e91f341b6ca532165960a7e6b0e297b4b56f274ec11e"} Feb 02 10:41:36 crc kubenswrapper[4901]: I0202 10:41:36.471089 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5c58478884-cs749" podStartSLOduration=17.470460186 podStartE2EDuration="17.470460186s" podCreationTimestamp="2026-02-02 10:41:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:41:36.463725377 +0000 UTC m=+183.482065503" watchObservedRunningTime="2026-02-02 10:41:36.470460186 +0000 UTC m=+183.488800282" Feb 02 10:41:36 crc kubenswrapper[4901]: I0202 10:41:36.471194 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fmjwg" event={"ID":"b96d903e-a64c-4321-8963-482d4b579e30","Type":"ContainerStarted","Data":"c78a27793b157ecce30d10102225f88031fd3d13b8c2f2261ec72535a91e24e7"} Feb 02 10:41:36 crc kubenswrapper[4901]: I0202 10:41:36.472040 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fmjwg" event={"ID":"b96d903e-a64c-4321-8963-482d4b579e30","Type":"ContainerStarted","Data":"23a89d929854afdd48eb59db08c47cee7c2f9cc658c2483c006042d661446ac8"} Feb 02 10:41:36 crc kubenswrapper[4901]: I0202 10:41:36.482545 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7cc78794d4-mrpsv" podStartSLOduration=17.482530751 podStartE2EDuration="17.482530751s" podCreationTimestamp="2026-02-02 10:41:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:41:36.482267435 +0000 UTC m=+183.500607551" watchObservedRunningTime="2026-02-02 10:41:36.482530751 +0000 UTC m=+183.500870847" Feb 02 10:41:36 crc kubenswrapper[4901]: I0202 10:41:36.573740 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-fmjwg" podStartSLOduration=161.573719125 podStartE2EDuration="2m41.573719125s" podCreationTimestamp="2026-02-02 10:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:41:36.561775443 +0000 UTC m=+183.580115549" watchObservedRunningTime="2026-02-02 10:41:36.573719125 +0000 UTC m=+183.592059221" Feb 02 10:41:37 crc kubenswrapper[4901]: I0202 10:41:37.478793 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mvpxf" event={"ID":"5c3abe02-a443-49e5-9c6c-b2f2655c91e7","Type":"ContainerStarted","Data":"4c894beb846deae3f5edddc5e464b8ed8c07a087ec7339dd648e4ede6e78ef93"} Feb 02 10:41:37 crc kubenswrapper[4901]: I0202 10:41:37.481774 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2t7g" event={"ID":"81767c78-c6a0-4a68-ab07-98eeaf3e9483","Type":"ContainerStarted","Data":"93e1e1bb3049665a9d7e6d3d7a7eec4764a95abcd512bc8162fc1d653f32fc55"} Feb 02 10:41:37 crc kubenswrapper[4901]: I0202 10:41:37.485116 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9xg9m" event={"ID":"67734bbd-4400-4478-b63f-5eff579a1f3d","Type":"ContainerStarted","Data":"e796c96a0affeccb4c26ba11228aa5fb1e606a7a2d82e07a69b83a6175c4bfeb"} Feb 02 
10:41:37 crc kubenswrapper[4901]: I0202 10:41:37.486030 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5c58478884-cs749" Feb 02 10:41:37 crc kubenswrapper[4901]: I0202 10:41:37.495938 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5c58478884-cs749" Feb 02 10:41:37 crc kubenswrapper[4901]: I0202 10:41:37.499225 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mvpxf" podStartSLOduration=2.475375747 podStartE2EDuration="32.49920413s" podCreationTimestamp="2026-02-02 10:41:05 +0000 UTC" firstStartedPulling="2026-02-02 10:41:06.971071387 +0000 UTC m=+153.989411483" lastFinishedPulling="2026-02-02 10:41:36.99489976 +0000 UTC m=+184.013239866" observedRunningTime="2026-02-02 10:41:37.496221009 +0000 UTC m=+184.514561105" watchObservedRunningTime="2026-02-02 10:41:37.49920413 +0000 UTC m=+184.517544226" Feb 02 10:41:37 crc kubenswrapper[4901]: I0202 10:41:37.574285 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9xg9m" podStartSLOduration=3.551059702 podStartE2EDuration="31.574265422s" podCreationTimestamp="2026-02-02 10:41:06 +0000 UTC" firstStartedPulling="2026-02-02 10:41:09.056406972 +0000 UTC m=+156.074747068" lastFinishedPulling="2026-02-02 10:41:37.079612692 +0000 UTC m=+184.097952788" observedRunningTime="2026-02-02 10:41:37.533980181 +0000 UTC m=+184.552320277" watchObservedRunningTime="2026-02-02 10:41:37.574265422 +0000 UTC m=+184.592605518" Feb 02 10:41:37 crc kubenswrapper[4901]: I0202 10:41:37.576058 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s2t7g" podStartSLOduration=2.595498822 podStartE2EDuration="34.576047894s" podCreationTimestamp="2026-02-02 10:41:03 +0000 UTC" firstStartedPulling="2026-02-02 10:41:04.942275397 +0000 UTC m=+151.960615493" lastFinishedPulling="2026-02-02 10:41:36.922824469 +0000 UTC m=+183.941164565" observedRunningTime="2026-02-02 10:41:37.572682915 +0000 UTC m=+184.591023011" watchObservedRunningTime="2026-02-02 10:41:37.576047894 +0000 UTC m=+184.594387990" Feb 02 10:41:37 crc kubenswrapper[4901]: I0202 10:41:37.837669 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:41:37 crc kubenswrapper[4901]: I0202 10:41:37.837769 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:41:39 crc kubenswrapper[4901]: I0202 10:41:39.701718 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7cc78794d4-mrpsv"] Feb 02 10:41:39 crc kubenswrapper[4901]: I0202 10:41:39.701989 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7cc78794d4-mrpsv" podUID="a7eda50c-0532-40e5-bcb7-717f52f6d735" containerName="controller-manager" 
containerID="cri-o://4084b1793cce25dd576e3f6b6d6a29b56866ef44a94647ece70b6d02169b38a7" gracePeriod=30 Feb 02 10:41:39 crc kubenswrapper[4901]: I0202 10:41:39.800700 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c58478884-cs749"] Feb 02 10:41:40 crc kubenswrapper[4901]: I0202 10:41:40.191626 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7cc78794d4-mrpsv" Feb 02 10:41:40 crc kubenswrapper[4901]: I0202 10:41:40.383410 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7eda50c-0532-40e5-bcb7-717f52f6d735-serving-cert\") pod \"a7eda50c-0532-40e5-bcb7-717f52f6d735\" (UID: \"a7eda50c-0532-40e5-bcb7-717f52f6d735\") " Feb 02 10:41:40 crc kubenswrapper[4901]: I0202 10:41:40.383469 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a7eda50c-0532-40e5-bcb7-717f52f6d735-client-ca\") pod \"a7eda50c-0532-40e5-bcb7-717f52f6d735\" (UID: \"a7eda50c-0532-40e5-bcb7-717f52f6d735\") " Feb 02 10:41:40 crc kubenswrapper[4901]: I0202 10:41:40.383516 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a7eda50c-0532-40e5-bcb7-717f52f6d735-proxy-ca-bundles\") pod \"a7eda50c-0532-40e5-bcb7-717f52f6d735\" (UID: \"a7eda50c-0532-40e5-bcb7-717f52f6d735\") " Feb 02 10:41:40 crc kubenswrapper[4901]: I0202 10:41:40.383739 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mh6xf\" (UniqueName: \"kubernetes.io/projected/a7eda50c-0532-40e5-bcb7-717f52f6d735-kube-api-access-mh6xf\") pod \"a7eda50c-0532-40e5-bcb7-717f52f6d735\" (UID: \"a7eda50c-0532-40e5-bcb7-717f52f6d735\") " Feb 02 10:41:40 crc kubenswrapper[4901]: I0202 10:41:40.383792 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7eda50c-0532-40e5-bcb7-717f52f6d735-config\") pod \"a7eda50c-0532-40e5-bcb7-717f52f6d735\" (UID: \"a7eda50c-0532-40e5-bcb7-717f52f6d735\") " Feb 02 10:41:40 crc kubenswrapper[4901]: I0202 10:41:40.384082 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7eda50c-0532-40e5-bcb7-717f52f6d735-client-ca" (OuterVolumeSpecName: "client-ca") pod "a7eda50c-0532-40e5-bcb7-717f52f6d735" (UID: "a7eda50c-0532-40e5-bcb7-717f52f6d735"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:41:40 crc kubenswrapper[4901]: I0202 10:41:40.384161 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7eda50c-0532-40e5-bcb7-717f52f6d735-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a7eda50c-0532-40e5-bcb7-717f52f6d735" (UID: "a7eda50c-0532-40e5-bcb7-717f52f6d735"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:41:40 crc kubenswrapper[4901]: I0202 10:41:40.384340 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7eda50c-0532-40e5-bcb7-717f52f6d735-config" (OuterVolumeSpecName: "config") pod "a7eda50c-0532-40e5-bcb7-717f52f6d735" (UID: "a7eda50c-0532-40e5-bcb7-717f52f6d735"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:41:40 crc kubenswrapper[4901]: I0202 10:41:40.394858 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7eda50c-0532-40e5-bcb7-717f52f6d735-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a7eda50c-0532-40e5-bcb7-717f52f6d735" (UID: "a7eda50c-0532-40e5-bcb7-717f52f6d735"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:41:40 crc kubenswrapper[4901]: I0202 10:41:40.394882 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7eda50c-0532-40e5-bcb7-717f52f6d735-kube-api-access-mh6xf" (OuterVolumeSpecName: "kube-api-access-mh6xf") pod "a7eda50c-0532-40e5-bcb7-717f52f6d735" (UID: "a7eda50c-0532-40e5-bcb7-717f52f6d735"). InnerVolumeSpecName "kube-api-access-mh6xf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:41:40 crc kubenswrapper[4901]: I0202 10:41:40.485159 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7eda50c-0532-40e5-bcb7-717f52f6d735-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:40 crc kubenswrapper[4901]: I0202 10:41:40.485190 4901 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a7eda50c-0532-40e5-bcb7-717f52f6d735-client-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:40 crc kubenswrapper[4901]: I0202 10:41:40.485198 4901 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a7eda50c-0532-40e5-bcb7-717f52f6d735-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:40 crc kubenswrapper[4901]: I0202 10:41:40.485209 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mh6xf\" (UniqueName: \"kubernetes.io/projected/a7eda50c-0532-40e5-bcb7-717f52f6d735-kube-api-access-mh6xf\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:40 crc kubenswrapper[4901]: I0202 10:41:40.485217 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7eda50c-0532-40e5-bcb7-717f52f6d735-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:40 crc kubenswrapper[4901]: I0202 10:41:40.507001 4901 generic.go:334] "Generic (PLEG): container finished" podID="a7eda50c-0532-40e5-bcb7-717f52f6d735" containerID="4084b1793cce25dd576e3f6b6d6a29b56866ef44a94647ece70b6d02169b38a7" exitCode=0 Feb 02 10:41:40 crc kubenswrapper[4901]: I0202 10:41:40.507063 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7cc78794d4-mrpsv" Feb 02 10:41:40 crc kubenswrapper[4901]: I0202 10:41:40.507112 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cc78794d4-mrpsv" event={"ID":"a7eda50c-0532-40e5-bcb7-717f52f6d735","Type":"ContainerDied","Data":"4084b1793cce25dd576e3f6b6d6a29b56866ef44a94647ece70b6d02169b38a7"} Feb 02 10:41:40 crc kubenswrapper[4901]: I0202 10:41:40.507148 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5c58478884-cs749" podUID="84ab97df-dc16-4a11-abab-c79a2a91f2c0" containerName="route-controller-manager" containerID="cri-o://3b559ce8598fa2ea04fbbe1fc3edf91b997e1ec028238a65304625746cc1e048" gracePeriod=30 Feb 02 10:41:40 crc kubenswrapper[4901]: I0202 10:41:40.507186 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cc78794d4-mrpsv" event={"ID":"a7eda50c-0532-40e5-bcb7-717f52f6d735","Type":"ContainerDied","Data":"1b528b563ae8a19f57ebb4f647faff052e27bdb2ece0979d6bc1dd7f6f1a1d63"} Feb 02 10:41:40 crc kubenswrapper[4901]: I0202 10:41:40.507214 4901 scope.go:117] "RemoveContainer" containerID="4084b1793cce25dd576e3f6b6d6a29b56866ef44a94647ece70b6d02169b38a7" Feb 02 10:41:40 crc kubenswrapper[4901]: I0202 10:41:40.527941 4901 scope.go:117] "RemoveContainer" containerID="4084b1793cce25dd576e3f6b6d6a29b56866ef44a94647ece70b6d02169b38a7" Feb 02 10:41:40 crc kubenswrapper[4901]: E0202 10:41:40.528358 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4084b1793cce25dd576e3f6b6d6a29b56866ef44a94647ece70b6d02169b38a7\": container with ID starting with 4084b1793cce25dd576e3f6b6d6a29b56866ef44a94647ece70b6d02169b38a7 not found: ID does not exist" containerID="4084b1793cce25dd576e3f6b6d6a29b56866ef44a94647ece70b6d02169b38a7" Feb 02 10:41:40 crc kubenswrapper[4901]: I0202 10:41:40.528388 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4084b1793cce25dd576e3f6b6d6a29b56866ef44a94647ece70b6d02169b38a7"} err="failed to get container status \"4084b1793cce25dd576e3f6b6d6a29b56866ef44a94647ece70b6d02169b38a7\": rpc error: code = NotFound desc = could not find container \"4084b1793cce25dd576e3f6b6d6a29b56866ef44a94647ece70b6d02169b38a7\": container with ID starting with 4084b1793cce25dd576e3f6b6d6a29b56866ef44a94647ece70b6d02169b38a7 not found: ID does not exist" Feb 02 10:41:40 crc kubenswrapper[4901]: I0202 10:41:40.540528 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7cc78794d4-mrpsv"] Feb 02 10:41:40 crc kubenswrapper[4901]: I0202 10:41:40.543004 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7cc78794d4-mrpsv"] Feb 02 10:41:40 crc kubenswrapper[4901]: I0202 10:41:40.925206 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6f6df6bd96-cc728"] Feb 02 10:41:40 crc kubenswrapper[4901]: E0202 10:41:40.925836 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7eda50c-0532-40e5-bcb7-717f52f6d735" containerName="controller-manager" Feb 02 10:41:40 crc kubenswrapper[4901]: I0202 10:41:40.925853 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7eda50c-0532-40e5-bcb7-717f52f6d735" 
containerName="controller-manager" Feb 02 10:41:40 crc kubenswrapper[4901]: I0202 10:41:40.925972 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7eda50c-0532-40e5-bcb7-717f52f6d735" containerName="controller-manager" Feb 02 10:41:40 crc kubenswrapper[4901]: I0202 10:41:40.926408 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f6df6bd96-cc728" Feb 02 10:41:40 crc kubenswrapper[4901]: I0202 10:41:40.929106 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 02 10:41:40 crc kubenswrapper[4901]: I0202 10:41:40.929127 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 02 10:41:40 crc kubenswrapper[4901]: I0202 10:41:40.929113 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 02 10:41:40 crc kubenswrapper[4901]: I0202 10:41:40.929204 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 02 10:41:40 crc kubenswrapper[4901]: I0202 10:41:40.929374 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 02 10:41:40 crc kubenswrapper[4901]: I0202 10:41:40.931944 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 02 10:41:40 crc kubenswrapper[4901]: I0202 10:41:40.940651 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 02 10:41:40 crc kubenswrapper[4901]: I0202 10:41:40.945610 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6f6df6bd96-cc728"] Feb 02 10:41:41 crc kubenswrapper[4901]: I0202 10:41:41.091907 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00a98dfe-865a-4464-b4ac-46c51bd3b550-proxy-ca-bundles\") pod \"controller-manager-6f6df6bd96-cc728\" (UID: \"00a98dfe-865a-4464-b4ac-46c51bd3b550\") " pod="openshift-controller-manager/controller-manager-6f6df6bd96-cc728" Feb 02 10:41:41 crc kubenswrapper[4901]: I0202 10:41:41.091956 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00a98dfe-865a-4464-b4ac-46c51bd3b550-serving-cert\") pod \"controller-manager-6f6df6bd96-cc728\" (UID: \"00a98dfe-865a-4464-b4ac-46c51bd3b550\") " pod="openshift-controller-manager/controller-manager-6f6df6bd96-cc728" Feb 02 10:41:41 crc kubenswrapper[4901]: I0202 10:41:41.091989 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00a98dfe-865a-4464-b4ac-46c51bd3b550-client-ca\") pod \"controller-manager-6f6df6bd96-cc728\" (UID: \"00a98dfe-865a-4464-b4ac-46c51bd3b550\") " pod="openshift-controller-manager/controller-manager-6f6df6bd96-cc728" Feb 02 10:41:41 crc kubenswrapper[4901]: I0202 10:41:41.092052 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00a98dfe-865a-4464-b4ac-46c51bd3b550-config\") pod \"controller-manager-6f6df6bd96-cc728\" 
(UID: \"00a98dfe-865a-4464-b4ac-46c51bd3b550\") " pod="openshift-controller-manager/controller-manager-6f6df6bd96-cc728" Feb 02 10:41:41 crc kubenswrapper[4901]: I0202 10:41:41.092078 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr294\" (UniqueName: \"kubernetes.io/projected/00a98dfe-865a-4464-b4ac-46c51bd3b550-kube-api-access-gr294\") pod \"controller-manager-6f6df6bd96-cc728\" (UID: \"00a98dfe-865a-4464-b4ac-46c51bd3b550\") " pod="openshift-controller-manager/controller-manager-6f6df6bd96-cc728" Feb 02 10:41:41 crc kubenswrapper[4901]: I0202 10:41:41.193143 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00a98dfe-865a-4464-b4ac-46c51bd3b550-serving-cert\") pod \"controller-manager-6f6df6bd96-cc728\" (UID: \"00a98dfe-865a-4464-b4ac-46c51bd3b550\") " pod="openshift-controller-manager/controller-manager-6f6df6bd96-cc728" Feb 02 10:41:41 crc kubenswrapper[4901]: I0202 10:41:41.193208 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00a98dfe-865a-4464-b4ac-46c51bd3b550-client-ca\") pod \"controller-manager-6f6df6bd96-cc728\" (UID: \"00a98dfe-865a-4464-b4ac-46c51bd3b550\") " pod="openshift-controller-manager/controller-manager-6f6df6bd96-cc728" Feb 02 10:41:41 crc kubenswrapper[4901]: I0202 10:41:41.193255 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00a98dfe-865a-4464-b4ac-46c51bd3b550-config\") pod \"controller-manager-6f6df6bd96-cc728\" (UID: \"00a98dfe-865a-4464-b4ac-46c51bd3b550\") " pod="openshift-controller-manager/controller-manager-6f6df6bd96-cc728" Feb 02 10:41:41 crc kubenswrapper[4901]: I0202 10:41:41.193282 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr294\" (UniqueName: \"kubernetes.io/projected/00a98dfe-865a-4464-b4ac-46c51bd3b550-kube-api-access-gr294\") pod \"controller-manager-6f6df6bd96-cc728\" (UID: \"00a98dfe-865a-4464-b4ac-46c51bd3b550\") " pod="openshift-controller-manager/controller-manager-6f6df6bd96-cc728" Feb 02 10:41:41 crc kubenswrapper[4901]: I0202 10:41:41.193321 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00a98dfe-865a-4464-b4ac-46c51bd3b550-proxy-ca-bundles\") pod \"controller-manager-6f6df6bd96-cc728\" (UID: \"00a98dfe-865a-4464-b4ac-46c51bd3b550\") " pod="openshift-controller-manager/controller-manager-6f6df6bd96-cc728" Feb 02 10:41:41 crc kubenswrapper[4901]: I0202 10:41:41.194714 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00a98dfe-865a-4464-b4ac-46c51bd3b550-client-ca\") pod \"controller-manager-6f6df6bd96-cc728\" (UID: \"00a98dfe-865a-4464-b4ac-46c51bd3b550\") " pod="openshift-controller-manager/controller-manager-6f6df6bd96-cc728" Feb 02 10:41:41 crc kubenswrapper[4901]: I0202 10:41:41.195059 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00a98dfe-865a-4464-b4ac-46c51bd3b550-config\") pod \"controller-manager-6f6df6bd96-cc728\" (UID: \"00a98dfe-865a-4464-b4ac-46c51bd3b550\") " pod="openshift-controller-manager/controller-manager-6f6df6bd96-cc728" Feb 02 10:41:41 crc kubenswrapper[4901]: I0202 10:41:41.195605 4901 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00a98dfe-865a-4464-b4ac-46c51bd3b550-proxy-ca-bundles\") pod \"controller-manager-6f6df6bd96-cc728\" (UID: \"00a98dfe-865a-4464-b4ac-46c51bd3b550\") " pod="openshift-controller-manager/controller-manager-6f6df6bd96-cc728" Feb 02 10:41:41 crc kubenswrapper[4901]: I0202 10:41:41.198242 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00a98dfe-865a-4464-b4ac-46c51bd3b550-serving-cert\") pod \"controller-manager-6f6df6bd96-cc728\" (UID: \"00a98dfe-865a-4464-b4ac-46c51bd3b550\") " pod="openshift-controller-manager/controller-manager-6f6df6bd96-cc728" Feb 02 10:41:41 crc kubenswrapper[4901]: I0202 10:41:41.209261 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr294\" (UniqueName: \"kubernetes.io/projected/00a98dfe-865a-4464-b4ac-46c51bd3b550-kube-api-access-gr294\") pod \"controller-manager-6f6df6bd96-cc728\" (UID: \"00a98dfe-865a-4464-b4ac-46c51bd3b550\") " pod="openshift-controller-manager/controller-manager-6f6df6bd96-cc728" Feb 02 10:41:41 crc kubenswrapper[4901]: I0202 10:41:41.245887 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f6df6bd96-cc728" Feb 02 10:41:41 crc kubenswrapper[4901]: I0202 10:41:41.438301 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6f6df6bd96-cc728"] Feb 02 10:41:41 crc kubenswrapper[4901]: I0202 10:41:41.524893 4901 generic.go:334] "Generic (PLEG): container finished" podID="84ab97df-dc16-4a11-abab-c79a2a91f2c0" containerID="3b559ce8598fa2ea04fbbe1fc3edf91b997e1ec028238a65304625746cc1e048" exitCode=0 Feb 02 10:41:41 crc kubenswrapper[4901]: I0202 10:41:41.525040 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c58478884-cs749" event={"ID":"84ab97df-dc16-4a11-abab-c79a2a91f2c0","Type":"ContainerDied","Data":"3b559ce8598fa2ea04fbbe1fc3edf91b997e1ec028238a65304625746cc1e048"} Feb 02 10:41:41 crc kubenswrapper[4901]: I0202 10:41:41.527217 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f6df6bd96-cc728" event={"ID":"00a98dfe-865a-4464-b4ac-46c51bd3b550","Type":"ContainerStarted","Data":"ebe27540d04a7b83a06712145d138eff611bab12b750f365a4f46603ec99035f"} Feb 02 10:41:41 crc kubenswrapper[4901]: I0202 10:41:41.583867 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c58478884-cs749" Feb 02 10:41:41 crc kubenswrapper[4901]: I0202 10:41:41.685763 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7eda50c-0532-40e5-bcb7-717f52f6d735" path="/var/lib/kubelet/pods/a7eda50c-0532-40e5-bcb7-717f52f6d735/volumes" Feb 02 10:41:41 crc kubenswrapper[4901]: I0202 10:41:41.700803 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84ab97df-dc16-4a11-abab-c79a2a91f2c0-config\") pod \"84ab97df-dc16-4a11-abab-c79a2a91f2c0\" (UID: \"84ab97df-dc16-4a11-abab-c79a2a91f2c0\") " Feb 02 10:41:41 crc kubenswrapper[4901]: I0202 10:41:41.700867 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5f4j\" (UniqueName: \"kubernetes.io/projected/84ab97df-dc16-4a11-abab-c79a2a91f2c0-kube-api-access-w5f4j\") pod \"84ab97df-dc16-4a11-abab-c79a2a91f2c0\" (UID: \"84ab97df-dc16-4a11-abab-c79a2a91f2c0\") " Feb 02 10:41:41 crc kubenswrapper[4901]: I0202 10:41:41.700970 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84ab97df-dc16-4a11-abab-c79a2a91f2c0-serving-cert\") pod \"84ab97df-dc16-4a11-abab-c79a2a91f2c0\" (UID: \"84ab97df-dc16-4a11-abab-c79a2a91f2c0\") " Feb 02 10:41:41 crc kubenswrapper[4901]: I0202 10:41:41.701065 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/84ab97df-dc16-4a11-abab-c79a2a91f2c0-client-ca\") pod \"84ab97df-dc16-4a11-abab-c79a2a91f2c0\" (UID: \"84ab97df-dc16-4a11-abab-c79a2a91f2c0\") " Feb 02 10:41:41 crc kubenswrapper[4901]: I0202 10:41:41.701885 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84ab97df-dc16-4a11-abab-c79a2a91f2c0-client-ca" (OuterVolumeSpecName: "client-ca") pod "84ab97df-dc16-4a11-abab-c79a2a91f2c0" (UID: "84ab97df-dc16-4a11-abab-c79a2a91f2c0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:41:41 crc kubenswrapper[4901]: I0202 10:41:41.701913 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84ab97df-dc16-4a11-abab-c79a2a91f2c0-config" (OuterVolumeSpecName: "config") pod "84ab97df-dc16-4a11-abab-c79a2a91f2c0" (UID: "84ab97df-dc16-4a11-abab-c79a2a91f2c0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:41:41 crc kubenswrapper[4901]: I0202 10:41:41.706011 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84ab97df-dc16-4a11-abab-c79a2a91f2c0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "84ab97df-dc16-4a11-abab-c79a2a91f2c0" (UID: "84ab97df-dc16-4a11-abab-c79a2a91f2c0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:41:41 crc kubenswrapper[4901]: I0202 10:41:41.706170 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84ab97df-dc16-4a11-abab-c79a2a91f2c0-kube-api-access-w5f4j" (OuterVolumeSpecName: "kube-api-access-w5f4j") pod "84ab97df-dc16-4a11-abab-c79a2a91f2c0" (UID: "84ab97df-dc16-4a11-abab-c79a2a91f2c0"). InnerVolumeSpecName "kube-api-access-w5f4j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:41:41 crc kubenswrapper[4901]: I0202 10:41:41.802404 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84ab97df-dc16-4a11-abab-c79a2a91f2c0-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:41 crc kubenswrapper[4901]: I0202 10:41:41.802484 4901 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/84ab97df-dc16-4a11-abab-c79a2a91f2c0-client-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:41 crc kubenswrapper[4901]: I0202 10:41:41.802535 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84ab97df-dc16-4a11-abab-c79a2a91f2c0-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:41 crc kubenswrapper[4901]: I0202 10:41:41.802551 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5f4j\" (UniqueName: \"kubernetes.io/projected/84ab97df-dc16-4a11-abab-c79a2a91f2c0-kube-api-access-w5f4j\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:42 crc kubenswrapper[4901]: I0202 10:41:42.036344 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:41:42 crc kubenswrapper[4901]: I0202 10:41:42.536278 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c58478884-cs749" event={"ID":"84ab97df-dc16-4a11-abab-c79a2a91f2c0","Type":"ContainerDied","Data":"b55e0e330d798bb620708a5654427a5953cad454fe328920824f5873782c7f39"} Feb 02 10:41:42 crc kubenswrapper[4901]: I0202 10:41:42.536322 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c58478884-cs749" Feb 02 10:41:42 crc kubenswrapper[4901]: I0202 10:41:42.536384 4901 scope.go:117] "RemoveContainer" containerID="3b559ce8598fa2ea04fbbe1fc3edf91b997e1ec028238a65304625746cc1e048" Feb 02 10:41:42 crc kubenswrapper[4901]: I0202 10:41:42.537722 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f6df6bd96-cc728" event={"ID":"00a98dfe-865a-4464-b4ac-46c51bd3b550","Type":"ContainerStarted","Data":"de54881b76a9854f65a814e469b469fb2f886b62f4e88a619bb6995a0f69ab47"} Feb 02 10:41:42 crc kubenswrapper[4901]: I0202 10:41:42.538001 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6f6df6bd96-cc728" Feb 02 10:41:42 crc kubenswrapper[4901]: I0202 10:41:42.546685 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6f6df6bd96-cc728" Feb 02 10:41:42 crc kubenswrapper[4901]: I0202 10:41:42.568030 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6f6df6bd96-cc728" podStartSLOduration=3.5679994 podStartE2EDuration="3.5679994s" podCreationTimestamp="2026-02-02 10:41:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:41:42.565350287 +0000 UTC m=+189.583690383" watchObservedRunningTime="2026-02-02 10:41:42.5679994 +0000 UTC m=+189.586339496" Feb 02 10:41:42 crc kubenswrapper[4901]: I0202 10:41:42.603630 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c58478884-cs749"] Feb 02 10:41:42 crc kubenswrapper[4901]: I0202 10:41:42.616149 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c58478884-cs749"] Feb 02 10:41:42 crc kubenswrapper[4901]: I0202 10:41:42.928636 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85887d448c-6d64f"] Feb 02 10:41:42 crc kubenswrapper[4901]: E0202 10:41:42.928937 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84ab97df-dc16-4a11-abab-c79a2a91f2c0" containerName="route-controller-manager" Feb 02 10:41:42 crc kubenswrapper[4901]: I0202 10:41:42.928954 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="84ab97df-dc16-4a11-abab-c79a2a91f2c0" containerName="route-controller-manager" Feb 02 10:41:42 crc kubenswrapper[4901]: I0202 10:41:42.929073 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="84ab97df-dc16-4a11-abab-c79a2a91f2c0" containerName="route-controller-manager" Feb 02 10:41:42 crc kubenswrapper[4901]: I0202 10:41:42.929590 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-85887d448c-6d64f" Feb 02 10:41:42 crc kubenswrapper[4901]: I0202 10:41:42.931946 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 02 10:41:42 crc kubenswrapper[4901]: I0202 10:41:42.932182 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 02 10:41:42 crc kubenswrapper[4901]: I0202 10:41:42.932406 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 02 10:41:42 crc kubenswrapper[4901]: I0202 10:41:42.932496 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 02 10:41:42 crc kubenswrapper[4901]: I0202 10:41:42.932605 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 02 10:41:42 crc kubenswrapper[4901]: I0202 10:41:42.932792 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 02 10:41:42 crc kubenswrapper[4901]: I0202 10:41:42.948445 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85887d448c-6d64f"] Feb 02 10:41:43 crc kubenswrapper[4901]: I0202 10:41:43.019004 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b08d9fac-2cf7-42ca-b88f-59ae1a35fc89-client-ca\") pod \"route-controller-manager-85887d448c-6d64f\" (UID: \"b08d9fac-2cf7-42ca-b88f-59ae1a35fc89\") " pod="openshift-route-controller-manager/route-controller-manager-85887d448c-6d64f" Feb 02 10:41:43 crc kubenswrapper[4901]: I0202 10:41:43.019037 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d27gq\" (UniqueName: \"kubernetes.io/projected/b08d9fac-2cf7-42ca-b88f-59ae1a35fc89-kube-api-access-d27gq\") pod \"route-controller-manager-85887d448c-6d64f\" (UID: \"b08d9fac-2cf7-42ca-b88f-59ae1a35fc89\") " pod="openshift-route-controller-manager/route-controller-manager-85887d448c-6d64f" Feb 02 10:41:43 crc kubenswrapper[4901]: I0202 10:41:43.019081 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b08d9fac-2cf7-42ca-b88f-59ae1a35fc89-serving-cert\") pod \"route-controller-manager-85887d448c-6d64f\" (UID: \"b08d9fac-2cf7-42ca-b88f-59ae1a35fc89\") " pod="openshift-route-controller-manager/route-controller-manager-85887d448c-6d64f" Feb 02 10:41:43 crc kubenswrapper[4901]: I0202 10:41:43.019103 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b08d9fac-2cf7-42ca-b88f-59ae1a35fc89-config\") pod \"route-controller-manager-85887d448c-6d64f\" (UID: \"b08d9fac-2cf7-42ca-b88f-59ae1a35fc89\") " pod="openshift-route-controller-manager/route-controller-manager-85887d448c-6d64f" Feb 02 10:41:43 crc kubenswrapper[4901]: I0202 10:41:43.120577 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b08d9fac-2cf7-42ca-b88f-59ae1a35fc89-config\") pod 
\"route-controller-manager-85887d448c-6d64f\" (UID: \"b08d9fac-2cf7-42ca-b88f-59ae1a35fc89\") " pod="openshift-route-controller-manager/route-controller-manager-85887d448c-6d64f" Feb 02 10:41:43 crc kubenswrapper[4901]: I0202 10:41:43.121431 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b08d9fac-2cf7-42ca-b88f-59ae1a35fc89-client-ca\") pod \"route-controller-manager-85887d448c-6d64f\" (UID: \"b08d9fac-2cf7-42ca-b88f-59ae1a35fc89\") " pod="openshift-route-controller-manager/route-controller-manager-85887d448c-6d64f" Feb 02 10:41:43 crc kubenswrapper[4901]: I0202 10:41:43.121634 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d27gq\" (UniqueName: \"kubernetes.io/projected/b08d9fac-2cf7-42ca-b88f-59ae1a35fc89-kube-api-access-d27gq\") pod \"route-controller-manager-85887d448c-6d64f\" (UID: \"b08d9fac-2cf7-42ca-b88f-59ae1a35fc89\") " pod="openshift-route-controller-manager/route-controller-manager-85887d448c-6d64f" Feb 02 10:41:43 crc kubenswrapper[4901]: I0202 10:41:43.121724 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b08d9fac-2cf7-42ca-b88f-59ae1a35fc89-config\") pod \"route-controller-manager-85887d448c-6d64f\" (UID: \"b08d9fac-2cf7-42ca-b88f-59ae1a35fc89\") " pod="openshift-route-controller-manager/route-controller-manager-85887d448c-6d64f" Feb 02 10:41:43 crc kubenswrapper[4901]: I0202 10:41:43.121872 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b08d9fac-2cf7-42ca-b88f-59ae1a35fc89-serving-cert\") pod \"route-controller-manager-85887d448c-6d64f\" (UID: \"b08d9fac-2cf7-42ca-b88f-59ae1a35fc89\") " pod="openshift-route-controller-manager/route-controller-manager-85887d448c-6d64f" Feb 02 10:41:43 crc kubenswrapper[4901]: I0202 10:41:43.122076 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b08d9fac-2cf7-42ca-b88f-59ae1a35fc89-client-ca\") pod \"route-controller-manager-85887d448c-6d64f\" (UID: \"b08d9fac-2cf7-42ca-b88f-59ae1a35fc89\") " pod="openshift-route-controller-manager/route-controller-manager-85887d448c-6d64f" Feb 02 10:41:43 crc kubenswrapper[4901]: I0202 10:41:43.125174 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b08d9fac-2cf7-42ca-b88f-59ae1a35fc89-serving-cert\") pod \"route-controller-manager-85887d448c-6d64f\" (UID: \"b08d9fac-2cf7-42ca-b88f-59ae1a35fc89\") " pod="openshift-route-controller-manager/route-controller-manager-85887d448c-6d64f" Feb 02 10:41:43 crc kubenswrapper[4901]: I0202 10:41:43.142125 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d27gq\" (UniqueName: \"kubernetes.io/projected/b08d9fac-2cf7-42ca-b88f-59ae1a35fc89-kube-api-access-d27gq\") pod \"route-controller-manager-85887d448c-6d64f\" (UID: \"b08d9fac-2cf7-42ca-b88f-59ae1a35fc89\") " pod="openshift-route-controller-manager/route-controller-manager-85887d448c-6d64f" Feb 02 10:41:43 crc kubenswrapper[4901]: I0202 10:41:43.250122 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-85887d448c-6d64f" Feb 02 10:41:43 crc kubenswrapper[4901]: I0202 10:41:43.631387 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85887d448c-6d64f"] Feb 02 10:41:43 crc kubenswrapper[4901]: W0202 10:41:43.646075 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb08d9fac_2cf7_42ca_b88f_59ae1a35fc89.slice/crio-55273868d468a23c2f5477aa5ab74486e15153dd077dd0164dcb16c0cafc76da WatchSource:0}: Error finding container 55273868d468a23c2f5477aa5ab74486e15153dd077dd0164dcb16c0cafc76da: Status 404 returned error can't find the container with id 55273868d468a23c2f5477aa5ab74486e15153dd077dd0164dcb16c0cafc76da Feb 02 10:41:43 crc kubenswrapper[4901]: I0202 10:41:43.686652 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84ab97df-dc16-4a11-abab-c79a2a91f2c0" path="/var/lib/kubelet/pods/84ab97df-dc16-4a11-abab-c79a2a91f2c0/volumes" Feb 02 10:41:43 crc kubenswrapper[4901]: I0202 10:41:43.840896 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s2t7g" Feb 02 10:41:43 crc kubenswrapper[4901]: I0202 10:41:43.841281 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s2t7g" Feb 02 10:41:43 crc kubenswrapper[4901]: I0202 10:41:43.990294 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tsxn7"] Feb 02 10:41:44 crc kubenswrapper[4901]: I0202 10:41:44.489805 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s2t7g" Feb 02 10:41:44 crc kubenswrapper[4901]: I0202 10:41:44.551167 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-85887d448c-6d64f" event={"ID":"b08d9fac-2cf7-42ca-b88f-59ae1a35fc89","Type":"ContainerStarted","Data":"385b32205db22cff5be991dba7639b3d3cd1f39c142d641a00aac072d7874db7"} Feb 02 10:41:44 crc kubenswrapper[4901]: I0202 10:41:44.551222 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-85887d448c-6d64f" event={"ID":"b08d9fac-2cf7-42ca-b88f-59ae1a35fc89","Type":"ContainerStarted","Data":"55273868d468a23c2f5477aa5ab74486e15153dd077dd0164dcb16c0cafc76da"} Feb 02 10:41:44 crc kubenswrapper[4901]: I0202 10:41:44.605625 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s2t7g" Feb 02 10:41:44 crc kubenswrapper[4901]: I0202 10:41:44.718652 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s2t7g"] Feb 02 10:41:45 crc kubenswrapper[4901]: I0202 10:41:45.025041 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 02 10:41:45 crc kubenswrapper[4901]: I0202 10:41:45.025954 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 02 10:41:45 crc kubenswrapper[4901]: I0202 10:41:45.028278 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 02 10:41:45 crc kubenswrapper[4901]: I0202 10:41:45.028711 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 02 10:41:45 crc kubenswrapper[4901]: I0202 10:41:45.034021 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 02 10:41:45 crc kubenswrapper[4901]: I0202 10:41:45.052289 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e457c573-1893-4350-b856-f43e844902dd-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e457c573-1893-4350-b856-f43e844902dd\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 02 10:41:45 crc kubenswrapper[4901]: I0202 10:41:45.052346 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e457c573-1893-4350-b856-f43e844902dd-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e457c573-1893-4350-b856-f43e844902dd\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 02 10:41:45 crc kubenswrapper[4901]: I0202 10:41:45.153494 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e457c573-1893-4350-b856-f43e844902dd-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e457c573-1893-4350-b856-f43e844902dd\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 02 10:41:45 crc kubenswrapper[4901]: I0202 10:41:45.153555 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e457c573-1893-4350-b856-f43e844902dd-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e457c573-1893-4350-b856-f43e844902dd\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 02 10:41:45 crc kubenswrapper[4901]: I0202 10:41:45.153985 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e457c573-1893-4350-b856-f43e844902dd-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e457c573-1893-4350-b856-f43e844902dd\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 02 10:41:45 crc kubenswrapper[4901]: I0202 10:41:45.171692 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e457c573-1893-4350-b856-f43e844902dd-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e457c573-1893-4350-b856-f43e844902dd\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 02 10:41:45 crc kubenswrapper[4901]: I0202 10:41:45.341237 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 02 10:41:45 crc kubenswrapper[4901]: I0202 10:41:45.559697 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-85887d448c-6d64f" Feb 02 10:41:45 crc kubenswrapper[4901]: I0202 10:41:45.570394 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-85887d448c-6d64f" Feb 02 10:41:45 crc kubenswrapper[4901]: I0202 10:41:45.576916 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-85887d448c-6d64f" podStartSLOduration=6.576900215 podStartE2EDuration="6.576900215s" podCreationTimestamp="2026-02-02 10:41:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:41:45.574068368 +0000 UTC m=+192.592408484" watchObservedRunningTime="2026-02-02 10:41:45.576900215 +0000 UTC m=+192.595240311" Feb 02 10:41:45 crc kubenswrapper[4901]: I0202 10:41:45.685065 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mvpxf" Feb 02 10:41:45 crc kubenswrapper[4901]: I0202 10:41:45.685103 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mvpxf" Feb 02 10:41:45 crc kubenswrapper[4901]: I0202 10:41:45.730521 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mvpxf" Feb 02 10:41:45 crc kubenswrapper[4901]: I0202 10:41:45.782617 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 02 10:41:46 crc kubenswrapper[4901]: I0202 10:41:46.564877 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e457c573-1893-4350-b856-f43e844902dd","Type":"ContainerStarted","Data":"b94088e2865deb5e3947e196f3e68009676dc159c7e4fde47c71d4757d690812"} Feb 02 10:41:46 crc kubenswrapper[4901]: I0202 10:41:46.565178 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e457c573-1893-4350-b856-f43e844902dd","Type":"ContainerStarted","Data":"081fc8735f59f3dc59af73930f69090d4e874769a0bdc6340d39026e6eb4dd38"} Feb 02 10:41:46 crc kubenswrapper[4901]: I0202 10:41:46.565940 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-s2t7g" podUID="81767c78-c6a0-4a68-ab07-98eeaf3e9483" containerName="registry-server" containerID="cri-o://93e1e1bb3049665a9d7e6d3d7a7eec4764a95abcd512bc8162fc1d653f32fc55" gracePeriod=2 Feb 02 10:41:46 crc kubenswrapper[4901]: I0202 10:41:46.579012 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=1.578964498 podStartE2EDuration="1.578964498s" podCreationTimestamp="2026-02-02 10:41:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:41:46.578124259 +0000 UTC m=+193.596464355" watchObservedRunningTime="2026-02-02 10:41:46.578964498 +0000 UTC m=+193.597304594" Feb 02 10:41:46 crc kubenswrapper[4901]: I0202 10:41:46.625861 4901 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mvpxf" Feb 02 10:41:47 crc kubenswrapper[4901]: I0202 10:41:47.158206 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9xg9m" Feb 02 10:41:47 crc kubenswrapper[4901]: I0202 10:41:47.158512 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9xg9m" Feb 02 10:41:47 crc kubenswrapper[4901]: I0202 10:41:47.203322 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9xg9m" Feb 02 10:41:47 crc kubenswrapper[4901]: I0202 10:41:47.235740 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s2t7g" Feb 02 10:41:47 crc kubenswrapper[4901]: I0202 10:41:47.292328 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdxrf\" (UniqueName: \"kubernetes.io/projected/81767c78-c6a0-4a68-ab07-98eeaf3e9483-kube-api-access-hdxrf\") pod \"81767c78-c6a0-4a68-ab07-98eeaf3e9483\" (UID: \"81767c78-c6a0-4a68-ab07-98eeaf3e9483\") " Feb 02 10:41:47 crc kubenswrapper[4901]: I0202 10:41:47.292403 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81767c78-c6a0-4a68-ab07-98eeaf3e9483-utilities\") pod \"81767c78-c6a0-4a68-ab07-98eeaf3e9483\" (UID: \"81767c78-c6a0-4a68-ab07-98eeaf3e9483\") " Feb 02 10:41:47 crc kubenswrapper[4901]: I0202 10:41:47.292504 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81767c78-c6a0-4a68-ab07-98eeaf3e9483-catalog-content\") pod \"81767c78-c6a0-4a68-ab07-98eeaf3e9483\" (UID: \"81767c78-c6a0-4a68-ab07-98eeaf3e9483\") " Feb 02 10:41:47 crc kubenswrapper[4901]: I0202 10:41:47.294668 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81767c78-c6a0-4a68-ab07-98eeaf3e9483-utilities" (OuterVolumeSpecName: "utilities") pod "81767c78-c6a0-4a68-ab07-98eeaf3e9483" (UID: "81767c78-c6a0-4a68-ab07-98eeaf3e9483"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:41:47 crc kubenswrapper[4901]: I0202 10:41:47.300432 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81767c78-c6a0-4a68-ab07-98eeaf3e9483-kube-api-access-hdxrf" (OuterVolumeSpecName: "kube-api-access-hdxrf") pod "81767c78-c6a0-4a68-ab07-98eeaf3e9483" (UID: "81767c78-c6a0-4a68-ab07-98eeaf3e9483"). InnerVolumeSpecName "kube-api-access-hdxrf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:41:47 crc kubenswrapper[4901]: I0202 10:41:47.346871 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81767c78-c6a0-4a68-ab07-98eeaf3e9483-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "81767c78-c6a0-4a68-ab07-98eeaf3e9483" (UID: "81767c78-c6a0-4a68-ab07-98eeaf3e9483"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:41:47 crc kubenswrapper[4901]: I0202 10:41:47.396339 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdxrf\" (UniqueName: \"kubernetes.io/projected/81767c78-c6a0-4a68-ab07-98eeaf3e9483-kube-api-access-hdxrf\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:47 crc kubenswrapper[4901]: I0202 10:41:47.396375 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81767c78-c6a0-4a68-ab07-98eeaf3e9483-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:47 crc kubenswrapper[4901]: I0202 10:41:47.396388 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81767c78-c6a0-4a68-ab07-98eeaf3e9483-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:47 crc kubenswrapper[4901]: I0202 10:41:47.521860 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mvpxf"] Feb 02 10:41:47 crc kubenswrapper[4901]: I0202 10:41:47.591248 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-79vj8" event={"ID":"ebacceb9-418b-4af4-9511-007595694dc2","Type":"ContainerStarted","Data":"78af4e1c30c97a229c5d281315a4e529f55e912cac2ddf97107a3fe1e91b3c0b"} Feb 02 10:41:47 crc kubenswrapper[4901]: I0202 10:41:47.600553 4901 generic.go:334] "Generic (PLEG): container finished" podID="e457c573-1893-4350-b856-f43e844902dd" containerID="b94088e2865deb5e3947e196f3e68009676dc159c7e4fde47c71d4757d690812" exitCode=0 Feb 02 10:41:47 crc kubenswrapper[4901]: I0202 10:41:47.600643 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e457c573-1893-4350-b856-f43e844902dd","Type":"ContainerDied","Data":"b94088e2865deb5e3947e196f3e68009676dc159c7e4fde47c71d4757d690812"} Feb 02 10:41:47 crc kubenswrapper[4901]: I0202 10:41:47.603048 4901 generic.go:334] "Generic (PLEG): container finished" podID="81767c78-c6a0-4a68-ab07-98eeaf3e9483" containerID="93e1e1bb3049665a9d7e6d3d7a7eec4764a95abcd512bc8162fc1d653f32fc55" exitCode=0 Feb 02 10:41:47 crc kubenswrapper[4901]: I0202 10:41:47.603739 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s2t7g" Feb 02 10:41:47 crc kubenswrapper[4901]: I0202 10:41:47.605261 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2t7g" event={"ID":"81767c78-c6a0-4a68-ab07-98eeaf3e9483","Type":"ContainerDied","Data":"93e1e1bb3049665a9d7e6d3d7a7eec4764a95abcd512bc8162fc1d653f32fc55"} Feb 02 10:41:47 crc kubenswrapper[4901]: I0202 10:41:47.605307 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2t7g" event={"ID":"81767c78-c6a0-4a68-ab07-98eeaf3e9483","Type":"ContainerDied","Data":"35582ec7704707298bfe5ff107b9deb2630b61845b89ca0f850439384b825fe8"} Feb 02 10:41:47 crc kubenswrapper[4901]: I0202 10:41:47.605345 4901 scope.go:117] "RemoveContainer" containerID="93e1e1bb3049665a9d7e6d3d7a7eec4764a95abcd512bc8162fc1d653f32fc55" Feb 02 10:41:47 crc kubenswrapper[4901]: I0202 10:41:47.627285 4901 scope.go:117] "RemoveContainer" containerID="e2b9b7f9aa2c7a04864678ecf7fc8acb97d34947628a672cd8bd1306162d694c" Feb 02 10:41:47 crc kubenswrapper[4901]: I0202 10:41:47.671341 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9xg9m" Feb 02 10:41:47 crc kubenswrapper[4901]: I0202 10:41:47.671409 4901 scope.go:117] "RemoveContainer" containerID="c9b34084ad7bfd075a26d0a051be928393e8eca35f98b0888ad4e9ca82441a2f" Feb 02 10:41:47 crc kubenswrapper[4901]: I0202 10:41:47.695399 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s2t7g"] Feb 02 10:41:47 crc kubenswrapper[4901]: I0202 10:41:47.695439 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-s2t7g"] Feb 02 10:41:47 crc kubenswrapper[4901]: I0202 10:41:47.728715 4901 scope.go:117] "RemoveContainer" containerID="93e1e1bb3049665a9d7e6d3d7a7eec4764a95abcd512bc8162fc1d653f32fc55" Feb 02 10:41:47 crc kubenswrapper[4901]: E0202 10:41:47.729079 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93e1e1bb3049665a9d7e6d3d7a7eec4764a95abcd512bc8162fc1d653f32fc55\": container with ID starting with 93e1e1bb3049665a9d7e6d3d7a7eec4764a95abcd512bc8162fc1d653f32fc55 not found: ID does not exist" containerID="93e1e1bb3049665a9d7e6d3d7a7eec4764a95abcd512bc8162fc1d653f32fc55" Feb 02 10:41:47 crc kubenswrapper[4901]: I0202 10:41:47.729110 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93e1e1bb3049665a9d7e6d3d7a7eec4764a95abcd512bc8162fc1d653f32fc55"} err="failed to get container status \"93e1e1bb3049665a9d7e6d3d7a7eec4764a95abcd512bc8162fc1d653f32fc55\": rpc error: code = NotFound desc = could not find container \"93e1e1bb3049665a9d7e6d3d7a7eec4764a95abcd512bc8162fc1d653f32fc55\": container with ID starting with 93e1e1bb3049665a9d7e6d3d7a7eec4764a95abcd512bc8162fc1d653f32fc55 not found: ID does not exist" Feb 02 10:41:47 crc kubenswrapper[4901]: I0202 10:41:47.729134 4901 scope.go:117] "RemoveContainer" containerID="e2b9b7f9aa2c7a04864678ecf7fc8acb97d34947628a672cd8bd1306162d694c" Feb 02 10:41:47 crc kubenswrapper[4901]: E0202 10:41:47.729316 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2b9b7f9aa2c7a04864678ecf7fc8acb97d34947628a672cd8bd1306162d694c\": container with ID starting with 
e2b9b7f9aa2c7a04864678ecf7fc8acb97d34947628a672cd8bd1306162d694c not found: ID does not exist" containerID="e2b9b7f9aa2c7a04864678ecf7fc8acb97d34947628a672cd8bd1306162d694c" Feb 02 10:41:47 crc kubenswrapper[4901]: I0202 10:41:47.729341 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2b9b7f9aa2c7a04864678ecf7fc8acb97d34947628a672cd8bd1306162d694c"} err="failed to get container status \"e2b9b7f9aa2c7a04864678ecf7fc8acb97d34947628a672cd8bd1306162d694c\": rpc error: code = NotFound desc = could not find container \"e2b9b7f9aa2c7a04864678ecf7fc8acb97d34947628a672cd8bd1306162d694c\": container with ID starting with e2b9b7f9aa2c7a04864678ecf7fc8acb97d34947628a672cd8bd1306162d694c not found: ID does not exist" Feb 02 10:41:47 crc kubenswrapper[4901]: I0202 10:41:47.729356 4901 scope.go:117] "RemoveContainer" containerID="c9b34084ad7bfd075a26d0a051be928393e8eca35f98b0888ad4e9ca82441a2f" Feb 02 10:41:47 crc kubenswrapper[4901]: E0202 10:41:47.729645 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9b34084ad7bfd075a26d0a051be928393e8eca35f98b0888ad4e9ca82441a2f\": container with ID starting with c9b34084ad7bfd075a26d0a051be928393e8eca35f98b0888ad4e9ca82441a2f not found: ID does not exist" containerID="c9b34084ad7bfd075a26d0a051be928393e8eca35f98b0888ad4e9ca82441a2f" Feb 02 10:41:47 crc kubenswrapper[4901]: I0202 10:41:47.729696 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9b34084ad7bfd075a26d0a051be928393e8eca35f98b0888ad4e9ca82441a2f"} err="failed to get container status \"c9b34084ad7bfd075a26d0a051be928393e8eca35f98b0888ad4e9ca82441a2f\": rpc error: code = NotFound desc = could not find container \"c9b34084ad7bfd075a26d0a051be928393e8eca35f98b0888ad4e9ca82441a2f\": container with ID starting with c9b34084ad7bfd075a26d0a051be928393e8eca35f98b0888ad4e9ca82441a2f not found: ID does not exist" Feb 02 10:41:48 crc kubenswrapper[4901]: I0202 10:41:48.612982 4901 generic.go:334] "Generic (PLEG): container finished" podID="ebacceb9-418b-4af4-9511-007595694dc2" containerID="78af4e1c30c97a229c5d281315a4e529f55e912cac2ddf97107a3fe1e91b3c0b" exitCode=0 Feb 02 10:41:48 crc kubenswrapper[4901]: I0202 10:41:48.613054 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-79vj8" event={"ID":"ebacceb9-418b-4af4-9511-007595694dc2","Type":"ContainerDied","Data":"78af4e1c30c97a229c5d281315a4e529f55e912cac2ddf97107a3fe1e91b3c0b"} Feb 02 10:41:48 crc kubenswrapper[4901]: I0202 10:41:48.616046 4901 generic.go:334] "Generic (PLEG): container finished" podID="f9e94b03-bd47-46b7-8ae7-addbb2b58bb7" containerID="b26f62ae71bf37a11edab5c4a0b381ba51e287ecb1b21c74e355920dab74a911" exitCode=0 Feb 02 10:41:48 crc kubenswrapper[4901]: I0202 10:41:48.616113 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t4pph" event={"ID":"f9e94b03-bd47-46b7-8ae7-addbb2b58bb7","Type":"ContainerDied","Data":"b26f62ae71bf37a11edab5c4a0b381ba51e287ecb1b21c74e355920dab74a911"} Feb 02 10:41:48 crc kubenswrapper[4901]: I0202 10:41:48.618759 4901 generic.go:334] "Generic (PLEG): container finished" podID="e26efe11-5a79-428e-9f34-ac7e0af2b5df" containerID="b5683d6c6bd8fd96dc449e255314a6b1232ab2fc99c0a70b13545e29dfc61677" exitCode=0 Feb 02 10:41:48 crc kubenswrapper[4901]: I0202 10:41:48.618886 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-f96jl" event={"ID":"e26efe11-5a79-428e-9f34-ac7e0af2b5df","Type":"ContainerDied","Data":"b5683d6c6bd8fd96dc449e255314a6b1232ab2fc99c0a70b13545e29dfc61677"} Feb 02 10:41:48 crc kubenswrapper[4901]: I0202 10:41:48.619327 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mvpxf" podUID="5c3abe02-a443-49e5-9c6c-b2f2655c91e7" containerName="registry-server" containerID="cri-o://4c894beb846deae3f5edddc5e464b8ed8c07a087ec7339dd648e4ede6e78ef93" gracePeriod=2 Feb 02 10:41:48 crc kubenswrapper[4901]: I0202 10:41:48.922110 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9xg9m"] Feb 02 10:41:49 crc kubenswrapper[4901]: I0202 10:41:49.005345 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 02 10:41:49 crc kubenswrapper[4901]: I0202 10:41:49.113663 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mvpxf" Feb 02 10:41:49 crc kubenswrapper[4901]: I0202 10:41:49.129651 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e457c573-1893-4350-b856-f43e844902dd-kube-api-access\") pod \"e457c573-1893-4350-b856-f43e844902dd\" (UID: \"e457c573-1893-4350-b856-f43e844902dd\") " Feb 02 10:41:49 crc kubenswrapper[4901]: I0202 10:41:49.129698 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e457c573-1893-4350-b856-f43e844902dd-kubelet-dir\") pod \"e457c573-1893-4350-b856-f43e844902dd\" (UID: \"e457c573-1893-4350-b856-f43e844902dd\") " Feb 02 10:41:49 crc kubenswrapper[4901]: I0202 10:41:49.130052 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e457c573-1893-4350-b856-f43e844902dd-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e457c573-1893-4350-b856-f43e844902dd" (UID: "e457c573-1893-4350-b856-f43e844902dd"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:41:49 crc kubenswrapper[4901]: I0202 10:41:49.143649 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e457c573-1893-4350-b856-f43e844902dd-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e457c573-1893-4350-b856-f43e844902dd" (UID: "e457c573-1893-4350-b856-f43e844902dd"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:41:49 crc kubenswrapper[4901]: I0202 10:41:49.230422 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zb5xq\" (UniqueName: \"kubernetes.io/projected/5c3abe02-a443-49e5-9c6c-b2f2655c91e7-kube-api-access-zb5xq\") pod \"5c3abe02-a443-49e5-9c6c-b2f2655c91e7\" (UID: \"5c3abe02-a443-49e5-9c6c-b2f2655c91e7\") " Feb 02 10:41:49 crc kubenswrapper[4901]: I0202 10:41:49.230743 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c3abe02-a443-49e5-9c6c-b2f2655c91e7-utilities\") pod \"5c3abe02-a443-49e5-9c6c-b2f2655c91e7\" (UID: \"5c3abe02-a443-49e5-9c6c-b2f2655c91e7\") " Feb 02 10:41:49 crc kubenswrapper[4901]: I0202 10:41:49.230771 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c3abe02-a443-49e5-9c6c-b2f2655c91e7-catalog-content\") pod \"5c3abe02-a443-49e5-9c6c-b2f2655c91e7\" (UID: \"5c3abe02-a443-49e5-9c6c-b2f2655c91e7\") " Feb 02 10:41:49 crc kubenswrapper[4901]: I0202 10:41:49.231000 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e457c573-1893-4350-b856-f43e844902dd-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:49 crc kubenswrapper[4901]: I0202 10:41:49.231016 4901 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e457c573-1893-4350-b856-f43e844902dd-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:49 crc kubenswrapper[4901]: I0202 10:41:49.233522 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c3abe02-a443-49e5-9c6c-b2f2655c91e7-utilities" (OuterVolumeSpecName: "utilities") pod "5c3abe02-a443-49e5-9c6c-b2f2655c91e7" (UID: "5c3abe02-a443-49e5-9c6c-b2f2655c91e7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:41:49 crc kubenswrapper[4901]: I0202 10:41:49.236406 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c3abe02-a443-49e5-9c6c-b2f2655c91e7-kube-api-access-zb5xq" (OuterVolumeSpecName: "kube-api-access-zb5xq") pod "5c3abe02-a443-49e5-9c6c-b2f2655c91e7" (UID: "5c3abe02-a443-49e5-9c6c-b2f2655c91e7"). InnerVolumeSpecName "kube-api-access-zb5xq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:41:49 crc kubenswrapper[4901]: I0202 10:41:49.254409 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c3abe02-a443-49e5-9c6c-b2f2655c91e7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5c3abe02-a443-49e5-9c6c-b2f2655c91e7" (UID: "5c3abe02-a443-49e5-9c6c-b2f2655c91e7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:41:49 crc kubenswrapper[4901]: I0202 10:41:49.331923 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zb5xq\" (UniqueName: \"kubernetes.io/projected/5c3abe02-a443-49e5-9c6c-b2f2655c91e7-kube-api-access-zb5xq\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:49 crc kubenswrapper[4901]: I0202 10:41:49.331954 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c3abe02-a443-49e5-9c6c-b2f2655c91e7-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:49 crc kubenswrapper[4901]: I0202 10:41:49.331964 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c3abe02-a443-49e5-9c6c-b2f2655c91e7-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:49 crc kubenswrapper[4901]: I0202 10:41:49.625217 4901 generic.go:334] "Generic (PLEG): container finished" podID="5c3abe02-a443-49e5-9c6c-b2f2655c91e7" containerID="4c894beb846deae3f5edddc5e464b8ed8c07a087ec7339dd648e4ede6e78ef93" exitCode=0 Feb 02 10:41:49 crc kubenswrapper[4901]: I0202 10:41:49.625282 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mvpxf" event={"ID":"5c3abe02-a443-49e5-9c6c-b2f2655c91e7","Type":"ContainerDied","Data":"4c894beb846deae3f5edddc5e464b8ed8c07a087ec7339dd648e4ede6e78ef93"} Feb 02 10:41:49 crc kubenswrapper[4901]: I0202 10:41:49.625309 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mvpxf" event={"ID":"5c3abe02-a443-49e5-9c6c-b2f2655c91e7","Type":"ContainerDied","Data":"abf2121825e7cc7cca6512732a558efcc1cf3c3bb8662f4103d5dd3cbfc68273"} Feb 02 10:41:49 crc kubenswrapper[4901]: I0202 10:41:49.625326 4901 scope.go:117] "RemoveContainer" containerID="4c894beb846deae3f5edddc5e464b8ed8c07a087ec7339dd648e4ede6e78ef93" Feb 02 10:41:49 crc kubenswrapper[4901]: I0202 10:41:49.625440 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mvpxf" Feb 02 10:41:49 crc kubenswrapper[4901]: I0202 10:41:49.627306 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e457c573-1893-4350-b856-f43e844902dd","Type":"ContainerDied","Data":"081fc8735f59f3dc59af73930f69090d4e874769a0bdc6340d39026e6eb4dd38"} Feb 02 10:41:49 crc kubenswrapper[4901]: I0202 10:41:49.627338 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="081fc8735f59f3dc59af73930f69090d4e874769a0bdc6340d39026e6eb4dd38" Feb 02 10:41:49 crc kubenswrapper[4901]: I0202 10:41:49.627366 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 02 10:41:49 crc kubenswrapper[4901]: I0202 10:41:49.631308 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f96jl" event={"ID":"e26efe11-5a79-428e-9f34-ac7e0af2b5df","Type":"ContainerStarted","Data":"2652b5a9f0106eb0bd65a5493b1d10dc016955dd44fbec418c946b233fd2c9e1"} Feb 02 10:41:49 crc kubenswrapper[4901]: I0202 10:41:49.634659 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-79vj8" event={"ID":"ebacceb9-418b-4af4-9511-007595694dc2","Type":"ContainerStarted","Data":"7a1ecb03aaab570752b91e14ea4c08e25d8e4073ede9d2f2fe123ecc2667c4a2"} Feb 02 10:41:49 crc kubenswrapper[4901]: I0202 10:41:49.636698 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t4pph" event={"ID":"f9e94b03-bd47-46b7-8ae7-addbb2b58bb7","Type":"ContainerStarted","Data":"3b2bf40a48963a3adc5bf873c4fda0486f3f68e62e3e6d5a7956aa5b4bd8d466"} Feb 02 10:41:49 crc kubenswrapper[4901]: I0202 10:41:49.636837 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9xg9m" podUID="67734bbd-4400-4478-b63f-5eff579a1f3d" containerName="registry-server" containerID="cri-o://e796c96a0affeccb4c26ba11228aa5fb1e606a7a2d82e07a69b83a6175c4bfeb" gracePeriod=2 Feb 02 10:41:49 crc kubenswrapper[4901]: I0202 10:41:49.637910 4901 scope.go:117] "RemoveContainer" containerID="ba00179a417926765ff8a0df962d718b65ed6fb3026bdb750fce966e4609e18d" Feb 02 10:41:49 crc kubenswrapper[4901]: I0202 10:41:49.664816 4901 scope.go:117] "RemoveContainer" containerID="c3cb4e5cd906ab802c5e942acea66417a8d10eacb215bf3c0d958821e8b74937" Feb 02 10:41:49 crc kubenswrapper[4901]: I0202 10:41:49.681857 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81767c78-c6a0-4a68-ab07-98eeaf3e9483" path="/var/lib/kubelet/pods/81767c78-c6a0-4a68-ab07-98eeaf3e9483/volumes" Feb 02 10:41:49 crc kubenswrapper[4901]: I0202 10:41:49.688280 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f96jl" podStartSLOduration=2.574261383 podStartE2EDuration="45.688259402s" podCreationTimestamp="2026-02-02 10:41:04 +0000 UTC" firstStartedPulling="2026-02-02 10:41:05.968944131 +0000 UTC m=+152.987284227" lastFinishedPulling="2026-02-02 10:41:49.08294215 +0000 UTC m=+196.101282246" observedRunningTime="2026-02-02 10:41:49.662187513 +0000 UTC m=+196.680527609" watchObservedRunningTime="2026-02-02 10:41:49.688259402 +0000 UTC m=+196.706599498" Feb 02 10:41:49 crc kubenswrapper[4901]: I0202 10:41:49.691613 4901 scope.go:117] "RemoveContainer" containerID="4c894beb846deae3f5edddc5e464b8ed8c07a087ec7339dd648e4ede6e78ef93" Feb 02 10:41:49 crc kubenswrapper[4901]: E0202 10:41:49.692097 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c894beb846deae3f5edddc5e464b8ed8c07a087ec7339dd648e4ede6e78ef93\": container with ID starting with 4c894beb846deae3f5edddc5e464b8ed8c07a087ec7339dd648e4ede6e78ef93 not found: ID does not exist" containerID="4c894beb846deae3f5edddc5e464b8ed8c07a087ec7339dd648e4ede6e78ef93" Feb 02 10:41:49 crc kubenswrapper[4901]: I0202 10:41:49.692133 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c894beb846deae3f5edddc5e464b8ed8c07a087ec7339dd648e4ede6e78ef93"} err="failed to get 
container status \"4c894beb846deae3f5edddc5e464b8ed8c07a087ec7339dd648e4ede6e78ef93\": rpc error: code = NotFound desc = could not find container \"4c894beb846deae3f5edddc5e464b8ed8c07a087ec7339dd648e4ede6e78ef93\": container with ID starting with 4c894beb846deae3f5edddc5e464b8ed8c07a087ec7339dd648e4ede6e78ef93 not found: ID does not exist" Feb 02 10:41:49 crc kubenswrapper[4901]: I0202 10:41:49.692157 4901 scope.go:117] "RemoveContainer" containerID="ba00179a417926765ff8a0df962d718b65ed6fb3026bdb750fce966e4609e18d" Feb 02 10:41:49 crc kubenswrapper[4901]: E0202 10:41:49.693846 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba00179a417926765ff8a0df962d718b65ed6fb3026bdb750fce966e4609e18d\": container with ID starting with ba00179a417926765ff8a0df962d718b65ed6fb3026bdb750fce966e4609e18d not found: ID does not exist" containerID="ba00179a417926765ff8a0df962d718b65ed6fb3026bdb750fce966e4609e18d" Feb 02 10:41:49 crc kubenswrapper[4901]: I0202 10:41:49.693905 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba00179a417926765ff8a0df962d718b65ed6fb3026bdb750fce966e4609e18d"} err="failed to get container status \"ba00179a417926765ff8a0df962d718b65ed6fb3026bdb750fce966e4609e18d\": rpc error: code = NotFound desc = could not find container \"ba00179a417926765ff8a0df962d718b65ed6fb3026bdb750fce966e4609e18d\": container with ID starting with ba00179a417926765ff8a0df962d718b65ed6fb3026bdb750fce966e4609e18d not found: ID does not exist" Feb 02 10:41:49 crc kubenswrapper[4901]: I0202 10:41:49.693936 4901 scope.go:117] "RemoveContainer" containerID="c3cb4e5cd906ab802c5e942acea66417a8d10eacb215bf3c0d958821e8b74937" Feb 02 10:41:49 crc kubenswrapper[4901]: E0202 10:41:49.694310 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3cb4e5cd906ab802c5e942acea66417a8d10eacb215bf3c0d958821e8b74937\": container with ID starting with c3cb4e5cd906ab802c5e942acea66417a8d10eacb215bf3c0d958821e8b74937 not found: ID does not exist" containerID="c3cb4e5cd906ab802c5e942acea66417a8d10eacb215bf3c0d958821e8b74937" Feb 02 10:41:49 crc kubenswrapper[4901]: I0202 10:41:49.694343 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3cb4e5cd906ab802c5e942acea66417a8d10eacb215bf3c0d958821e8b74937"} err="failed to get container status \"c3cb4e5cd906ab802c5e942acea66417a8d10eacb215bf3c0d958821e8b74937\": rpc error: code = NotFound desc = could not find container \"c3cb4e5cd906ab802c5e942acea66417a8d10eacb215bf3c0d958821e8b74937\": container with ID starting with c3cb4e5cd906ab802c5e942acea66417a8d10eacb215bf3c0d958821e8b74937 not found: ID does not exist" Feb 02 10:41:49 crc kubenswrapper[4901]: I0202 10:41:49.705307 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-t4pph" podStartSLOduration=2.631067658 podStartE2EDuration="43.705289367s" podCreationTimestamp="2026-02-02 10:41:06 +0000 UTC" firstStartedPulling="2026-02-02 10:41:08.043152054 +0000 UTC m=+155.061492140" lastFinishedPulling="2026-02-02 10:41:49.117373753 +0000 UTC m=+196.135713849" observedRunningTime="2026-02-02 10:41:49.68397697 +0000 UTC m=+196.702317086" watchObservedRunningTime="2026-02-02 10:41:49.705289367 +0000 UTC m=+196.723629463" Feb 02 10:41:49 crc kubenswrapper[4901]: I0202 10:41:49.705679 4901 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-marketplace/certified-operators-79vj8" podStartSLOduration=2.617486742 podStartE2EDuration="46.705673116s" podCreationTimestamp="2026-02-02 10:41:03 +0000 UTC" firstStartedPulling="2026-02-02 10:41:04.928290786 +0000 UTC m=+151.946630882" lastFinishedPulling="2026-02-02 10:41:49.01647716 +0000 UTC m=+196.034817256" observedRunningTime="2026-02-02 10:41:49.702371417 +0000 UTC m=+196.720711513" watchObservedRunningTime="2026-02-02 10:41:49.705673116 +0000 UTC m=+196.724013212" Feb 02 10:41:49 crc kubenswrapper[4901]: I0202 10:41:49.722215 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mvpxf"] Feb 02 10:41:49 crc kubenswrapper[4901]: I0202 10:41:49.725552 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mvpxf"] Feb 02 10:41:50 crc kubenswrapper[4901]: I0202 10:41:50.170193 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9xg9m" Feb 02 10:41:50 crc kubenswrapper[4901]: I0202 10:41:50.266117 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67734bbd-4400-4478-b63f-5eff579a1f3d-utilities\") pod \"67734bbd-4400-4478-b63f-5eff579a1f3d\" (UID: \"67734bbd-4400-4478-b63f-5eff579a1f3d\") " Feb 02 10:41:50 crc kubenswrapper[4901]: I0202 10:41:50.266188 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67734bbd-4400-4478-b63f-5eff579a1f3d-catalog-content\") pod \"67734bbd-4400-4478-b63f-5eff579a1f3d\" (UID: \"67734bbd-4400-4478-b63f-5eff579a1f3d\") " Feb 02 10:41:50 crc kubenswrapper[4901]: I0202 10:41:50.266249 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wx2gr\" (UniqueName: \"kubernetes.io/projected/67734bbd-4400-4478-b63f-5eff579a1f3d-kube-api-access-wx2gr\") pod \"67734bbd-4400-4478-b63f-5eff579a1f3d\" (UID: \"67734bbd-4400-4478-b63f-5eff579a1f3d\") " Feb 02 10:41:50 crc kubenswrapper[4901]: I0202 10:41:50.267531 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67734bbd-4400-4478-b63f-5eff579a1f3d-utilities" (OuterVolumeSpecName: "utilities") pod "67734bbd-4400-4478-b63f-5eff579a1f3d" (UID: "67734bbd-4400-4478-b63f-5eff579a1f3d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:41:50 crc kubenswrapper[4901]: I0202 10:41:50.273760 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67734bbd-4400-4478-b63f-5eff579a1f3d-kube-api-access-wx2gr" (OuterVolumeSpecName: "kube-api-access-wx2gr") pod "67734bbd-4400-4478-b63f-5eff579a1f3d" (UID: "67734bbd-4400-4478-b63f-5eff579a1f3d"). InnerVolumeSpecName "kube-api-access-wx2gr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:41:50 crc kubenswrapper[4901]: I0202 10:41:50.368784 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wx2gr\" (UniqueName: \"kubernetes.io/projected/67734bbd-4400-4478-b63f-5eff579a1f3d-kube-api-access-wx2gr\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:50 crc kubenswrapper[4901]: I0202 10:41:50.368823 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67734bbd-4400-4478-b63f-5eff579a1f3d-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:50 crc kubenswrapper[4901]: I0202 10:41:50.412759 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67734bbd-4400-4478-b63f-5eff579a1f3d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "67734bbd-4400-4478-b63f-5eff579a1f3d" (UID: "67734bbd-4400-4478-b63f-5eff579a1f3d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:41:50 crc kubenswrapper[4901]: I0202 10:41:50.470303 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67734bbd-4400-4478-b63f-5eff579a1f3d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:50 crc kubenswrapper[4901]: I0202 10:41:50.643831 4901 generic.go:334] "Generic (PLEG): container finished" podID="67734bbd-4400-4478-b63f-5eff579a1f3d" containerID="e796c96a0affeccb4c26ba11228aa5fb1e606a7a2d82e07a69b83a6175c4bfeb" exitCode=0 Feb 02 10:41:50 crc kubenswrapper[4901]: I0202 10:41:50.643897 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9xg9m" event={"ID":"67734bbd-4400-4478-b63f-5eff579a1f3d","Type":"ContainerDied","Data":"e796c96a0affeccb4c26ba11228aa5fb1e606a7a2d82e07a69b83a6175c4bfeb"} Feb 02 10:41:50 crc kubenswrapper[4901]: I0202 10:41:50.643928 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9xg9m" event={"ID":"67734bbd-4400-4478-b63f-5eff579a1f3d","Type":"ContainerDied","Data":"3f68598f91627a9c717067192b86b60006632452c911d295c819f0d90d6599d8"} Feb 02 10:41:50 crc kubenswrapper[4901]: I0202 10:41:50.643948 4901 scope.go:117] "RemoveContainer" containerID="e796c96a0affeccb4c26ba11228aa5fb1e606a7a2d82e07a69b83a6175c4bfeb" Feb 02 10:41:50 crc kubenswrapper[4901]: I0202 10:41:50.644107 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9xg9m" Feb 02 10:41:50 crc kubenswrapper[4901]: I0202 10:41:50.647609 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9p28" event={"ID":"da3848e5-a20f-4124-856b-d860bea45325","Type":"ContainerStarted","Data":"a73ec724f1d02e6444cd4ef65c870ceff9476bafb7c3a2361b74d51159a074d0"} Feb 02 10:41:50 crc kubenswrapper[4901]: I0202 10:41:50.715637 4901 scope.go:117] "RemoveContainer" containerID="320d9294b82c7bc3b999e91f341b6ca532165960a7e6b0e297b4b56f274ec11e" Feb 02 10:41:50 crc kubenswrapper[4901]: I0202 10:41:50.725341 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9xg9m"] Feb 02 10:41:50 crc kubenswrapper[4901]: I0202 10:41:50.729071 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9xg9m"] Feb 02 10:41:50 crc kubenswrapper[4901]: I0202 10:41:50.741543 4901 scope.go:117] "RemoveContainer" containerID="719039f9edd6835371d0e4ec689c6f74c2e817d29482587f1999eac44d8233da" Feb 02 10:41:50 crc kubenswrapper[4901]: I0202 10:41:50.756849 4901 scope.go:117] "RemoveContainer" containerID="e796c96a0affeccb4c26ba11228aa5fb1e606a7a2d82e07a69b83a6175c4bfeb" Feb 02 10:41:50 crc kubenswrapper[4901]: E0202 10:41:50.757340 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e796c96a0affeccb4c26ba11228aa5fb1e606a7a2d82e07a69b83a6175c4bfeb\": container with ID starting with e796c96a0affeccb4c26ba11228aa5fb1e606a7a2d82e07a69b83a6175c4bfeb not found: ID does not exist" containerID="e796c96a0affeccb4c26ba11228aa5fb1e606a7a2d82e07a69b83a6175c4bfeb" Feb 02 10:41:50 crc kubenswrapper[4901]: I0202 10:41:50.757374 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e796c96a0affeccb4c26ba11228aa5fb1e606a7a2d82e07a69b83a6175c4bfeb"} err="failed to get container status \"e796c96a0affeccb4c26ba11228aa5fb1e606a7a2d82e07a69b83a6175c4bfeb\": rpc error: code = NotFound desc = could not find container \"e796c96a0affeccb4c26ba11228aa5fb1e606a7a2d82e07a69b83a6175c4bfeb\": container with ID starting with e796c96a0affeccb4c26ba11228aa5fb1e606a7a2d82e07a69b83a6175c4bfeb not found: ID does not exist" Feb 02 10:41:50 crc kubenswrapper[4901]: I0202 10:41:50.757397 4901 scope.go:117] "RemoveContainer" containerID="320d9294b82c7bc3b999e91f341b6ca532165960a7e6b0e297b4b56f274ec11e" Feb 02 10:41:50 crc kubenswrapper[4901]: E0202 10:41:50.757697 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"320d9294b82c7bc3b999e91f341b6ca532165960a7e6b0e297b4b56f274ec11e\": container with ID starting with 320d9294b82c7bc3b999e91f341b6ca532165960a7e6b0e297b4b56f274ec11e not found: ID does not exist" containerID="320d9294b82c7bc3b999e91f341b6ca532165960a7e6b0e297b4b56f274ec11e" Feb 02 10:41:50 crc kubenswrapper[4901]: I0202 10:41:50.757727 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"320d9294b82c7bc3b999e91f341b6ca532165960a7e6b0e297b4b56f274ec11e"} err="failed to get container status \"320d9294b82c7bc3b999e91f341b6ca532165960a7e6b0e297b4b56f274ec11e\": rpc error: code = NotFound desc = could not find container \"320d9294b82c7bc3b999e91f341b6ca532165960a7e6b0e297b4b56f274ec11e\": container with ID starting with 320d9294b82c7bc3b999e91f341b6ca532165960a7e6b0e297b4b56f274ec11e not found: 
ID does not exist" Feb 02 10:41:50 crc kubenswrapper[4901]: I0202 10:41:50.757745 4901 scope.go:117] "RemoveContainer" containerID="719039f9edd6835371d0e4ec689c6f74c2e817d29482587f1999eac44d8233da" Feb 02 10:41:50 crc kubenswrapper[4901]: E0202 10:41:50.757955 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"719039f9edd6835371d0e4ec689c6f74c2e817d29482587f1999eac44d8233da\": container with ID starting with 719039f9edd6835371d0e4ec689c6f74c2e817d29482587f1999eac44d8233da not found: ID does not exist" containerID="719039f9edd6835371d0e4ec689c6f74c2e817d29482587f1999eac44d8233da" Feb 02 10:41:50 crc kubenswrapper[4901]: I0202 10:41:50.757979 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"719039f9edd6835371d0e4ec689c6f74c2e817d29482587f1999eac44d8233da"} err="failed to get container status \"719039f9edd6835371d0e4ec689c6f74c2e817d29482587f1999eac44d8233da\": rpc error: code = NotFound desc = could not find container \"719039f9edd6835371d0e4ec689c6f74c2e817d29482587f1999eac44d8233da\": container with ID starting with 719039f9edd6835371d0e4ec689c6f74c2e817d29482587f1999eac44d8233da not found: ID does not exist" Feb 02 10:41:51 crc kubenswrapper[4901]: I0202 10:41:51.660411 4901 generic.go:334] "Generic (PLEG): container finished" podID="da3848e5-a20f-4124-856b-d860bea45325" containerID="a73ec724f1d02e6444cd4ef65c870ceff9476bafb7c3a2361b74d51159a074d0" exitCode=0 Feb 02 10:41:51 crc kubenswrapper[4901]: I0202 10:41:51.660506 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9p28" event={"ID":"da3848e5-a20f-4124-856b-d860bea45325","Type":"ContainerDied","Data":"a73ec724f1d02e6444cd4ef65c870ceff9476bafb7c3a2361b74d51159a074d0"} Feb 02 10:41:51 crc kubenswrapper[4901]: I0202 10:41:51.664270 4901 generic.go:334] "Generic (PLEG): container finished" podID="2327a290-e69a-4a8b-b3af-2b1f02819202" containerID="063e4e065580b9300e4badee78f85d84ae855c5ee8538c639ac525454a2fe470" exitCode=0 Feb 02 10:41:51 crc kubenswrapper[4901]: I0202 10:41:51.664392 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-825gs" event={"ID":"2327a290-e69a-4a8b-b3af-2b1f02819202","Type":"ContainerDied","Data":"063e4e065580b9300e4badee78f85d84ae855c5ee8538c639ac525454a2fe470"} Feb 02 10:41:51 crc kubenswrapper[4901]: I0202 10:41:51.688349 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c3abe02-a443-49e5-9c6c-b2f2655c91e7" path="/var/lib/kubelet/pods/5c3abe02-a443-49e5-9c6c-b2f2655c91e7/volumes" Feb 02 10:41:51 crc kubenswrapper[4901]: I0202 10:41:51.689386 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67734bbd-4400-4478-b63f-5eff579a1f3d" path="/var/lib/kubelet/pods/67734bbd-4400-4478-b63f-5eff579a1f3d/volumes" Feb 02 10:41:52 crc kubenswrapper[4901]: I0202 10:41:52.619332 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 02 10:41:52 crc kubenswrapper[4901]: E0202 10:41:52.619830 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c3abe02-a443-49e5-9c6c-b2f2655c91e7" containerName="extract-utilities" Feb 02 10:41:52 crc kubenswrapper[4901]: I0202 10:41:52.619846 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c3abe02-a443-49e5-9c6c-b2f2655c91e7" containerName="extract-utilities" Feb 02 10:41:52 crc kubenswrapper[4901]: E0202 10:41:52.619858 4901 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c3abe02-a443-49e5-9c6c-b2f2655c91e7" containerName="extract-content" Feb 02 10:41:52 crc kubenswrapper[4901]: I0202 10:41:52.619866 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c3abe02-a443-49e5-9c6c-b2f2655c91e7" containerName="extract-content" Feb 02 10:41:52 crc kubenswrapper[4901]: E0202 10:41:52.619878 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81767c78-c6a0-4a68-ab07-98eeaf3e9483" containerName="registry-server" Feb 02 10:41:52 crc kubenswrapper[4901]: I0202 10:41:52.619888 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="81767c78-c6a0-4a68-ab07-98eeaf3e9483" containerName="registry-server" Feb 02 10:41:52 crc kubenswrapper[4901]: E0202 10:41:52.619904 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67734bbd-4400-4478-b63f-5eff579a1f3d" containerName="extract-content" Feb 02 10:41:52 crc kubenswrapper[4901]: I0202 10:41:52.619913 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="67734bbd-4400-4478-b63f-5eff579a1f3d" containerName="extract-content" Feb 02 10:41:52 crc kubenswrapper[4901]: E0202 10:41:52.619925 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81767c78-c6a0-4a68-ab07-98eeaf3e9483" containerName="extract-utilities" Feb 02 10:41:52 crc kubenswrapper[4901]: I0202 10:41:52.619934 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="81767c78-c6a0-4a68-ab07-98eeaf3e9483" containerName="extract-utilities" Feb 02 10:41:52 crc kubenswrapper[4901]: E0202 10:41:52.619945 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81767c78-c6a0-4a68-ab07-98eeaf3e9483" containerName="extract-content" Feb 02 10:41:52 crc kubenswrapper[4901]: I0202 10:41:52.619953 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="81767c78-c6a0-4a68-ab07-98eeaf3e9483" containerName="extract-content" Feb 02 10:41:52 crc kubenswrapper[4901]: E0202 10:41:52.619964 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c3abe02-a443-49e5-9c6c-b2f2655c91e7" containerName="registry-server" Feb 02 10:41:52 crc kubenswrapper[4901]: I0202 10:41:52.619972 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c3abe02-a443-49e5-9c6c-b2f2655c91e7" containerName="registry-server" Feb 02 10:41:52 crc kubenswrapper[4901]: E0202 10:41:52.619985 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67734bbd-4400-4478-b63f-5eff579a1f3d" containerName="extract-utilities" Feb 02 10:41:52 crc kubenswrapper[4901]: I0202 10:41:52.619994 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="67734bbd-4400-4478-b63f-5eff579a1f3d" containerName="extract-utilities" Feb 02 10:41:52 crc kubenswrapper[4901]: E0202 10:41:52.620005 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67734bbd-4400-4478-b63f-5eff579a1f3d" containerName="registry-server" Feb 02 10:41:52 crc kubenswrapper[4901]: I0202 10:41:52.620014 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="67734bbd-4400-4478-b63f-5eff579a1f3d" containerName="registry-server" Feb 02 10:41:52 crc kubenswrapper[4901]: E0202 10:41:52.620027 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e457c573-1893-4350-b856-f43e844902dd" containerName="pruner" Feb 02 10:41:52 crc kubenswrapper[4901]: I0202 10:41:52.620036 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="e457c573-1893-4350-b856-f43e844902dd" containerName="pruner" Feb 02 10:41:52 crc kubenswrapper[4901]: I0202 
10:41:52.620155 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c3abe02-a443-49e5-9c6c-b2f2655c91e7" containerName="registry-server" Feb 02 10:41:52 crc kubenswrapper[4901]: I0202 10:41:52.620178 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="67734bbd-4400-4478-b63f-5eff579a1f3d" containerName="registry-server" Feb 02 10:41:52 crc kubenswrapper[4901]: I0202 10:41:52.620194 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="81767c78-c6a0-4a68-ab07-98eeaf3e9483" containerName="registry-server" Feb 02 10:41:52 crc kubenswrapper[4901]: I0202 10:41:52.620206 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="e457c573-1893-4350-b856-f43e844902dd" containerName="pruner" Feb 02 10:41:52 crc kubenswrapper[4901]: I0202 10:41:52.620679 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 02 10:41:52 crc kubenswrapper[4901]: I0202 10:41:52.622578 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 02 10:41:52 crc kubenswrapper[4901]: I0202 10:41:52.624310 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 02 10:41:52 crc kubenswrapper[4901]: I0202 10:41:52.630843 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 02 10:41:52 crc kubenswrapper[4901]: I0202 10:41:52.673919 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9p28" event={"ID":"da3848e5-a20f-4124-856b-d860bea45325","Type":"ContainerStarted","Data":"796fc86dd36d60d1a20e51f112a5112017a1475f79832965ca8696200e306532"} Feb 02 10:41:52 crc kubenswrapper[4901]: I0202 10:41:52.676186 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-825gs" event={"ID":"2327a290-e69a-4a8b-b3af-2b1f02819202","Type":"ContainerStarted","Data":"ddec9d21fbfa3987b3f03febfa12ccbbbb84a70824618af81ef2240826439c9a"} Feb 02 10:41:52 crc kubenswrapper[4901]: I0202 10:41:52.698609 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/47f5fbec-a621-4a18-94c2-86e646bcc88a-var-lock\") pod \"installer-9-crc\" (UID: \"47f5fbec-a621-4a18-94c2-86e646bcc88a\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 02 10:41:52 crc kubenswrapper[4901]: I0202 10:41:52.698917 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/47f5fbec-a621-4a18-94c2-86e646bcc88a-kube-api-access\") pod \"installer-9-crc\" (UID: \"47f5fbec-a621-4a18-94c2-86e646bcc88a\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 02 10:41:52 crc kubenswrapper[4901]: I0202 10:41:52.699045 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/47f5fbec-a621-4a18-94c2-86e646bcc88a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"47f5fbec-a621-4a18-94c2-86e646bcc88a\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 02 10:41:52 crc kubenswrapper[4901]: I0202 10:41:52.710134 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b9p28" podStartSLOduration=2.483371795 
podStartE2EDuration="50.710113082s" podCreationTimestamp="2026-02-02 10:41:02 +0000 UTC" firstStartedPulling="2026-02-02 10:41:03.884773384 +0000 UTC m=+150.903113480" lastFinishedPulling="2026-02-02 10:41:52.111514671 +0000 UTC m=+199.129854767" observedRunningTime="2026-02-02 10:41:52.693996319 +0000 UTC m=+199.712336415" watchObservedRunningTime="2026-02-02 10:41:52.710113082 +0000 UTC m=+199.728453198" Feb 02 10:41:52 crc kubenswrapper[4901]: I0202 10:41:52.710912 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-825gs" podStartSLOduration=2.5701767159999998 podStartE2EDuration="49.710906101s" podCreationTimestamp="2026-02-02 10:41:03 +0000 UTC" firstStartedPulling="2026-02-02 10:41:04.933394987 +0000 UTC m=+151.951735083" lastFinishedPulling="2026-02-02 10:41:52.074124372 +0000 UTC m=+199.092464468" observedRunningTime="2026-02-02 10:41:52.709392355 +0000 UTC m=+199.727732461" watchObservedRunningTime="2026-02-02 10:41:52.710906101 +0000 UTC m=+199.729246197" Feb 02 10:41:52 crc kubenswrapper[4901]: I0202 10:41:52.800789 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/47f5fbec-a621-4a18-94c2-86e646bcc88a-var-lock\") pod \"installer-9-crc\" (UID: \"47f5fbec-a621-4a18-94c2-86e646bcc88a\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 02 10:41:52 crc kubenswrapper[4901]: I0202 10:41:52.801085 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/47f5fbec-a621-4a18-94c2-86e646bcc88a-kube-api-access\") pod \"installer-9-crc\" (UID: \"47f5fbec-a621-4a18-94c2-86e646bcc88a\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 02 10:41:52 crc kubenswrapper[4901]: I0202 10:41:52.801206 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/47f5fbec-a621-4a18-94c2-86e646bcc88a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"47f5fbec-a621-4a18-94c2-86e646bcc88a\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 02 10:41:52 crc kubenswrapper[4901]: I0202 10:41:52.801292 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/47f5fbec-a621-4a18-94c2-86e646bcc88a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"47f5fbec-a621-4a18-94c2-86e646bcc88a\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 02 10:41:52 crc kubenswrapper[4901]: I0202 10:41:52.800928 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/47f5fbec-a621-4a18-94c2-86e646bcc88a-var-lock\") pod \"installer-9-crc\" (UID: \"47f5fbec-a621-4a18-94c2-86e646bcc88a\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 02 10:41:52 crc kubenswrapper[4901]: I0202 10:41:52.818317 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/47f5fbec-a621-4a18-94c2-86e646bcc88a-kube-api-access\") pod \"installer-9-crc\" (UID: \"47f5fbec-a621-4a18-94c2-86e646bcc88a\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 02 10:41:52 crc kubenswrapper[4901]: I0202 10:41:52.935996 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 02 10:41:53 crc kubenswrapper[4901]: I0202 10:41:53.277102 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b9p28" Feb 02 10:41:53 crc kubenswrapper[4901]: I0202 10:41:53.278845 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b9p28" Feb 02 10:41:53 crc kubenswrapper[4901]: I0202 10:41:53.361895 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 02 10:41:53 crc kubenswrapper[4901]: W0202 10:41:53.369382 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod47f5fbec_a621_4a18_94c2_86e646bcc88a.slice/crio-95006e8bf3835bed01b5533e215f04200be74eac9b4cdec23711879a69c905ae WatchSource:0}: Error finding container 95006e8bf3835bed01b5533e215f04200be74eac9b4cdec23711879a69c905ae: Status 404 returned error can't find the container with id 95006e8bf3835bed01b5533e215f04200be74eac9b4cdec23711879a69c905ae Feb 02 10:41:53 crc kubenswrapper[4901]: I0202 10:41:53.452024 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-79vj8" Feb 02 10:41:53 crc kubenswrapper[4901]: I0202 10:41:53.452083 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-79vj8" Feb 02 10:41:53 crc kubenswrapper[4901]: I0202 10:41:53.494216 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-79vj8" Feb 02 10:41:53 crc kubenswrapper[4901]: I0202 10:41:53.644107 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-825gs" Feb 02 10:41:53 crc kubenswrapper[4901]: I0202 10:41:53.644354 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-825gs" Feb 02 10:41:53 crc kubenswrapper[4901]: I0202 10:41:53.695234 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"47f5fbec-a621-4a18-94c2-86e646bcc88a","Type":"ContainerStarted","Data":"b36a19c3a6e9dce3de0448dc9d2e3801f5b48183c6d5bd0e93f5ce9a0218660e"} Feb 02 10:41:53 crc kubenswrapper[4901]: I0202 10:41:53.695345 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"47f5fbec-a621-4a18-94c2-86e646bcc88a","Type":"ContainerStarted","Data":"95006e8bf3835bed01b5533e215f04200be74eac9b4cdec23711879a69c905ae"} Feb 02 10:41:53 crc kubenswrapper[4901]: I0202 10:41:53.722127 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.7221026099999999 podStartE2EDuration="1.72210261s" podCreationTimestamp="2026-02-02 10:41:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:41:53.718352972 +0000 UTC m=+200.736693068" watchObservedRunningTime="2026-02-02 10:41:53.72210261 +0000 UTC m=+200.740442726" Feb 02 10:41:54 crc kubenswrapper[4901]: I0202 10:41:54.321155 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-b9p28" podUID="da3848e5-a20f-4124-856b-d860bea45325" containerName="registry-server" probeResult="failure" 
output=< Feb 02 10:41:54 crc kubenswrapper[4901]: timeout: failed to connect service ":50051" within 1s Feb 02 10:41:54 crc kubenswrapper[4901]: > Feb 02 10:41:54 crc kubenswrapper[4901]: I0202 10:41:54.694896 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-825gs" podUID="2327a290-e69a-4a8b-b3af-2b1f02819202" containerName="registry-server" probeResult="failure" output=< Feb 02 10:41:54 crc kubenswrapper[4901]: timeout: failed to connect service ":50051" within 1s Feb 02 10:41:54 crc kubenswrapper[4901]: > Feb 02 10:41:55 crc kubenswrapper[4901]: I0202 10:41:55.248783 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-f96jl" Feb 02 10:41:55 crc kubenswrapper[4901]: I0202 10:41:55.249648 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f96jl" Feb 02 10:41:55 crc kubenswrapper[4901]: I0202 10:41:55.303220 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f96jl" Feb 02 10:41:55 crc kubenswrapper[4901]: I0202 10:41:55.740690 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f96jl" Feb 02 10:41:56 crc kubenswrapper[4901]: I0202 10:41:56.656743 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-t4pph" Feb 02 10:41:56 crc kubenswrapper[4901]: I0202 10:41:56.656812 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-t4pph" Feb 02 10:41:56 crc kubenswrapper[4901]: I0202 10:41:56.703282 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t4pph" Feb 02 10:41:56 crc kubenswrapper[4901]: I0202 10:41:56.755227 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-t4pph" Feb 02 10:41:59 crc kubenswrapper[4901]: I0202 10:41:59.742156 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6f6df6bd96-cc728"] Feb 02 10:41:59 crc kubenswrapper[4901]: I0202 10:41:59.742647 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6f6df6bd96-cc728" podUID="00a98dfe-865a-4464-b4ac-46c51bd3b550" containerName="controller-manager" containerID="cri-o://de54881b76a9854f65a814e469b469fb2f886b62f4e88a619bb6995a0f69ab47" gracePeriod=30 Feb 02 10:41:59 crc kubenswrapper[4901]: I0202 10:41:59.779884 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85887d448c-6d64f"] Feb 02 10:41:59 crc kubenswrapper[4901]: I0202 10:41:59.780156 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-85887d448c-6d64f" podUID="b08d9fac-2cf7-42ca-b88f-59ae1a35fc89" containerName="route-controller-manager" containerID="cri-o://385b32205db22cff5be991dba7639b3d3cd1f39c142d641a00aac072d7874db7" gracePeriod=30 Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.300149 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-85887d448c-6d64f" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.338368 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f6df6bd96-cc728" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.436644 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gr294\" (UniqueName: \"kubernetes.io/projected/00a98dfe-865a-4464-b4ac-46c51bd3b550-kube-api-access-gr294\") pod \"00a98dfe-865a-4464-b4ac-46c51bd3b550\" (UID: \"00a98dfe-865a-4464-b4ac-46c51bd3b550\") " Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.436725 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d27gq\" (UniqueName: \"kubernetes.io/projected/b08d9fac-2cf7-42ca-b88f-59ae1a35fc89-kube-api-access-d27gq\") pod \"b08d9fac-2cf7-42ca-b88f-59ae1a35fc89\" (UID: \"b08d9fac-2cf7-42ca-b88f-59ae1a35fc89\") " Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.436742 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b08d9fac-2cf7-42ca-b88f-59ae1a35fc89-config\") pod \"b08d9fac-2cf7-42ca-b88f-59ae1a35fc89\" (UID: \"b08d9fac-2cf7-42ca-b88f-59ae1a35fc89\") " Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.436811 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b08d9fac-2cf7-42ca-b88f-59ae1a35fc89-client-ca\") pod \"b08d9fac-2cf7-42ca-b88f-59ae1a35fc89\" (UID: \"b08d9fac-2cf7-42ca-b88f-59ae1a35fc89\") " Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.436868 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00a98dfe-865a-4464-b4ac-46c51bd3b550-proxy-ca-bundles\") pod \"00a98dfe-865a-4464-b4ac-46c51bd3b550\" (UID: \"00a98dfe-865a-4464-b4ac-46c51bd3b550\") " Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.436895 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00a98dfe-865a-4464-b4ac-46c51bd3b550-client-ca\") pod \"00a98dfe-865a-4464-b4ac-46c51bd3b550\" (UID: \"00a98dfe-865a-4464-b4ac-46c51bd3b550\") " Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.436921 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00a98dfe-865a-4464-b4ac-46c51bd3b550-serving-cert\") pod \"00a98dfe-865a-4464-b4ac-46c51bd3b550\" (UID: \"00a98dfe-865a-4464-b4ac-46c51bd3b550\") " Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.436937 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b08d9fac-2cf7-42ca-b88f-59ae1a35fc89-serving-cert\") pod \"b08d9fac-2cf7-42ca-b88f-59ae1a35fc89\" (UID: \"b08d9fac-2cf7-42ca-b88f-59ae1a35fc89\") " Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.436957 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00a98dfe-865a-4464-b4ac-46c51bd3b550-config\") pod \"00a98dfe-865a-4464-b4ac-46c51bd3b550\" (UID: \"00a98dfe-865a-4464-b4ac-46c51bd3b550\") " Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.437744 4901 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00a98dfe-865a-4464-b4ac-46c51bd3b550-client-ca" (OuterVolumeSpecName: "client-ca") pod "00a98dfe-865a-4464-b4ac-46c51bd3b550" (UID: "00a98dfe-865a-4464-b4ac-46c51bd3b550"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.437764 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00a98dfe-865a-4464-b4ac-46c51bd3b550-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "00a98dfe-865a-4464-b4ac-46c51bd3b550" (UID: "00a98dfe-865a-4464-b4ac-46c51bd3b550"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.437775 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b08d9fac-2cf7-42ca-b88f-59ae1a35fc89-client-ca" (OuterVolumeSpecName: "client-ca") pod "b08d9fac-2cf7-42ca-b88f-59ae1a35fc89" (UID: "b08d9fac-2cf7-42ca-b88f-59ae1a35fc89"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.437847 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00a98dfe-865a-4464-b4ac-46c51bd3b550-config" (OuterVolumeSpecName: "config") pod "00a98dfe-865a-4464-b4ac-46c51bd3b550" (UID: "00a98dfe-865a-4464-b4ac-46c51bd3b550"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.437852 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b08d9fac-2cf7-42ca-b88f-59ae1a35fc89-config" (OuterVolumeSpecName: "config") pod "b08d9fac-2cf7-42ca-b88f-59ae1a35fc89" (UID: "b08d9fac-2cf7-42ca-b88f-59ae1a35fc89"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.442972 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b08d9fac-2cf7-42ca-b88f-59ae1a35fc89-kube-api-access-d27gq" (OuterVolumeSpecName: "kube-api-access-d27gq") pod "b08d9fac-2cf7-42ca-b88f-59ae1a35fc89" (UID: "b08d9fac-2cf7-42ca-b88f-59ae1a35fc89"). InnerVolumeSpecName "kube-api-access-d27gq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.442978 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00a98dfe-865a-4464-b4ac-46c51bd3b550-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "00a98dfe-865a-4464-b4ac-46c51bd3b550" (UID: "00a98dfe-865a-4464-b4ac-46c51bd3b550"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.443022 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b08d9fac-2cf7-42ca-b88f-59ae1a35fc89-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b08d9fac-2cf7-42ca-b88f-59ae1a35fc89" (UID: "b08d9fac-2cf7-42ca-b88f-59ae1a35fc89"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.445348 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00a98dfe-865a-4464-b4ac-46c51bd3b550-kube-api-access-gr294" (OuterVolumeSpecName: "kube-api-access-gr294") pod "00a98dfe-865a-4464-b4ac-46c51bd3b550" (UID: "00a98dfe-865a-4464-b4ac-46c51bd3b550"). InnerVolumeSpecName "kube-api-access-gr294". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.538737 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d27gq\" (UniqueName: \"kubernetes.io/projected/b08d9fac-2cf7-42ca-b88f-59ae1a35fc89-kube-api-access-d27gq\") on node \"crc\" DevicePath \"\"" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.538781 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b08d9fac-2cf7-42ca-b88f-59ae1a35fc89-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.538803 4901 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b08d9fac-2cf7-42ca-b88f-59ae1a35fc89-client-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.538814 4901 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00a98dfe-865a-4464-b4ac-46c51bd3b550-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.538823 4901 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00a98dfe-865a-4464-b4ac-46c51bd3b550-client-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.538831 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00a98dfe-865a-4464-b4ac-46c51bd3b550-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.538839 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b08d9fac-2cf7-42ca-b88f-59ae1a35fc89-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.538847 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00a98dfe-865a-4464-b4ac-46c51bd3b550-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.538855 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gr294\" (UniqueName: \"kubernetes.io/projected/00a98dfe-865a-4464-b4ac-46c51bd3b550-kube-api-access-gr294\") on node \"crc\" DevicePath \"\"" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.737520 4901 generic.go:334] "Generic (PLEG): container finished" podID="00a98dfe-865a-4464-b4ac-46c51bd3b550" containerID="de54881b76a9854f65a814e469b469fb2f886b62f4e88a619bb6995a0f69ab47" exitCode=0 Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.737617 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f6df6bd96-cc728" event={"ID":"00a98dfe-865a-4464-b4ac-46c51bd3b550","Type":"ContainerDied","Data":"de54881b76a9854f65a814e469b469fb2f886b62f4e88a619bb6995a0f69ab47"} Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.737705 
4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f6df6bd96-cc728" event={"ID":"00a98dfe-865a-4464-b4ac-46c51bd3b550","Type":"ContainerDied","Data":"ebe27540d04a7b83a06712145d138eff611bab12b750f365a4f46603ec99035f"} Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.737731 4901 scope.go:117] "RemoveContainer" containerID="de54881b76a9854f65a814e469b469fb2f886b62f4e88a619bb6995a0f69ab47" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.737643 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f6df6bd96-cc728" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.740773 4901 generic.go:334] "Generic (PLEG): container finished" podID="b08d9fac-2cf7-42ca-b88f-59ae1a35fc89" containerID="385b32205db22cff5be991dba7639b3d3cd1f39c142d641a00aac072d7874db7" exitCode=0 Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.740842 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-85887d448c-6d64f" event={"ID":"b08d9fac-2cf7-42ca-b88f-59ae1a35fc89","Type":"ContainerDied","Data":"385b32205db22cff5be991dba7639b3d3cd1f39c142d641a00aac072d7874db7"} Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.740918 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-85887d448c-6d64f" event={"ID":"b08d9fac-2cf7-42ca-b88f-59ae1a35fc89","Type":"ContainerDied","Data":"55273868d468a23c2f5477aa5ab74486e15153dd077dd0164dcb16c0cafc76da"} Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.740921 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-85887d448c-6d64f" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.761832 4901 scope.go:117] "RemoveContainer" containerID="de54881b76a9854f65a814e469b469fb2f886b62f4e88a619bb6995a0f69ab47" Feb 02 10:42:00 crc kubenswrapper[4901]: E0202 10:42:00.762390 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de54881b76a9854f65a814e469b469fb2f886b62f4e88a619bb6995a0f69ab47\": container with ID starting with de54881b76a9854f65a814e469b469fb2f886b62f4e88a619bb6995a0f69ab47 not found: ID does not exist" containerID="de54881b76a9854f65a814e469b469fb2f886b62f4e88a619bb6995a0f69ab47" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.762433 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de54881b76a9854f65a814e469b469fb2f886b62f4e88a619bb6995a0f69ab47"} err="failed to get container status \"de54881b76a9854f65a814e469b469fb2f886b62f4e88a619bb6995a0f69ab47\": rpc error: code = NotFound desc = could not find container \"de54881b76a9854f65a814e469b469fb2f886b62f4e88a619bb6995a0f69ab47\": container with ID starting with de54881b76a9854f65a814e469b469fb2f886b62f4e88a619bb6995a0f69ab47 not found: ID does not exist" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.762462 4901 scope.go:117] "RemoveContainer" containerID="385b32205db22cff5be991dba7639b3d3cd1f39c142d641a00aac072d7874db7" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.771018 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6f6df6bd96-cc728"] Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.791042 4901 scope.go:117] 
"RemoveContainer" containerID="385b32205db22cff5be991dba7639b3d3cd1f39c142d641a00aac072d7874db7" Feb 02 10:42:00 crc kubenswrapper[4901]: E0202 10:42:00.791654 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"385b32205db22cff5be991dba7639b3d3cd1f39c142d641a00aac072d7874db7\": container with ID starting with 385b32205db22cff5be991dba7639b3d3cd1f39c142d641a00aac072d7874db7 not found: ID does not exist" containerID="385b32205db22cff5be991dba7639b3d3cd1f39c142d641a00aac072d7874db7" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.791713 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"385b32205db22cff5be991dba7639b3d3cd1f39c142d641a00aac072d7874db7"} err="failed to get container status \"385b32205db22cff5be991dba7639b3d3cd1f39c142d641a00aac072d7874db7\": rpc error: code = NotFound desc = could not find container \"385b32205db22cff5be991dba7639b3d3cd1f39c142d641a00aac072d7874db7\": container with ID starting with 385b32205db22cff5be991dba7639b3d3cd1f39c142d641a00aac072d7874db7 not found: ID does not exist" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.795929 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6f6df6bd96-cc728"] Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.807354 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85887d448c-6d64f"] Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.807421 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85887d448c-6d64f"] Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.942552 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-748b9db64b-cl92q"] Feb 02 10:42:00 crc kubenswrapper[4901]: E0202 10:42:00.942802 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b08d9fac-2cf7-42ca-b88f-59ae1a35fc89" containerName="route-controller-manager" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.942817 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="b08d9fac-2cf7-42ca-b88f-59ae1a35fc89" containerName="route-controller-manager" Feb 02 10:42:00 crc kubenswrapper[4901]: E0202 10:42:00.942830 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00a98dfe-865a-4464-b4ac-46c51bd3b550" containerName="controller-manager" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.942837 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="00a98dfe-865a-4464-b4ac-46c51bd3b550" containerName="controller-manager" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.942954 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="00a98dfe-865a-4464-b4ac-46c51bd3b550" containerName="controller-manager" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.942974 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="b08d9fac-2cf7-42ca-b88f-59ae1a35fc89" containerName="route-controller-manager" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.943407 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-748b9db64b-cl92q" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.945922 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.946108 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.946451 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.946846 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.947901 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.948355 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b8c66cf97-s9g7p"] Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.949059 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b8c66cf97-s9g7p" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.951228 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.951731 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.951951 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.952065 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.952124 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.952640 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.952758 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.959654 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b8c66cf97-s9g7p"] Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.961352 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 02 10:42:00 crc kubenswrapper[4901]: I0202 10:42:00.962660 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-748b9db64b-cl92q"] Feb 02 10:42:01 crc kubenswrapper[4901]: I0202 10:42:01.044070 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw9g9\" (UniqueName: 
\"kubernetes.io/projected/2e5bfe09-6c03-4b1b-af89-5928670990ef-kube-api-access-jw9g9\") pod \"controller-manager-748b9db64b-cl92q\" (UID: \"2e5bfe09-6c03-4b1b-af89-5928670990ef\") " pod="openshift-controller-manager/controller-manager-748b9db64b-cl92q" Feb 02 10:42:01 crc kubenswrapper[4901]: I0202 10:42:01.044132 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e5bfe09-6c03-4b1b-af89-5928670990ef-serving-cert\") pod \"controller-manager-748b9db64b-cl92q\" (UID: \"2e5bfe09-6c03-4b1b-af89-5928670990ef\") " pod="openshift-controller-manager/controller-manager-748b9db64b-cl92q" Feb 02 10:42:01 crc kubenswrapper[4901]: I0202 10:42:01.044155 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fbf52890-7a75-4b6d-a042-498725403221-serving-cert\") pod \"route-controller-manager-b8c66cf97-s9g7p\" (UID: \"fbf52890-7a75-4b6d-a042-498725403221\") " pod="openshift-route-controller-manager/route-controller-manager-b8c66cf97-s9g7p" Feb 02 10:42:01 crc kubenswrapper[4901]: I0202 10:42:01.044183 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c42kl\" (UniqueName: \"kubernetes.io/projected/fbf52890-7a75-4b6d-a042-498725403221-kube-api-access-c42kl\") pod \"route-controller-manager-b8c66cf97-s9g7p\" (UID: \"fbf52890-7a75-4b6d-a042-498725403221\") " pod="openshift-route-controller-manager/route-controller-manager-b8c66cf97-s9g7p" Feb 02 10:42:01 crc kubenswrapper[4901]: I0202 10:42:01.044209 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2e5bfe09-6c03-4b1b-af89-5928670990ef-proxy-ca-bundles\") pod \"controller-manager-748b9db64b-cl92q\" (UID: \"2e5bfe09-6c03-4b1b-af89-5928670990ef\") " pod="openshift-controller-manager/controller-manager-748b9db64b-cl92q" Feb 02 10:42:01 crc kubenswrapper[4901]: I0202 10:42:01.044234 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbf52890-7a75-4b6d-a042-498725403221-config\") pod \"route-controller-manager-b8c66cf97-s9g7p\" (UID: \"fbf52890-7a75-4b6d-a042-498725403221\") " pod="openshift-route-controller-manager/route-controller-manager-b8c66cf97-s9g7p" Feb 02 10:42:01 crc kubenswrapper[4901]: I0202 10:42:01.044251 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e5bfe09-6c03-4b1b-af89-5928670990ef-config\") pod \"controller-manager-748b9db64b-cl92q\" (UID: \"2e5bfe09-6c03-4b1b-af89-5928670990ef\") " pod="openshift-controller-manager/controller-manager-748b9db64b-cl92q" Feb 02 10:42:01 crc kubenswrapper[4901]: I0202 10:42:01.044271 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e5bfe09-6c03-4b1b-af89-5928670990ef-client-ca\") pod \"controller-manager-748b9db64b-cl92q\" (UID: \"2e5bfe09-6c03-4b1b-af89-5928670990ef\") " pod="openshift-controller-manager/controller-manager-748b9db64b-cl92q" Feb 02 10:42:01 crc kubenswrapper[4901]: I0202 10:42:01.044295 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/fbf52890-7a75-4b6d-a042-498725403221-client-ca\") pod \"route-controller-manager-b8c66cf97-s9g7p\" (UID: \"fbf52890-7a75-4b6d-a042-498725403221\") " pod="openshift-route-controller-manager/route-controller-manager-b8c66cf97-s9g7p" Feb 02 10:42:01 crc kubenswrapper[4901]: I0202 10:42:01.144922 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fbf52890-7a75-4b6d-a042-498725403221-client-ca\") pod \"route-controller-manager-b8c66cf97-s9g7p\" (UID: \"fbf52890-7a75-4b6d-a042-498725403221\") " pod="openshift-route-controller-manager/route-controller-manager-b8c66cf97-s9g7p" Feb 02 10:42:01 crc kubenswrapper[4901]: I0202 10:42:01.144962 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw9g9\" (UniqueName: \"kubernetes.io/projected/2e5bfe09-6c03-4b1b-af89-5928670990ef-kube-api-access-jw9g9\") pod \"controller-manager-748b9db64b-cl92q\" (UID: \"2e5bfe09-6c03-4b1b-af89-5928670990ef\") " pod="openshift-controller-manager/controller-manager-748b9db64b-cl92q" Feb 02 10:42:01 crc kubenswrapper[4901]: I0202 10:42:01.144996 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e5bfe09-6c03-4b1b-af89-5928670990ef-serving-cert\") pod \"controller-manager-748b9db64b-cl92q\" (UID: \"2e5bfe09-6c03-4b1b-af89-5928670990ef\") " pod="openshift-controller-manager/controller-manager-748b9db64b-cl92q" Feb 02 10:42:01 crc kubenswrapper[4901]: I0202 10:42:01.145013 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fbf52890-7a75-4b6d-a042-498725403221-serving-cert\") pod \"route-controller-manager-b8c66cf97-s9g7p\" (UID: \"fbf52890-7a75-4b6d-a042-498725403221\") " pod="openshift-route-controller-manager/route-controller-manager-b8c66cf97-s9g7p" Feb 02 10:42:01 crc kubenswrapper[4901]: I0202 10:42:01.145042 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c42kl\" (UniqueName: \"kubernetes.io/projected/fbf52890-7a75-4b6d-a042-498725403221-kube-api-access-c42kl\") pod \"route-controller-manager-b8c66cf97-s9g7p\" (UID: \"fbf52890-7a75-4b6d-a042-498725403221\") " pod="openshift-route-controller-manager/route-controller-manager-b8c66cf97-s9g7p" Feb 02 10:42:01 crc kubenswrapper[4901]: I0202 10:42:01.145067 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2e5bfe09-6c03-4b1b-af89-5928670990ef-proxy-ca-bundles\") pod \"controller-manager-748b9db64b-cl92q\" (UID: \"2e5bfe09-6c03-4b1b-af89-5928670990ef\") " pod="openshift-controller-manager/controller-manager-748b9db64b-cl92q" Feb 02 10:42:01 crc kubenswrapper[4901]: I0202 10:42:01.145092 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbf52890-7a75-4b6d-a042-498725403221-config\") pod \"route-controller-manager-b8c66cf97-s9g7p\" (UID: \"fbf52890-7a75-4b6d-a042-498725403221\") " pod="openshift-route-controller-manager/route-controller-manager-b8c66cf97-s9g7p" Feb 02 10:42:01 crc kubenswrapper[4901]: I0202 10:42:01.145111 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e5bfe09-6c03-4b1b-af89-5928670990ef-config\") pod \"controller-manager-748b9db64b-cl92q\" 
(UID: \"2e5bfe09-6c03-4b1b-af89-5928670990ef\") " pod="openshift-controller-manager/controller-manager-748b9db64b-cl92q" Feb 02 10:42:01 crc kubenswrapper[4901]: I0202 10:42:01.145129 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e5bfe09-6c03-4b1b-af89-5928670990ef-client-ca\") pod \"controller-manager-748b9db64b-cl92q\" (UID: \"2e5bfe09-6c03-4b1b-af89-5928670990ef\") " pod="openshift-controller-manager/controller-manager-748b9db64b-cl92q" Feb 02 10:42:01 crc kubenswrapper[4901]: I0202 10:42:01.146542 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e5bfe09-6c03-4b1b-af89-5928670990ef-client-ca\") pod \"controller-manager-748b9db64b-cl92q\" (UID: \"2e5bfe09-6c03-4b1b-af89-5928670990ef\") " pod="openshift-controller-manager/controller-manager-748b9db64b-cl92q" Feb 02 10:42:01 crc kubenswrapper[4901]: I0202 10:42:01.146624 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fbf52890-7a75-4b6d-a042-498725403221-client-ca\") pod \"route-controller-manager-b8c66cf97-s9g7p\" (UID: \"fbf52890-7a75-4b6d-a042-498725403221\") " pod="openshift-route-controller-manager/route-controller-manager-b8c66cf97-s9g7p" Feb 02 10:42:01 crc kubenswrapper[4901]: I0202 10:42:01.146848 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2e5bfe09-6c03-4b1b-af89-5928670990ef-proxy-ca-bundles\") pod \"controller-manager-748b9db64b-cl92q\" (UID: \"2e5bfe09-6c03-4b1b-af89-5928670990ef\") " pod="openshift-controller-manager/controller-manager-748b9db64b-cl92q" Feb 02 10:42:01 crc kubenswrapper[4901]: I0202 10:42:01.147109 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e5bfe09-6c03-4b1b-af89-5928670990ef-config\") pod \"controller-manager-748b9db64b-cl92q\" (UID: \"2e5bfe09-6c03-4b1b-af89-5928670990ef\") " pod="openshift-controller-manager/controller-manager-748b9db64b-cl92q" Feb 02 10:42:01 crc kubenswrapper[4901]: I0202 10:42:01.148269 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbf52890-7a75-4b6d-a042-498725403221-config\") pod \"route-controller-manager-b8c66cf97-s9g7p\" (UID: \"fbf52890-7a75-4b6d-a042-498725403221\") " pod="openshift-route-controller-manager/route-controller-manager-b8c66cf97-s9g7p" Feb 02 10:42:01 crc kubenswrapper[4901]: I0202 10:42:01.150370 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e5bfe09-6c03-4b1b-af89-5928670990ef-serving-cert\") pod \"controller-manager-748b9db64b-cl92q\" (UID: \"2e5bfe09-6c03-4b1b-af89-5928670990ef\") " pod="openshift-controller-manager/controller-manager-748b9db64b-cl92q" Feb 02 10:42:01 crc kubenswrapper[4901]: I0202 10:42:01.150395 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fbf52890-7a75-4b6d-a042-498725403221-serving-cert\") pod \"route-controller-manager-b8c66cf97-s9g7p\" (UID: \"fbf52890-7a75-4b6d-a042-498725403221\") " pod="openshift-route-controller-manager/route-controller-manager-b8c66cf97-s9g7p" Feb 02 10:42:01 crc kubenswrapper[4901]: I0202 10:42:01.160219 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-jw9g9\" (UniqueName: \"kubernetes.io/projected/2e5bfe09-6c03-4b1b-af89-5928670990ef-kube-api-access-jw9g9\") pod \"controller-manager-748b9db64b-cl92q\" (UID: \"2e5bfe09-6c03-4b1b-af89-5928670990ef\") " pod="openshift-controller-manager/controller-manager-748b9db64b-cl92q" Feb 02 10:42:01 crc kubenswrapper[4901]: I0202 10:42:01.163374 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c42kl\" (UniqueName: \"kubernetes.io/projected/fbf52890-7a75-4b6d-a042-498725403221-kube-api-access-c42kl\") pod \"route-controller-manager-b8c66cf97-s9g7p\" (UID: \"fbf52890-7a75-4b6d-a042-498725403221\") " pod="openshift-route-controller-manager/route-controller-manager-b8c66cf97-s9g7p" Feb 02 10:42:01 crc kubenswrapper[4901]: I0202 10:42:01.261668 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-748b9db64b-cl92q" Feb 02 10:42:01 crc kubenswrapper[4901]: I0202 10:42:01.276083 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b8c66cf97-s9g7p" Feb 02 10:42:01 crc kubenswrapper[4901]: I0202 10:42:01.699971 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00a98dfe-865a-4464-b4ac-46c51bd3b550" path="/var/lib/kubelet/pods/00a98dfe-865a-4464-b4ac-46c51bd3b550/volumes" Feb 02 10:42:01 crc kubenswrapper[4901]: I0202 10:42:01.700999 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b08d9fac-2cf7-42ca-b88f-59ae1a35fc89" path="/var/lib/kubelet/pods/b08d9fac-2cf7-42ca-b88f-59ae1a35fc89/volumes" Feb 02 10:42:01 crc kubenswrapper[4901]: I0202 10:42:01.750498 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-748b9db64b-cl92q"] Feb 02 10:42:01 crc kubenswrapper[4901]: W0202 10:42:01.761578 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e5bfe09_6c03_4b1b_af89_5928670990ef.slice/crio-8a576ba638ed54ee063d4d690df2d0fd9301015024c17167a28b1457caf3a8da WatchSource:0}: Error finding container 8a576ba638ed54ee063d4d690df2d0fd9301015024c17167a28b1457caf3a8da: Status 404 returned error can't find the container with id 8a576ba638ed54ee063d4d690df2d0fd9301015024c17167a28b1457caf3a8da Feb 02 10:42:01 crc kubenswrapper[4901]: I0202 10:42:01.773526 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b8c66cf97-s9g7p"] Feb 02 10:42:01 crc kubenswrapper[4901]: W0202 10:42:01.780268 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbf52890_7a75_4b6d_a042_498725403221.slice/crio-a2fdee5a237118cc4e0a013847f8cab3abd39abc623f484515a5447012717144 WatchSource:0}: Error finding container a2fdee5a237118cc4e0a013847f8cab3abd39abc623f484515a5447012717144: Status 404 returned error can't find the container with id a2fdee5a237118cc4e0a013847f8cab3abd39abc623f484515a5447012717144 Feb 02 10:42:02 crc kubenswrapper[4901]: I0202 10:42:02.765828 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-748b9db64b-cl92q" event={"ID":"2e5bfe09-6c03-4b1b-af89-5928670990ef","Type":"ContainerStarted","Data":"529437eb0b13b874bf49f83537b2fbd188b095ad9ba41f0929505508454b7345"} Feb 02 10:42:02 crc kubenswrapper[4901]: I0202 10:42:02.766185 4901 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-748b9db64b-cl92q" Feb 02 10:42:02 crc kubenswrapper[4901]: I0202 10:42:02.766197 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-748b9db64b-cl92q" event={"ID":"2e5bfe09-6c03-4b1b-af89-5928670990ef","Type":"ContainerStarted","Data":"8a576ba638ed54ee063d4d690df2d0fd9301015024c17167a28b1457caf3a8da"} Feb 02 10:42:02 crc kubenswrapper[4901]: I0202 10:42:02.767611 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b8c66cf97-s9g7p" event={"ID":"fbf52890-7a75-4b6d-a042-498725403221","Type":"ContainerStarted","Data":"6bdd5b3bfa16a0ce99e20c7d1d91fc697fec5cf9c740c233f2d758c5160f4a4f"} Feb 02 10:42:02 crc kubenswrapper[4901]: I0202 10:42:02.767672 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b8c66cf97-s9g7p" event={"ID":"fbf52890-7a75-4b6d-a042-498725403221","Type":"ContainerStarted","Data":"a2fdee5a237118cc4e0a013847f8cab3abd39abc623f484515a5447012717144"} Feb 02 10:42:02 crc kubenswrapper[4901]: I0202 10:42:02.768056 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-b8c66cf97-s9g7p" Feb 02 10:42:02 crc kubenswrapper[4901]: I0202 10:42:02.776512 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-748b9db64b-cl92q" Feb 02 10:42:02 crc kubenswrapper[4901]: I0202 10:42:02.777806 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-b8c66cf97-s9g7p" Feb 02 10:42:02 crc kubenswrapper[4901]: I0202 10:42:02.792789 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-748b9db64b-cl92q" podStartSLOduration=3.792761861 podStartE2EDuration="3.792761861s" podCreationTimestamp="2026-02-02 10:41:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:42:02.790076948 +0000 UTC m=+209.808417044" watchObservedRunningTime="2026-02-02 10:42:02.792761861 +0000 UTC m=+209.811101957" Feb 02 10:42:02 crc kubenswrapper[4901]: I0202 10:42:02.811824 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-b8c66cf97-s9g7p" podStartSLOduration=3.811802724 podStartE2EDuration="3.811802724s" podCreationTimestamp="2026-02-02 10:41:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:42:02.808554297 +0000 UTC m=+209.826894413" watchObservedRunningTime="2026-02-02 10:42:02.811802724 +0000 UTC m=+209.830142820" Feb 02 10:42:03 crc kubenswrapper[4901]: I0202 10:42:03.332111 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b9p28" Feb 02 10:42:03 crc kubenswrapper[4901]: I0202 10:42:03.391648 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b9p28" Feb 02 10:42:03 crc kubenswrapper[4901]: I0202 10:42:03.505130 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-79vj8" Feb 02 10:42:03 crc kubenswrapper[4901]: I0202 10:42:03.695004 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-825gs" Feb 02 10:42:03 crc kubenswrapper[4901]: I0202 10:42:03.732532 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-825gs" Feb 02 10:42:05 crc kubenswrapper[4901]: I0202 10:42:05.537326 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-825gs"] Feb 02 10:42:05 crc kubenswrapper[4901]: I0202 10:42:05.537869 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-825gs" podUID="2327a290-e69a-4a8b-b3af-2b1f02819202" containerName="registry-server" containerID="cri-o://ddec9d21fbfa3987b3f03febfa12ccbbbb84a70824618af81ef2240826439c9a" gracePeriod=2 Feb 02 10:42:05 crc kubenswrapper[4901]: I0202 10:42:05.790390 4901 generic.go:334] "Generic (PLEG): container finished" podID="2327a290-e69a-4a8b-b3af-2b1f02819202" containerID="ddec9d21fbfa3987b3f03febfa12ccbbbb84a70824618af81ef2240826439c9a" exitCode=0 Feb 02 10:42:05 crc kubenswrapper[4901]: I0202 10:42:05.790437 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-825gs" event={"ID":"2327a290-e69a-4a8b-b3af-2b1f02819202","Type":"ContainerDied","Data":"ddec9d21fbfa3987b3f03febfa12ccbbbb84a70824618af81ef2240826439c9a"} Feb 02 10:42:06 crc kubenswrapper[4901]: I0202 10:42:06.003017 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-825gs" Feb 02 10:42:06 crc kubenswrapper[4901]: I0202 10:42:06.027571 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wctjb\" (UniqueName: \"kubernetes.io/projected/2327a290-e69a-4a8b-b3af-2b1f02819202-kube-api-access-wctjb\") pod \"2327a290-e69a-4a8b-b3af-2b1f02819202\" (UID: \"2327a290-e69a-4a8b-b3af-2b1f02819202\") " Feb 02 10:42:06 crc kubenswrapper[4901]: I0202 10:42:06.027664 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2327a290-e69a-4a8b-b3af-2b1f02819202-catalog-content\") pod \"2327a290-e69a-4a8b-b3af-2b1f02819202\" (UID: \"2327a290-e69a-4a8b-b3af-2b1f02819202\") " Feb 02 10:42:06 crc kubenswrapper[4901]: I0202 10:42:06.027696 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2327a290-e69a-4a8b-b3af-2b1f02819202-utilities\") pod \"2327a290-e69a-4a8b-b3af-2b1f02819202\" (UID: \"2327a290-e69a-4a8b-b3af-2b1f02819202\") " Feb 02 10:42:06 crc kubenswrapper[4901]: I0202 10:42:06.029133 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2327a290-e69a-4a8b-b3af-2b1f02819202-utilities" (OuterVolumeSpecName: "utilities") pod "2327a290-e69a-4a8b-b3af-2b1f02819202" (UID: "2327a290-e69a-4a8b-b3af-2b1f02819202"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:42:06 crc kubenswrapper[4901]: I0202 10:42:06.040980 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2327a290-e69a-4a8b-b3af-2b1f02819202-kube-api-access-wctjb" (OuterVolumeSpecName: "kube-api-access-wctjb") pod "2327a290-e69a-4a8b-b3af-2b1f02819202" (UID: "2327a290-e69a-4a8b-b3af-2b1f02819202"). InnerVolumeSpecName "kube-api-access-wctjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:42:06 crc kubenswrapper[4901]: I0202 10:42:06.080477 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2327a290-e69a-4a8b-b3af-2b1f02819202-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2327a290-e69a-4a8b-b3af-2b1f02819202" (UID: "2327a290-e69a-4a8b-b3af-2b1f02819202"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:42:06 crc kubenswrapper[4901]: I0202 10:42:06.129838 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wctjb\" (UniqueName: \"kubernetes.io/projected/2327a290-e69a-4a8b-b3af-2b1f02819202-kube-api-access-wctjb\") on node \"crc\" DevicePath \"\"" Feb 02 10:42:06 crc kubenswrapper[4901]: I0202 10:42:06.129873 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2327a290-e69a-4a8b-b3af-2b1f02819202-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:42:06 crc kubenswrapper[4901]: I0202 10:42:06.129888 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2327a290-e69a-4a8b-b3af-2b1f02819202-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:42:06 crc kubenswrapper[4901]: I0202 10:42:06.799933 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-825gs" event={"ID":"2327a290-e69a-4a8b-b3af-2b1f02819202","Type":"ContainerDied","Data":"8511e6faae208bae23cd354c7cba2f9747f2bf6d9f0ee657f5506537281c2caf"} Feb 02 10:42:06 crc kubenswrapper[4901]: I0202 10:42:06.800005 4901 scope.go:117] "RemoveContainer" containerID="ddec9d21fbfa3987b3f03febfa12ccbbbb84a70824618af81ef2240826439c9a" Feb 02 10:42:06 crc kubenswrapper[4901]: I0202 10:42:06.800017 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-825gs" Feb 02 10:42:06 crc kubenswrapper[4901]: I0202 10:42:06.836732 4901 scope.go:117] "RemoveContainer" containerID="063e4e065580b9300e4badee78f85d84ae855c5ee8538c639ac525454a2fe470" Feb 02 10:42:06 crc kubenswrapper[4901]: I0202 10:42:06.836858 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-825gs"] Feb 02 10:42:06 crc kubenswrapper[4901]: I0202 10:42:06.840707 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-825gs"] Feb 02 10:42:06 crc kubenswrapper[4901]: I0202 10:42:06.856683 4901 scope.go:117] "RemoveContainer" containerID="49bb46b9a34f7b014db97c50d9c33660947609a2fefe4de013ad9c9ef02e27e6" Feb 02 10:42:07 crc kubenswrapper[4901]: I0202 10:42:07.687292 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2327a290-e69a-4a8b-b3af-2b1f02819202" path="/var/lib/kubelet/pods/2327a290-e69a-4a8b-b3af-2b1f02819202/volumes" Feb 02 10:42:07 crc kubenswrapper[4901]: I0202 10:42:07.837755 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:42:07 crc kubenswrapper[4901]: I0202 10:42:07.837862 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:42:07 crc kubenswrapper[4901]: I0202 10:42:07.837921 4901 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" Feb 02 10:42:07 crc kubenswrapper[4901]: I0202 10:42:07.838697 4901 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4b772f8291c7ba36ce56d88dfbde16bf9344276419544da4c966fb2bbee6e04d"} pod="openshift-machine-config-operator/machine-config-daemon-f29d8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 10:42:07 crc kubenswrapper[4901]: I0202 10:42:07.838777 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" containerID="cri-o://4b772f8291c7ba36ce56d88dfbde16bf9344276419544da4c966fb2bbee6e04d" gracePeriod=600 Feb 02 10:42:08 crc kubenswrapper[4901]: I0202 10:42:08.814335 4901 generic.go:334] "Generic (PLEG): container finished" podID="756c113d-5d5e-424e-bdf5-494b7774def6" containerID="4b772f8291c7ba36ce56d88dfbde16bf9344276419544da4c966fb2bbee6e04d" exitCode=0 Feb 02 10:42:08 crc kubenswrapper[4901]: I0202 10:42:08.814407 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" event={"ID":"756c113d-5d5e-424e-bdf5-494b7774def6","Type":"ContainerDied","Data":"4b772f8291c7ba36ce56d88dfbde16bf9344276419544da4c966fb2bbee6e04d"} Feb 02 10:42:08 crc kubenswrapper[4901]: I0202 10:42:08.814890 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-f29d8" event={"ID":"756c113d-5d5e-424e-bdf5-494b7774def6","Type":"ContainerStarted","Data":"cbeddfab711d3ba3246547960da089ade81e7d31bea07298d153502257c2da4a"} Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.019801 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" podUID="93a1ed7b-a791-4fb9-b02b-8280b107789a" containerName="oauth-openshift" containerID="cri-o://e9e477df56be8955ab9401d0b9b4ac5cd805eb28e32330665f7031b3bf48b6c2" gracePeriod=15 Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.488986 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.571013 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-user-template-login\") pod \"93a1ed7b-a791-4fb9-b02b-8280b107789a\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.571293 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-system-ocp-branding-template\") pod \"93a1ed7b-a791-4fb9-b02b-8280b107789a\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.571342 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-system-serving-cert\") pod \"93a1ed7b-a791-4fb9-b02b-8280b107789a\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.571368 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fbqm\" (UniqueName: \"kubernetes.io/projected/93a1ed7b-a791-4fb9-b02b-8280b107789a-kube-api-access-2fbqm\") pod \"93a1ed7b-a791-4fb9-b02b-8280b107789a\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.571419 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-system-router-certs\") pod \"93a1ed7b-a791-4fb9-b02b-8280b107789a\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.571436 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-system-session\") pod \"93a1ed7b-a791-4fb9-b02b-8280b107789a\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.571458 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-user-template-provider-selection\") pod \"93a1ed7b-a791-4fb9-b02b-8280b107789a\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " Feb 02 10:42:09 crc 
kubenswrapper[4901]: I0202 10:42:09.571505 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-system-trusted-ca-bundle\") pod \"93a1ed7b-a791-4fb9-b02b-8280b107789a\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.571580 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/93a1ed7b-a791-4fb9-b02b-8280b107789a-audit-dir\") pod \"93a1ed7b-a791-4fb9-b02b-8280b107789a\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.571603 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-system-cliconfig\") pod \"93a1ed7b-a791-4fb9-b02b-8280b107789a\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.571666 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-user-template-error\") pod \"93a1ed7b-a791-4fb9-b02b-8280b107789a\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.571696 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-system-service-ca\") pod \"93a1ed7b-a791-4fb9-b02b-8280b107789a\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.571752 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/93a1ed7b-a791-4fb9-b02b-8280b107789a-audit-policies\") pod \"93a1ed7b-a791-4fb9-b02b-8280b107789a\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.571691 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/93a1ed7b-a791-4fb9-b02b-8280b107789a-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "93a1ed7b-a791-4fb9-b02b-8280b107789a" (UID: "93a1ed7b-a791-4fb9-b02b-8280b107789a"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.571779 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-user-idp-0-file-data\") pod \"93a1ed7b-a791-4fb9-b02b-8280b107789a\" (UID: \"93a1ed7b-a791-4fb9-b02b-8280b107789a\") " Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.572068 4901 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/93a1ed7b-a791-4fb9-b02b-8280b107789a-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.572475 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "93a1ed7b-a791-4fb9-b02b-8280b107789a" (UID: "93a1ed7b-a791-4fb9-b02b-8280b107789a"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.572527 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93a1ed7b-a791-4fb9-b02b-8280b107789a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "93a1ed7b-a791-4fb9-b02b-8280b107789a" (UID: "93a1ed7b-a791-4fb9-b02b-8280b107789a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.572630 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "93a1ed7b-a791-4fb9-b02b-8280b107789a" (UID: "93a1ed7b-a791-4fb9-b02b-8280b107789a"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.572770 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "93a1ed7b-a791-4fb9-b02b-8280b107789a" (UID: "93a1ed7b-a791-4fb9-b02b-8280b107789a"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.577510 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "93a1ed7b-a791-4fb9-b02b-8280b107789a" (UID: "93a1ed7b-a791-4fb9-b02b-8280b107789a"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.577632 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "93a1ed7b-a791-4fb9-b02b-8280b107789a" (UID: "93a1ed7b-a791-4fb9-b02b-8280b107789a"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.578682 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "93a1ed7b-a791-4fb9-b02b-8280b107789a" (UID: "93a1ed7b-a791-4fb9-b02b-8280b107789a"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.578915 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "93a1ed7b-a791-4fb9-b02b-8280b107789a" (UID: "93a1ed7b-a791-4fb9-b02b-8280b107789a"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.579472 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "93a1ed7b-a791-4fb9-b02b-8280b107789a" (UID: "93a1ed7b-a791-4fb9-b02b-8280b107789a"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.580022 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93a1ed7b-a791-4fb9-b02b-8280b107789a-kube-api-access-2fbqm" (OuterVolumeSpecName: "kube-api-access-2fbqm") pod "93a1ed7b-a791-4fb9-b02b-8280b107789a" (UID: "93a1ed7b-a791-4fb9-b02b-8280b107789a"). InnerVolumeSpecName "kube-api-access-2fbqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.581721 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "93a1ed7b-a791-4fb9-b02b-8280b107789a" (UID: "93a1ed7b-a791-4fb9-b02b-8280b107789a"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.583386 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "93a1ed7b-a791-4fb9-b02b-8280b107789a" (UID: "93a1ed7b-a791-4fb9-b02b-8280b107789a"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.584230 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "93a1ed7b-a791-4fb9-b02b-8280b107789a" (UID: "93a1ed7b-a791-4fb9-b02b-8280b107789a"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.674148 4901 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.674182 4901 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.674196 4901 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.674206 4901 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/93a1ed7b-a791-4fb9-b02b-8280b107789a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.674219 4901 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.674231 4901 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.674243 4901 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.674252 4901 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.674262 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fbqm\" (UniqueName: \"kubernetes.io/projected/93a1ed7b-a791-4fb9-b02b-8280b107789a-kube-api-access-2fbqm\") on node \"crc\" DevicePath \"\"" Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.674272 4901 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.674281 4901 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.674313 4901 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.674324 4901 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93a1ed7b-a791-4fb9-b02b-8280b107789a-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.822174 4901 generic.go:334] "Generic (PLEG): container finished" podID="93a1ed7b-a791-4fb9-b02b-8280b107789a" containerID="e9e477df56be8955ab9401d0b9b4ac5cd805eb28e32330665f7031b3bf48b6c2" exitCode=0 Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.822231 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" event={"ID":"93a1ed7b-a791-4fb9-b02b-8280b107789a","Type":"ContainerDied","Data":"e9e477df56be8955ab9401d0b9b4ac5cd805eb28e32330665f7031b3bf48b6c2"} Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.822258 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.822270 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-tsxn7" event={"ID":"93a1ed7b-a791-4fb9-b02b-8280b107789a","Type":"ContainerDied","Data":"8fe150db1f598b0222b74f861d86824cc693c5f750a4f9535f2f6b3aedcf951b"} Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.822293 4901 scope.go:117] "RemoveContainer" containerID="e9e477df56be8955ab9401d0b9b4ac5cd805eb28e32330665f7031b3bf48b6c2" Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.840508 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tsxn7"] Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.843999 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tsxn7"] Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.848383 4901 scope.go:117] "RemoveContainer" containerID="e9e477df56be8955ab9401d0b9b4ac5cd805eb28e32330665f7031b3bf48b6c2" Feb 02 10:42:09 crc kubenswrapper[4901]: E0202 10:42:09.849012 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9e477df56be8955ab9401d0b9b4ac5cd805eb28e32330665f7031b3bf48b6c2\": container with ID starting with e9e477df56be8955ab9401d0b9b4ac5cd805eb28e32330665f7031b3bf48b6c2 not found: ID does not exist" containerID="e9e477df56be8955ab9401d0b9b4ac5cd805eb28e32330665f7031b3bf48b6c2" Feb 02 10:42:09 crc kubenswrapper[4901]: I0202 10:42:09.849057 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9e477df56be8955ab9401d0b9b4ac5cd805eb28e32330665f7031b3bf48b6c2"} err="failed to get container status \"e9e477df56be8955ab9401d0b9b4ac5cd805eb28e32330665f7031b3bf48b6c2\": rpc error: code = NotFound desc = could not find container \"e9e477df56be8955ab9401d0b9b4ac5cd805eb28e32330665f7031b3bf48b6c2\": container with ID starting with e9e477df56be8955ab9401d0b9b4ac5cd805eb28e32330665f7031b3bf48b6c2 not found: ID does not exist" Feb 02 10:42:11 crc kubenswrapper[4901]: I0202 10:42:11.687517 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93a1ed7b-a791-4fb9-b02b-8280b107789a" 
path="/var/lib/kubelet/pods/93a1ed7b-a791-4fb9-b02b-8280b107789a/volumes" Feb 02 10:42:11 crc kubenswrapper[4901]: I0202 10:42:11.955292 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6"] Feb 02 10:42:11 crc kubenswrapper[4901]: E0202 10:42:11.955680 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2327a290-e69a-4a8b-b3af-2b1f02819202" containerName="extract-content" Feb 02 10:42:11 crc kubenswrapper[4901]: I0202 10:42:11.955702 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="2327a290-e69a-4a8b-b3af-2b1f02819202" containerName="extract-content" Feb 02 10:42:11 crc kubenswrapper[4901]: E0202 10:42:11.955724 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2327a290-e69a-4a8b-b3af-2b1f02819202" containerName="extract-utilities" Feb 02 10:42:11 crc kubenswrapper[4901]: I0202 10:42:11.955737 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="2327a290-e69a-4a8b-b3af-2b1f02819202" containerName="extract-utilities" Feb 02 10:42:11 crc kubenswrapper[4901]: E0202 10:42:11.955753 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2327a290-e69a-4a8b-b3af-2b1f02819202" containerName="registry-server" Feb 02 10:42:11 crc kubenswrapper[4901]: I0202 10:42:11.955766 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="2327a290-e69a-4a8b-b3af-2b1f02819202" containerName="registry-server" Feb 02 10:42:11 crc kubenswrapper[4901]: E0202 10:42:11.955792 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93a1ed7b-a791-4fb9-b02b-8280b107789a" containerName="oauth-openshift" Feb 02 10:42:11 crc kubenswrapper[4901]: I0202 10:42:11.955811 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="93a1ed7b-a791-4fb9-b02b-8280b107789a" containerName="oauth-openshift" Feb 02 10:42:11 crc kubenswrapper[4901]: I0202 10:42:11.956102 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="2327a290-e69a-4a8b-b3af-2b1f02819202" containerName="registry-server" Feb 02 10:42:11 crc kubenswrapper[4901]: I0202 10:42:11.956133 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="93a1ed7b-a791-4fb9-b02b-8280b107789a" containerName="oauth-openshift" Feb 02 10:42:11 crc kubenswrapper[4901]: I0202 10:42:11.956756 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:11 crc kubenswrapper[4901]: I0202 10:42:11.964373 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 02 10:42:11 crc kubenswrapper[4901]: I0202 10:42:11.964478 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 02 10:42:11 crc kubenswrapper[4901]: I0202 10:42:11.964488 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 02 10:42:11 crc kubenswrapper[4901]: I0202 10:42:11.964540 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 02 10:42:11 crc kubenswrapper[4901]: I0202 10:42:11.964384 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 02 10:42:11 crc kubenswrapper[4901]: I0202 10:42:11.964703 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 02 10:42:11 crc kubenswrapper[4901]: I0202 10:42:11.964385 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 02 10:42:11 crc kubenswrapper[4901]: I0202 10:42:11.968253 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 02 10:42:11 crc kubenswrapper[4901]: I0202 10:42:11.969320 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 02 10:42:11 crc kubenswrapper[4901]: I0202 10:42:11.975371 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 02 10:42:11 crc kubenswrapper[4901]: I0202 10:42:11.977165 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 02 10:42:11 crc kubenswrapper[4901]: I0202 10:42:11.977659 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 02 10:42:11 crc kubenswrapper[4901]: I0202 10:42:11.984875 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6"] Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.001534 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.011814 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ff01f112-5fac-4d88-9a76-c1f9eec6ffd6-v4-0-config-system-service-ca\") pod \"oauth-openshift-6d4bd77db6-vfkw6\" (UID: \"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6\") " pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.011920 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ff01f112-5fac-4d88-9a76-c1f9eec6ffd6-v4-0-config-system-router-certs\") pod \"oauth-openshift-6d4bd77db6-vfkw6\" (UID: 
\"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6\") " pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.012016 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ff01f112-5fac-4d88-9a76-c1f9eec6ffd6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6d4bd77db6-vfkw6\" (UID: \"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6\") " pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.012076 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ff01f112-5fac-4d88-9a76-c1f9eec6ffd6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6d4bd77db6-vfkw6\" (UID: \"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6\") " pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.012125 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff01f112-5fac-4d88-9a76-c1f9eec6ffd6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6d4bd77db6-vfkw6\" (UID: \"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6\") " pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.012169 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ff01f112-5fac-4d88-9a76-c1f9eec6ffd6-v4-0-config-user-template-login\") pod \"oauth-openshift-6d4bd77db6-vfkw6\" (UID: \"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6\") " pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.012248 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5v9v\" (UniqueName: \"kubernetes.io/projected/ff01f112-5fac-4d88-9a76-c1f9eec6ffd6-kube-api-access-h5v9v\") pod \"oauth-openshift-6d4bd77db6-vfkw6\" (UID: \"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6\") " pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.012332 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff01f112-5fac-4d88-9a76-c1f9eec6ffd6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6d4bd77db6-vfkw6\" (UID: \"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6\") " pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.012385 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ff01f112-5fac-4d88-9a76-c1f9eec6ffd6-audit-policies\") pod \"oauth-openshift-6d4bd77db6-vfkw6\" (UID: \"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6\") " pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.012473 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/ff01f112-5fac-4d88-9a76-c1f9eec6ffd6-v4-0-config-user-template-error\") pod \"oauth-openshift-6d4bd77db6-vfkw6\" (UID: \"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6\") " pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.023805 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ff01f112-5fac-4d88-9a76-c1f9eec6ffd6-audit-dir\") pod \"oauth-openshift-6d4bd77db6-vfkw6\" (UID: \"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6\") " pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.023954 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ff01f112-5fac-4d88-9a76-c1f9eec6ffd6-v4-0-config-system-session\") pod \"oauth-openshift-6d4bd77db6-vfkw6\" (UID: \"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6\") " pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.024004 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ff01f112-5fac-4d88-9a76-c1f9eec6ffd6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6d4bd77db6-vfkw6\" (UID: \"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6\") " pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.024067 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ff01f112-5fac-4d88-9a76-c1f9eec6ffd6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6d4bd77db6-vfkw6\" (UID: \"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6\") " pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.024476 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.032416 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.125913 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ff01f112-5fac-4d88-9a76-c1f9eec6ffd6-v4-0-config-user-template-error\") pod \"oauth-openshift-6d4bd77db6-vfkw6\" (UID: \"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6\") " pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.125966 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ff01f112-5fac-4d88-9a76-c1f9eec6ffd6-audit-dir\") pod \"oauth-openshift-6d4bd77db6-vfkw6\" (UID: \"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6\") " pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.125992 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/ff01f112-5fac-4d88-9a76-c1f9eec6ffd6-v4-0-config-system-session\") pod \"oauth-openshift-6d4bd77db6-vfkw6\" (UID: \"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6\") " pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.126015 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ff01f112-5fac-4d88-9a76-c1f9eec6ffd6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6d4bd77db6-vfkw6\" (UID: \"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6\") " pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.126042 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ff01f112-5fac-4d88-9a76-c1f9eec6ffd6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6d4bd77db6-vfkw6\" (UID: \"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6\") " pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.126077 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ff01f112-5fac-4d88-9a76-c1f9eec6ffd6-v4-0-config-system-service-ca\") pod \"oauth-openshift-6d4bd77db6-vfkw6\" (UID: \"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6\") " pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.126104 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ff01f112-5fac-4d88-9a76-c1f9eec6ffd6-v4-0-config-system-router-certs\") pod \"oauth-openshift-6d4bd77db6-vfkw6\" (UID: \"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6\") " pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.126152 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ff01f112-5fac-4d88-9a76-c1f9eec6ffd6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6d4bd77db6-vfkw6\" (UID: \"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6\") " pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.126181 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ff01f112-5fac-4d88-9a76-c1f9eec6ffd6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6d4bd77db6-vfkw6\" (UID: \"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6\") " pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.126187 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ff01f112-5fac-4d88-9a76-c1f9eec6ffd6-audit-dir\") pod \"oauth-openshift-6d4bd77db6-vfkw6\" (UID: \"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6\") " pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.126213 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/ff01f112-5fac-4d88-9a76-c1f9eec6ffd6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6d4bd77db6-vfkw6\" (UID: \"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6\") " pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.128371 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ff01f112-5fac-4d88-9a76-c1f9eec6ffd6-v4-0-config-user-template-login\") pod \"oauth-openshift-6d4bd77db6-vfkw6\" (UID: \"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6\") " pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.128484 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5v9v\" (UniqueName: \"kubernetes.io/projected/ff01f112-5fac-4d88-9a76-c1f9eec6ffd6-kube-api-access-h5v9v\") pod \"oauth-openshift-6d4bd77db6-vfkw6\" (UID: \"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6\") " pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.128634 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff01f112-5fac-4d88-9a76-c1f9eec6ffd6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6d4bd77db6-vfkw6\" (UID: \"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6\") " pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.128712 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ff01f112-5fac-4d88-9a76-c1f9eec6ffd6-audit-policies\") pod \"oauth-openshift-6d4bd77db6-vfkw6\" (UID: \"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6\") " pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.129164 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ff01f112-5fac-4d88-9a76-c1f9eec6ffd6-v4-0-config-system-service-ca\") pod \"oauth-openshift-6d4bd77db6-vfkw6\" (UID: \"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6\") " pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.129319 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ff01f112-5fac-4d88-9a76-c1f9eec6ffd6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6d4bd77db6-vfkw6\" (UID: \"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6\") " pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.129924 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ff01f112-5fac-4d88-9a76-c1f9eec6ffd6-audit-policies\") pod \"oauth-openshift-6d4bd77db6-vfkw6\" (UID: \"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6\") " pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.130634 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff01f112-5fac-4d88-9a76-c1f9eec6ffd6-v4-0-config-system-trusted-ca-bundle\") pod 
\"oauth-openshift-6d4bd77db6-vfkw6\" (UID: \"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6\") " pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.136386 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff01f112-5fac-4d88-9a76-c1f9eec6ffd6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6d4bd77db6-vfkw6\" (UID: \"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6\") " pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.136421 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ff01f112-5fac-4d88-9a76-c1f9eec6ffd6-v4-0-config-user-template-error\") pod \"oauth-openshift-6d4bd77db6-vfkw6\" (UID: \"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6\") " pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.136964 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ff01f112-5fac-4d88-9a76-c1f9eec6ffd6-v4-0-config-user-template-login\") pod \"oauth-openshift-6d4bd77db6-vfkw6\" (UID: \"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6\") " pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.138785 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ff01f112-5fac-4d88-9a76-c1f9eec6ffd6-v4-0-config-system-session\") pod \"oauth-openshift-6d4bd77db6-vfkw6\" (UID: \"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6\") " pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.139157 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ff01f112-5fac-4d88-9a76-c1f9eec6ffd6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6d4bd77db6-vfkw6\" (UID: \"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6\") " pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.139957 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ff01f112-5fac-4d88-9a76-c1f9eec6ffd6-v4-0-config-system-router-certs\") pod \"oauth-openshift-6d4bd77db6-vfkw6\" (UID: \"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6\") " pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.141078 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ff01f112-5fac-4d88-9a76-c1f9eec6ffd6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6d4bd77db6-vfkw6\" (UID: \"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6\") " pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.146704 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ff01f112-5fac-4d88-9a76-c1f9eec6ffd6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6d4bd77db6-vfkw6\" 
(UID: \"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6\") " pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.153115 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5v9v\" (UniqueName: \"kubernetes.io/projected/ff01f112-5fac-4d88-9a76-c1f9eec6ffd6-kube-api-access-h5v9v\") pod \"oauth-openshift-6d4bd77db6-vfkw6\" (UID: \"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6\") " pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.295744 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.784994 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6"] Feb 02 10:42:12 crc kubenswrapper[4901]: W0202 10:42:12.797796 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff01f112_5fac_4d88_9a76_c1f9eec6ffd6.slice/crio-47f4c788d2b9984f0c29704bb3e44403c470acbc19558add76d32e1a65405211 WatchSource:0}: Error finding container 47f4c788d2b9984f0c29704bb3e44403c470acbc19558add76d32e1a65405211: Status 404 returned error can't find the container with id 47f4c788d2b9984f0c29704bb3e44403c470acbc19558add76d32e1a65405211 Feb 02 10:42:12 crc kubenswrapper[4901]: I0202 10:42:12.849109 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" event={"ID":"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6","Type":"ContainerStarted","Data":"47f4c788d2b9984f0c29704bb3e44403c470acbc19558add76d32e1a65405211"} Feb 02 10:42:13 crc kubenswrapper[4901]: I0202 10:42:13.860046 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" event={"ID":"ff01f112-5fac-4d88-9a76-c1f9eec6ffd6","Type":"ContainerStarted","Data":"6b74c24238f40ad1472bfd97b20dd6a6e9ff8b3744dcd11bab523fa06bd0c986"} Feb 02 10:42:13 crc kubenswrapper[4901]: I0202 10:42:13.860857 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:13 crc kubenswrapper[4901]: I0202 10:42:13.868531 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" Feb 02 10:42:13 crc kubenswrapper[4901]: I0202 10:42:13.953243 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6d4bd77db6-vfkw6" podStartSLOduration=29.953190534 podStartE2EDuration="29.953190534s" podCreationTimestamp="2026-02-02 10:41:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:42:13.904122128 +0000 UTC m=+220.922462314" watchObservedRunningTime="2026-02-02 10:42:13.953190534 +0000 UTC m=+220.971530670" Feb 02 10:42:19 crc kubenswrapper[4901]: I0202 10:42:19.761611 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-748b9db64b-cl92q"] Feb 02 10:42:19 crc kubenswrapper[4901]: I0202 10:42:19.762889 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-748b9db64b-cl92q" 
podUID="2e5bfe09-6c03-4b1b-af89-5928670990ef" containerName="controller-manager" containerID="cri-o://529437eb0b13b874bf49f83537b2fbd188b095ad9ba41f0929505508454b7345" gracePeriod=30 Feb 02 10:42:19 crc kubenswrapper[4901]: I0202 10:42:19.867065 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b8c66cf97-s9g7p"] Feb 02 10:42:19 crc kubenswrapper[4901]: I0202 10:42:19.867326 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-b8c66cf97-s9g7p" podUID="fbf52890-7a75-4b6d-a042-498725403221" containerName="route-controller-manager" containerID="cri-o://6bdd5b3bfa16a0ce99e20c7d1d91fc697fec5cf9c740c233f2d758c5160f4a4f" gracePeriod=30 Feb 02 10:42:19 crc kubenswrapper[4901]: I0202 10:42:19.918933 4901 generic.go:334] "Generic (PLEG): container finished" podID="2e5bfe09-6c03-4b1b-af89-5928670990ef" containerID="529437eb0b13b874bf49f83537b2fbd188b095ad9ba41f0929505508454b7345" exitCode=0 Feb 02 10:42:19 crc kubenswrapper[4901]: I0202 10:42:19.918990 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-748b9db64b-cl92q" event={"ID":"2e5bfe09-6c03-4b1b-af89-5928670990ef","Type":"ContainerDied","Data":"529437eb0b13b874bf49f83537b2fbd188b095ad9ba41f0929505508454b7345"} Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.370387 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b8c66cf97-s9g7p" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.421523 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-748b9db64b-cl92q" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.563059 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbf52890-7a75-4b6d-a042-498725403221-config\") pod \"fbf52890-7a75-4b6d-a042-498725403221\" (UID: \"fbf52890-7a75-4b6d-a042-498725403221\") " Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.563111 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fbf52890-7a75-4b6d-a042-498725403221-serving-cert\") pod \"fbf52890-7a75-4b6d-a042-498725403221\" (UID: \"fbf52890-7a75-4b6d-a042-498725403221\") " Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.563132 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fbf52890-7a75-4b6d-a042-498725403221-client-ca\") pod \"fbf52890-7a75-4b6d-a042-498725403221\" (UID: \"fbf52890-7a75-4b6d-a042-498725403221\") " Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.563191 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e5bfe09-6c03-4b1b-af89-5928670990ef-serving-cert\") pod \"2e5bfe09-6c03-4b1b-af89-5928670990ef\" (UID: \"2e5bfe09-6c03-4b1b-af89-5928670990ef\") " Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.563234 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jw9g9\" (UniqueName: \"kubernetes.io/projected/2e5bfe09-6c03-4b1b-af89-5928670990ef-kube-api-access-jw9g9\") pod \"2e5bfe09-6c03-4b1b-af89-5928670990ef\" (UID: 
\"2e5bfe09-6c03-4b1b-af89-5928670990ef\") " Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.563257 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e5bfe09-6c03-4b1b-af89-5928670990ef-config\") pod \"2e5bfe09-6c03-4b1b-af89-5928670990ef\" (UID: \"2e5bfe09-6c03-4b1b-af89-5928670990ef\") " Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.563275 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e5bfe09-6c03-4b1b-af89-5928670990ef-client-ca\") pod \"2e5bfe09-6c03-4b1b-af89-5928670990ef\" (UID: \"2e5bfe09-6c03-4b1b-af89-5928670990ef\") " Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.563307 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c42kl\" (UniqueName: \"kubernetes.io/projected/fbf52890-7a75-4b6d-a042-498725403221-kube-api-access-c42kl\") pod \"fbf52890-7a75-4b6d-a042-498725403221\" (UID: \"fbf52890-7a75-4b6d-a042-498725403221\") " Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.563382 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2e5bfe09-6c03-4b1b-af89-5928670990ef-proxy-ca-bundles\") pod \"2e5bfe09-6c03-4b1b-af89-5928670990ef\" (UID: \"2e5bfe09-6c03-4b1b-af89-5928670990ef\") " Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.564787 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e5bfe09-6c03-4b1b-af89-5928670990ef-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2e5bfe09-6c03-4b1b-af89-5928670990ef" (UID: "2e5bfe09-6c03-4b1b-af89-5928670990ef"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.564885 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e5bfe09-6c03-4b1b-af89-5928670990ef-client-ca" (OuterVolumeSpecName: "client-ca") pod "2e5bfe09-6c03-4b1b-af89-5928670990ef" (UID: "2e5bfe09-6c03-4b1b-af89-5928670990ef"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.564944 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e5bfe09-6c03-4b1b-af89-5928670990ef-config" (OuterVolumeSpecName: "config") pod "2e5bfe09-6c03-4b1b-af89-5928670990ef" (UID: "2e5bfe09-6c03-4b1b-af89-5928670990ef"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.565484 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbf52890-7a75-4b6d-a042-498725403221-config" (OuterVolumeSpecName: "config") pod "fbf52890-7a75-4b6d-a042-498725403221" (UID: "fbf52890-7a75-4b6d-a042-498725403221"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.565616 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbf52890-7a75-4b6d-a042-498725403221-client-ca" (OuterVolumeSpecName: "client-ca") pod "fbf52890-7a75-4b6d-a042-498725403221" (UID: "fbf52890-7a75-4b6d-a042-498725403221"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.570605 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e5bfe09-6c03-4b1b-af89-5928670990ef-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2e5bfe09-6c03-4b1b-af89-5928670990ef" (UID: "2e5bfe09-6c03-4b1b-af89-5928670990ef"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.572310 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbf52890-7a75-4b6d-a042-498725403221-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fbf52890-7a75-4b6d-a042-498725403221" (UID: "fbf52890-7a75-4b6d-a042-498725403221"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.572725 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e5bfe09-6c03-4b1b-af89-5928670990ef-kube-api-access-jw9g9" (OuterVolumeSpecName: "kube-api-access-jw9g9") pod "2e5bfe09-6c03-4b1b-af89-5928670990ef" (UID: "2e5bfe09-6c03-4b1b-af89-5928670990ef"). InnerVolumeSpecName "kube-api-access-jw9g9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.572753 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbf52890-7a75-4b6d-a042-498725403221-kube-api-access-c42kl" (OuterVolumeSpecName: "kube-api-access-c42kl") pod "fbf52890-7a75-4b6d-a042-498725403221" (UID: "fbf52890-7a75-4b6d-a042-498725403221"). InnerVolumeSpecName "kube-api-access-c42kl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.665960 4901 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2e5bfe09-6c03-4b1b-af89-5928670990ef-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.666034 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbf52890-7a75-4b6d-a042-498725403221-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.666059 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fbf52890-7a75-4b6d-a042-498725403221-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.666077 4901 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fbf52890-7a75-4b6d-a042-498725403221-client-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.666095 4901 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e5bfe09-6c03-4b1b-af89-5928670990ef-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.666119 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jw9g9\" (UniqueName: \"kubernetes.io/projected/2e5bfe09-6c03-4b1b-af89-5928670990ef-kube-api-access-jw9g9\") on node \"crc\" DevicePath \"\"" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.666144 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e5bfe09-6c03-4b1b-af89-5928670990ef-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.666173 4901 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e5bfe09-6c03-4b1b-af89-5928670990ef-client-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.666196 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c42kl\" (UniqueName: \"kubernetes.io/projected/fbf52890-7a75-4b6d-a042-498725403221-kube-api-access-c42kl\") on node \"crc\" DevicePath \"\"" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.929427 4901 generic.go:334] "Generic (PLEG): container finished" podID="fbf52890-7a75-4b6d-a042-498725403221" containerID="6bdd5b3bfa16a0ce99e20c7d1d91fc697fec5cf9c740c233f2d758c5160f4a4f" exitCode=0 Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.929507 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b8c66cf97-s9g7p" event={"ID":"fbf52890-7a75-4b6d-a042-498725403221","Type":"ContainerDied","Data":"6bdd5b3bfa16a0ce99e20c7d1d91fc697fec5cf9c740c233f2d758c5160f4a4f"} Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.929539 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b8c66cf97-s9g7p" event={"ID":"fbf52890-7a75-4b6d-a042-498725403221","Type":"ContainerDied","Data":"a2fdee5a237118cc4e0a013847f8cab3abd39abc623f484515a5447012717144"} Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.929575 4901 scope.go:117] "RemoveContainer" 
containerID="6bdd5b3bfa16a0ce99e20c7d1d91fc697fec5cf9c740c233f2d758c5160f4a4f" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.929627 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b8c66cf97-s9g7p" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.931884 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-748b9db64b-cl92q" event={"ID":"2e5bfe09-6c03-4b1b-af89-5928670990ef","Type":"ContainerDied","Data":"8a576ba638ed54ee063d4d690df2d0fd9301015024c17167a28b1457caf3a8da"} Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.931921 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-748b9db64b-cl92q" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.950775 4901 scope.go:117] "RemoveContainer" containerID="6bdd5b3bfa16a0ce99e20c7d1d91fc697fec5cf9c740c233f2d758c5160f4a4f" Feb 02 10:42:20 crc kubenswrapper[4901]: E0202 10:42:20.951801 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bdd5b3bfa16a0ce99e20c7d1d91fc697fec5cf9c740c233f2d758c5160f4a4f\": container with ID starting with 6bdd5b3bfa16a0ce99e20c7d1d91fc697fec5cf9c740c233f2d758c5160f4a4f not found: ID does not exist" containerID="6bdd5b3bfa16a0ce99e20c7d1d91fc697fec5cf9c740c233f2d758c5160f4a4f" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.951855 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bdd5b3bfa16a0ce99e20c7d1d91fc697fec5cf9c740c233f2d758c5160f4a4f"} err="failed to get container status \"6bdd5b3bfa16a0ce99e20c7d1d91fc697fec5cf9c740c233f2d758c5160f4a4f\": rpc error: code = NotFound desc = could not find container \"6bdd5b3bfa16a0ce99e20c7d1d91fc697fec5cf9c740c233f2d758c5160f4a4f\": container with ID starting with 6bdd5b3bfa16a0ce99e20c7d1d91fc697fec5cf9c740c233f2d758c5160f4a4f not found: ID does not exist" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.951893 4901 scope.go:117] "RemoveContainer" containerID="529437eb0b13b874bf49f83537b2fbd188b095ad9ba41f0929505508454b7345" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.960485 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5f478bb69-b9qdf"] Feb 02 10:42:20 crc kubenswrapper[4901]: E0202 10:42:20.960741 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e5bfe09-6c03-4b1b-af89-5928670990ef" containerName="controller-manager" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.960762 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e5bfe09-6c03-4b1b-af89-5928670990ef" containerName="controller-manager" Feb 02 10:42:20 crc kubenswrapper[4901]: E0202 10:42:20.960772 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbf52890-7a75-4b6d-a042-498725403221" containerName="route-controller-manager" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.960778 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbf52890-7a75-4b6d-a042-498725403221" containerName="route-controller-manager" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.960889 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e5bfe09-6c03-4b1b-af89-5928670990ef" containerName="controller-manager" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.960901 4901 
memory_manager.go:354] "RemoveStaleState removing state" podUID="fbf52890-7a75-4b6d-a042-498725403221" containerName="route-controller-manager" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.961411 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5f478bb69-b9qdf" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.965687 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.966240 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.967686 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.969183 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.970644 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.972408 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.973845 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78695c776c-gcgsz"] Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.974875 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/94c9b13c-502e-4894-9ca5-3b716039a879-proxy-ca-bundles\") pod \"controller-manager-5f478bb69-b9qdf\" (UID: \"94c9b13c-502e-4894-9ca5-3b716039a879\") " pod="openshift-controller-manager/controller-manager-5f478bb69-b9qdf" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.974976 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94c9b13c-502e-4894-9ca5-3b716039a879-serving-cert\") pod \"controller-manager-5f478bb69-b9qdf\" (UID: \"94c9b13c-502e-4894-9ca5-3b716039a879\") " pod="openshift-controller-manager/controller-manager-5f478bb69-b9qdf" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.974909 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78695c776c-gcgsz" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.975263 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94c9b13c-502e-4894-9ca5-3b716039a879-config\") pod \"controller-manager-5f478bb69-b9qdf\" (UID: \"94c9b13c-502e-4894-9ca5-3b716039a879\") " pod="openshift-controller-manager/controller-manager-5f478bb69-b9qdf" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.975345 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crd7l\" (UniqueName: \"kubernetes.io/projected/94c9b13c-502e-4894-9ca5-3b716039a879-kube-api-access-crd7l\") pod \"controller-manager-5f478bb69-b9qdf\" (UID: \"94c9b13c-502e-4894-9ca5-3b716039a879\") " pod="openshift-controller-manager/controller-manager-5f478bb69-b9qdf" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.975819 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/94c9b13c-502e-4894-9ca5-3b716039a879-client-ca\") pod \"controller-manager-5f478bb69-b9qdf\" (UID: \"94c9b13c-502e-4894-9ca5-3b716039a879\") " pod="openshift-controller-manager/controller-manager-5f478bb69-b9qdf" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.979453 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.980106 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.980543 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.982174 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.994194 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.994361 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 02 10:42:20 crc kubenswrapper[4901]: I0202 10:42:20.994718 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 02 10:42:21 crc kubenswrapper[4901]: I0202 10:42:21.002584 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78695c776c-gcgsz"] Feb 02 10:42:21 crc kubenswrapper[4901]: I0202 10:42:21.021737 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5f478bb69-b9qdf"] Feb 02 10:42:21 crc kubenswrapper[4901]: I0202 10:42:21.050266 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-748b9db64b-cl92q"] Feb 02 10:42:21 crc kubenswrapper[4901]: I0202 10:42:21.053836 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-748b9db64b-cl92q"] Feb 02 10:42:21 crc kubenswrapper[4901]: 
I0202 10:42:21.056330 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b8c66cf97-s9g7p"] Feb 02 10:42:21 crc kubenswrapper[4901]: I0202 10:42:21.059049 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b8c66cf97-s9g7p"] Feb 02 10:42:21 crc kubenswrapper[4901]: I0202 10:42:21.077821 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6e2a5b9-6daf-40a6-a15c-964c446aa784-serving-cert\") pod \"route-controller-manager-78695c776c-gcgsz\" (UID: \"a6e2a5b9-6daf-40a6-a15c-964c446aa784\") " pod="openshift-route-controller-manager/route-controller-manager-78695c776c-gcgsz" Feb 02 10:42:21 crc kubenswrapper[4901]: I0202 10:42:21.077899 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/94c9b13c-502e-4894-9ca5-3b716039a879-client-ca\") pod \"controller-manager-5f478bb69-b9qdf\" (UID: \"94c9b13c-502e-4894-9ca5-3b716039a879\") " pod="openshift-controller-manager/controller-manager-5f478bb69-b9qdf" Feb 02 10:42:21 crc kubenswrapper[4901]: I0202 10:42:21.077942 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/94c9b13c-502e-4894-9ca5-3b716039a879-proxy-ca-bundles\") pod \"controller-manager-5f478bb69-b9qdf\" (UID: \"94c9b13c-502e-4894-9ca5-3b716039a879\") " pod="openshift-controller-manager/controller-manager-5f478bb69-b9qdf" Feb 02 10:42:21 crc kubenswrapper[4901]: I0202 10:42:21.077963 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94c9b13c-502e-4894-9ca5-3b716039a879-serving-cert\") pod \"controller-manager-5f478bb69-b9qdf\" (UID: \"94c9b13c-502e-4894-9ca5-3b716039a879\") " pod="openshift-controller-manager/controller-manager-5f478bb69-b9qdf" Feb 02 10:42:21 crc kubenswrapper[4901]: I0202 10:42:21.078000 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s445\" (UniqueName: \"kubernetes.io/projected/a6e2a5b9-6daf-40a6-a15c-964c446aa784-kube-api-access-7s445\") pod \"route-controller-manager-78695c776c-gcgsz\" (UID: \"a6e2a5b9-6daf-40a6-a15c-964c446aa784\") " pod="openshift-route-controller-manager/route-controller-manager-78695c776c-gcgsz" Feb 02 10:42:21 crc kubenswrapper[4901]: I0202 10:42:21.078024 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a6e2a5b9-6daf-40a6-a15c-964c446aa784-client-ca\") pod \"route-controller-manager-78695c776c-gcgsz\" (UID: \"a6e2a5b9-6daf-40a6-a15c-964c446aa784\") " pod="openshift-route-controller-manager/route-controller-manager-78695c776c-gcgsz" Feb 02 10:42:21 crc kubenswrapper[4901]: I0202 10:42:21.078254 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94c9b13c-502e-4894-9ca5-3b716039a879-config\") pod \"controller-manager-5f478bb69-b9qdf\" (UID: \"94c9b13c-502e-4894-9ca5-3b716039a879\") " pod="openshift-controller-manager/controller-manager-5f478bb69-b9qdf" Feb 02 10:42:21 crc kubenswrapper[4901]: I0202 10:42:21.078277 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/a6e2a5b9-6daf-40a6-a15c-964c446aa784-config\") pod \"route-controller-manager-78695c776c-gcgsz\" (UID: \"a6e2a5b9-6daf-40a6-a15c-964c446aa784\") " pod="openshift-route-controller-manager/route-controller-manager-78695c776c-gcgsz" Feb 02 10:42:21 crc kubenswrapper[4901]: I0202 10:42:21.079745 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/94c9b13c-502e-4894-9ca5-3b716039a879-client-ca\") pod \"controller-manager-5f478bb69-b9qdf\" (UID: \"94c9b13c-502e-4894-9ca5-3b716039a879\") " pod="openshift-controller-manager/controller-manager-5f478bb69-b9qdf" Feb 02 10:42:21 crc kubenswrapper[4901]: I0202 10:42:21.079796 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94c9b13c-502e-4894-9ca5-3b716039a879-config\") pod \"controller-manager-5f478bb69-b9qdf\" (UID: \"94c9b13c-502e-4894-9ca5-3b716039a879\") " pod="openshift-controller-manager/controller-manager-5f478bb69-b9qdf" Feb 02 10:42:21 crc kubenswrapper[4901]: I0202 10:42:21.079873 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crd7l\" (UniqueName: \"kubernetes.io/projected/94c9b13c-502e-4894-9ca5-3b716039a879-kube-api-access-crd7l\") pod \"controller-manager-5f478bb69-b9qdf\" (UID: \"94c9b13c-502e-4894-9ca5-3b716039a879\") " pod="openshift-controller-manager/controller-manager-5f478bb69-b9qdf" Feb 02 10:42:21 crc kubenswrapper[4901]: I0202 10:42:21.080983 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/94c9b13c-502e-4894-9ca5-3b716039a879-proxy-ca-bundles\") pod \"controller-manager-5f478bb69-b9qdf\" (UID: \"94c9b13c-502e-4894-9ca5-3b716039a879\") " pod="openshift-controller-manager/controller-manager-5f478bb69-b9qdf" Feb 02 10:42:21 crc kubenswrapper[4901]: I0202 10:42:21.084690 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94c9b13c-502e-4894-9ca5-3b716039a879-serving-cert\") pod \"controller-manager-5f478bb69-b9qdf\" (UID: \"94c9b13c-502e-4894-9ca5-3b716039a879\") " pod="openshift-controller-manager/controller-manager-5f478bb69-b9qdf" Feb 02 10:42:21 crc kubenswrapper[4901]: I0202 10:42:21.102916 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crd7l\" (UniqueName: \"kubernetes.io/projected/94c9b13c-502e-4894-9ca5-3b716039a879-kube-api-access-crd7l\") pod \"controller-manager-5f478bb69-b9qdf\" (UID: \"94c9b13c-502e-4894-9ca5-3b716039a879\") " pod="openshift-controller-manager/controller-manager-5f478bb69-b9qdf" Feb 02 10:42:21 crc kubenswrapper[4901]: I0202 10:42:21.181850 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6e2a5b9-6daf-40a6-a15c-964c446aa784-config\") pod \"route-controller-manager-78695c776c-gcgsz\" (UID: \"a6e2a5b9-6daf-40a6-a15c-964c446aa784\") " pod="openshift-route-controller-manager/route-controller-manager-78695c776c-gcgsz" Feb 02 10:42:21 crc kubenswrapper[4901]: I0202 10:42:21.181925 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6e2a5b9-6daf-40a6-a15c-964c446aa784-serving-cert\") pod \"route-controller-manager-78695c776c-gcgsz\" (UID: \"a6e2a5b9-6daf-40a6-a15c-964c446aa784\") " 
pod="openshift-route-controller-manager/route-controller-manager-78695c776c-gcgsz" Feb 02 10:42:21 crc kubenswrapper[4901]: I0202 10:42:21.182034 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s445\" (UniqueName: \"kubernetes.io/projected/a6e2a5b9-6daf-40a6-a15c-964c446aa784-kube-api-access-7s445\") pod \"route-controller-manager-78695c776c-gcgsz\" (UID: \"a6e2a5b9-6daf-40a6-a15c-964c446aa784\") " pod="openshift-route-controller-manager/route-controller-manager-78695c776c-gcgsz" Feb 02 10:42:21 crc kubenswrapper[4901]: I0202 10:42:21.182058 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a6e2a5b9-6daf-40a6-a15c-964c446aa784-client-ca\") pod \"route-controller-manager-78695c776c-gcgsz\" (UID: \"a6e2a5b9-6daf-40a6-a15c-964c446aa784\") " pod="openshift-route-controller-manager/route-controller-manager-78695c776c-gcgsz" Feb 02 10:42:21 crc kubenswrapper[4901]: I0202 10:42:21.183400 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a6e2a5b9-6daf-40a6-a15c-964c446aa784-client-ca\") pod \"route-controller-manager-78695c776c-gcgsz\" (UID: \"a6e2a5b9-6daf-40a6-a15c-964c446aa784\") " pod="openshift-route-controller-manager/route-controller-manager-78695c776c-gcgsz" Feb 02 10:42:21 crc kubenswrapper[4901]: I0202 10:42:21.184549 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6e2a5b9-6daf-40a6-a15c-964c446aa784-config\") pod \"route-controller-manager-78695c776c-gcgsz\" (UID: \"a6e2a5b9-6daf-40a6-a15c-964c446aa784\") " pod="openshift-route-controller-manager/route-controller-manager-78695c776c-gcgsz" Feb 02 10:42:21 crc kubenswrapper[4901]: I0202 10:42:21.187557 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6e2a5b9-6daf-40a6-a15c-964c446aa784-serving-cert\") pod \"route-controller-manager-78695c776c-gcgsz\" (UID: \"a6e2a5b9-6daf-40a6-a15c-964c446aa784\") " pod="openshift-route-controller-manager/route-controller-manager-78695c776c-gcgsz" Feb 02 10:42:21 crc kubenswrapper[4901]: I0202 10:42:21.200957 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s445\" (UniqueName: \"kubernetes.io/projected/a6e2a5b9-6daf-40a6-a15c-964c446aa784-kube-api-access-7s445\") pod \"route-controller-manager-78695c776c-gcgsz\" (UID: \"a6e2a5b9-6daf-40a6-a15c-964c446aa784\") " pod="openshift-route-controller-manager/route-controller-manager-78695c776c-gcgsz" Feb 02 10:42:21 crc kubenswrapper[4901]: I0202 10:42:21.301751 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5f478bb69-b9qdf" Feb 02 10:42:21 crc kubenswrapper[4901]: I0202 10:42:21.312784 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78695c776c-gcgsz" Feb 02 10:42:21 crc kubenswrapper[4901]: I0202 10:42:21.685390 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e5bfe09-6c03-4b1b-af89-5928670990ef" path="/var/lib/kubelet/pods/2e5bfe09-6c03-4b1b-af89-5928670990ef/volumes" Feb 02 10:42:21 crc kubenswrapper[4901]: I0202 10:42:21.686818 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbf52890-7a75-4b6d-a042-498725403221" path="/var/lib/kubelet/pods/fbf52890-7a75-4b6d-a042-498725403221/volumes" Feb 02 10:42:21 crc kubenswrapper[4901]: I0202 10:42:21.809734 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5f478bb69-b9qdf"] Feb 02 10:42:21 crc kubenswrapper[4901]: I0202 10:42:21.859314 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78695c776c-gcgsz"] Feb 02 10:42:21 crc kubenswrapper[4901]: W0202 10:42:21.869847 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6e2a5b9_6daf_40a6_a15c_964c446aa784.slice/crio-4825eec493a962aed92467c39b1b0662349f5f0a20ad8ef59b6617d544d069dd WatchSource:0}: Error finding container 4825eec493a962aed92467c39b1b0662349f5f0a20ad8ef59b6617d544d069dd: Status 404 returned error can't find the container with id 4825eec493a962aed92467c39b1b0662349f5f0a20ad8ef59b6617d544d069dd Feb 02 10:42:21 crc kubenswrapper[4901]: I0202 10:42:21.946446 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-78695c776c-gcgsz" event={"ID":"a6e2a5b9-6daf-40a6-a15c-964c446aa784","Type":"ContainerStarted","Data":"4825eec493a962aed92467c39b1b0662349f5f0a20ad8ef59b6617d544d069dd"} Feb 02 10:42:21 crc kubenswrapper[4901]: I0202 10:42:21.955349 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f478bb69-b9qdf" event={"ID":"94c9b13c-502e-4894-9ca5-3b716039a879","Type":"ContainerStarted","Data":"eabb3bbaddeb89a23e46636ee3c5a6775e2794b7e1aa959f0c93cb97283792e1"} Feb 02 10:42:22 crc kubenswrapper[4901]: I0202 10:42:22.968613 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f478bb69-b9qdf" event={"ID":"94c9b13c-502e-4894-9ca5-3b716039a879","Type":"ContainerStarted","Data":"f4921fccf3c2119ae7dedbf85da4ca5dfe8afee358dfbf887567f7bb21574590"} Feb 02 10:42:22 crc kubenswrapper[4901]: I0202 10:42:22.969122 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5f478bb69-b9qdf" Feb 02 10:42:22 crc kubenswrapper[4901]: I0202 10:42:22.971085 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-78695c776c-gcgsz" event={"ID":"a6e2a5b9-6daf-40a6-a15c-964c446aa784","Type":"ContainerStarted","Data":"b404fa4ce83360ee5e6ecaf38830fe928e10560f9a09bebf36a71a4cfb7871d7"} Feb 02 10:42:22 crc kubenswrapper[4901]: I0202 10:42:22.971386 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-78695c776c-gcgsz" Feb 02 10:42:22 crc kubenswrapper[4901]: I0202 10:42:22.976152 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5f478bb69-b9qdf" Feb 02 
Feb 02 10:42:22 crc kubenswrapper[4901]: I0202 10:42:22.978255 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-78695c776c-gcgsz" Feb 02 10:42:22 crc kubenswrapper[4901]: I0202 10:42:22.995372 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5f478bb69-b9qdf" podStartSLOduration=3.9953550399999997 podStartE2EDuration="3.99535504s" podCreationTimestamp="2026-02-02 10:42:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:42:22.993160807 +0000 UTC m=+230.011500903" watchObservedRunningTime="2026-02-02 10:42:22.99535504 +0000 UTC m=+230.013695136" Feb 02 10:42:23 crc kubenswrapper[4901]: I0202 10:42:23.031151 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-78695c776c-gcgsz" podStartSLOduration=4.03111549 podStartE2EDuration="4.03111549s" podCreationTimestamp="2026-02-02 10:42:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:42:23.027080804 +0000 UTC m=+230.045420940" watchObservedRunningTime="2026-02-02 10:42:23.03111549 +0000 UTC m=+230.049455626"
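The two "Observed pod startup duration" records above carry their own arithmetic: podStartSLOduration equals watchObservedRunningTime minus podCreationTimestamp, e.g. 10:42:22.99535504 minus 10:42:19 = 3.99535504s for the new controller-manager, and likewise 29.953190534s for the oauth-openshift pod earlier. The zero-valued firstStartedPulling/lastFinishedPulling timestamps (0001-01-01) indicate that no image pull was recorded for these starts. A quick check in Go:

    package main

    import (
        "fmt"
        "time"
    )

    // Re-derive podStartSLOduration from the latency-tracker record above:
    // watchObservedRunningTime minus podCreationTimestamp. Go's parser accepts
    // the optional fractional seconds even though the layout omits them.
    func main() {
        const layout = "2006-01-02 15:04:05 -0700 MST"
        created, err := time.Parse(layout, "2026-02-02 10:42:19 +0000 UTC")
        if err != nil {
            panic(err)
        }
        observed, err := time.Parse(layout, "2026-02-02 10:42:22.99535504 +0000 UTC")
        if err != nil {
            panic(err)
        }
        fmt.Println(observed.Sub(created)) // prints 3.99535504s, matching the log
    }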
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.394512 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b" gracePeriod=15 Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.394667 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096" gracePeriod=15 Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.394680 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e" gracePeriod=15 Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.394641 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0" gracePeriod=15 Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.394846 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4" gracePeriod=15 Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.395182 4901 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 02 10:42:31 crc kubenswrapper[4901]: E0202 10:42:31.395385 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.395397 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 02 10:42:31 crc kubenswrapper[4901]: E0202 10:42:31.395412 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.395419 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 02 10:42:31 crc kubenswrapper[4901]: E0202 10:42:31.395434 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.395441 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 02 10:42:31 crc kubenswrapper[4901]: E0202 10:42:31.395450 4901 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.395456 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 02 10:42:31 crc kubenswrapper[4901]: E0202 10:42:31.395469 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.395477 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 02 10:42:31 crc kubenswrapper[4901]: E0202 10:42:31.396018 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.396035 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 02 10:42:31 crc kubenswrapper[4901]: E0202 10:42:31.396045 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.396054 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.402133 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.402533 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.402559 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.402638 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.402669 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.402688 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.493538 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.538358 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.538414 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.538460 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.538494 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.538529 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.538615 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.538652 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.539022 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.644305 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.644375 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.644402 4901 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.644444 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.644473 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.644497 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.644492 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.644578 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.644605 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.644583 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.644622 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.644665 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.644743 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.644790 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.644898 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.644922 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:42:31 crc kubenswrapper[4901]: I0202 10:42:31.766428 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:42:31 crc kubenswrapper[4901]: W0202 10:42:31.791875 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-66ed6a24e139092031a88ca91d7b66e61c5784328d611d654618290113f0a67e WatchSource:0}: Error finding container 66ed6a24e139092031a88ca91d7b66e61c5784328d611d654618290113f0a67e: Status 404 returned error can't find the container with id 66ed6a24e139092031a88ca91d7b66e61c5784328d611d654618290113f0a67e Feb 02 10:42:31 crc kubenswrapper[4901]: E0202 10:42:31.796609 4901 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.227:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189067f421e879bb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-02 10:42:31.795169723 +0000 UTC m=+238.813509819,LastTimestamp:2026-02-02 10:42:31.795169723 +0000 UTC m=+238.813509819,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 02 10:42:32 crc kubenswrapper[4901]: I0202 10:42:32.023816 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"66ed6a24e139092031a88ca91d7b66e61c5784328d611d654618290113f0a67e"} Feb 02 10:42:32 crc kubenswrapper[4901]: I0202 10:42:32.027516 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 02 10:42:32 crc kubenswrapper[4901]: I0202 10:42:32.028706 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 02 10:42:32 crc kubenswrapper[4901]: I0202 10:42:32.029319 4901 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096" exitCode=0 Feb 02 10:42:32 crc kubenswrapper[4901]: I0202 10:42:32.029344 4901 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4" exitCode=0 Feb 02 10:42:32 crc kubenswrapper[4901]: I0202 10:42:32.029352 4901 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e" exitCode=0 Feb 02 10:42:32 crc kubenswrapper[4901]: I0202 10:42:32.029359 4901 generic.go:334] "Generic (PLEG): container finished" 
podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0" exitCode=2 Feb 02 10:42:32 crc kubenswrapper[4901]: I0202 10:42:32.029408 4901 scope.go:117] "RemoveContainer" containerID="ffc0687911a1d2830f37c118e98e6a87851e11f72570dde89bd24704e41ee595" Feb 02 10:42:32 crc kubenswrapper[4901]: I0202 10:42:32.031781 4901 generic.go:334] "Generic (PLEG): container finished" podID="47f5fbec-a621-4a18-94c2-86e646bcc88a" containerID="b36a19c3a6e9dce3de0448dc9d2e3801f5b48183c6d5bd0e93f5ce9a0218660e" exitCode=0 Feb 02 10:42:32 crc kubenswrapper[4901]: I0202 10:42:32.031818 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"47f5fbec-a621-4a18-94c2-86e646bcc88a","Type":"ContainerDied","Data":"b36a19c3a6e9dce3de0448dc9d2e3801f5b48183c6d5bd0e93f5ce9a0218660e"} Feb 02 10:42:32 crc kubenswrapper[4901]: I0202 10:42:32.032522 4901 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Feb 02 10:42:32 crc kubenswrapper[4901]: I0202 10:42:32.034051 4901 status_manager.go:851] "Failed to get status for pod" podUID="47f5fbec-a621-4a18-94c2-86e646bcc88a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Feb 02 10:42:33 crc kubenswrapper[4901]: I0202 10:42:33.040717 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 02 10:42:33 crc kubenswrapper[4901]: I0202 10:42:33.044821 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"f7350befb8da6c113e415c485278b4cd1e760a8daa95bcc96d6b590979a3ab61"} Feb 02 10:42:33 crc kubenswrapper[4901]: I0202 10:42:33.045047 4901 status_manager.go:851] "Failed to get status for pod" podUID="47f5fbec-a621-4a18-94c2-86e646bcc88a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Feb 02 10:42:33 crc kubenswrapper[4901]: I0202 10:42:33.045273 4901 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Feb 02 10:42:33 crc kubenswrapper[4901]: I0202 10:42:33.347513 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 02 10:42:33 crc kubenswrapper[4901]: I0202 10:42:33.348869 4901 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Feb 02 10:42:33 crc kubenswrapper[4901]: I0202 10:42:33.349304 4901 status_manager.go:851] "Failed to get status for pod" podUID="47f5fbec-a621-4a18-94c2-86e646bcc88a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Feb 02 10:42:33 crc kubenswrapper[4901]: I0202 10:42:33.368287 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/47f5fbec-a621-4a18-94c2-86e646bcc88a-kube-api-access\") pod \"47f5fbec-a621-4a18-94c2-86e646bcc88a\" (UID: \"47f5fbec-a621-4a18-94c2-86e646bcc88a\") " Feb 02 10:42:33 crc kubenswrapper[4901]: I0202 10:42:33.368350 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/47f5fbec-a621-4a18-94c2-86e646bcc88a-kubelet-dir\") pod \"47f5fbec-a621-4a18-94c2-86e646bcc88a\" (UID: \"47f5fbec-a621-4a18-94c2-86e646bcc88a\") " Feb 02 10:42:33 crc kubenswrapper[4901]: I0202 10:42:33.368415 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/47f5fbec-a621-4a18-94c2-86e646bcc88a-var-lock\") pod \"47f5fbec-a621-4a18-94c2-86e646bcc88a\" (UID: \"47f5fbec-a621-4a18-94c2-86e646bcc88a\") " Feb 02 10:42:33 crc kubenswrapper[4901]: I0202 10:42:33.368537 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47f5fbec-a621-4a18-94c2-86e646bcc88a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "47f5fbec-a621-4a18-94c2-86e646bcc88a" (UID: "47f5fbec-a621-4a18-94c2-86e646bcc88a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:42:33 crc kubenswrapper[4901]: I0202 10:42:33.368633 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47f5fbec-a621-4a18-94c2-86e646bcc88a-var-lock" (OuterVolumeSpecName: "var-lock") pod "47f5fbec-a621-4a18-94c2-86e646bcc88a" (UID: "47f5fbec-a621-4a18-94c2-86e646bcc88a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:42:33 crc kubenswrapper[4901]: I0202 10:42:33.368756 4901 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/47f5fbec-a621-4a18-94c2-86e646bcc88a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 02 10:42:33 crc kubenswrapper[4901]: I0202 10:42:33.368767 4901 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/47f5fbec-a621-4a18-94c2-86e646bcc88a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 02 10:42:33 crc kubenswrapper[4901]: I0202 10:42:33.373841 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47f5fbec-a621-4a18-94c2-86e646bcc88a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "47f5fbec-a621-4a18-94c2-86e646bcc88a" (UID: "47f5fbec-a621-4a18-94c2-86e646bcc88a"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:42:33 crc kubenswrapper[4901]: I0202 10:42:33.469982 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/47f5fbec-a621-4a18-94c2-86e646bcc88a-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 02 10:42:33 crc kubenswrapper[4901]: E0202 10:42:33.496442 4901 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.227:6443: connect: connection refused" Feb 02 10:42:33 crc kubenswrapper[4901]: E0202 10:42:33.496689 4901 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.227:6443: connect: connection refused" Feb 02 10:42:33 crc kubenswrapper[4901]: E0202 10:42:33.496905 4901 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.227:6443: connect: connection refused" Feb 02 10:42:33 crc kubenswrapper[4901]: E0202 10:42:33.497282 4901 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.227:6443: connect: connection refused" Feb 02 10:42:33 crc kubenswrapper[4901]: E0202 10:42:33.497635 4901 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.227:6443: connect: connection refused" Feb 02 10:42:33 crc kubenswrapper[4901]: I0202 10:42:33.497663 4901 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 02 10:42:33 crc kubenswrapper[4901]: E0202 10:42:33.497933 4901 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.227:6443: connect: connection refused" interval="200ms" Feb 02 10:42:33 crc kubenswrapper[4901]: E0202 10:42:33.602090 4901 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.227:6443: 
connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189067f421e879bb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-02 10:42:31.795169723 +0000 UTC m=+238.813509819,LastTimestamp:2026-02-02 10:42:31.795169723 +0000 UTC m=+238.813509819,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 02 10:42:33 crc kubenswrapper[4901]: I0202 10:42:33.678416 4901 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Feb 02 10:42:33 crc kubenswrapper[4901]: I0202 10:42:33.678646 4901 status_manager.go:851] "Failed to get status for pod" podUID="47f5fbec-a621-4a18-94c2-86e646bcc88a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Feb 02 10:42:33 crc kubenswrapper[4901]: E0202 10:42:33.698864 4901 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.227:6443: connect: connection refused" interval="400ms" Feb 02 10:42:33 crc kubenswrapper[4901]: I0202 10:42:33.800390 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 02 10:42:33 crc kubenswrapper[4901]: I0202 10:42:33.801159 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:42:33 crc kubenswrapper[4901]: I0202 10:42:33.802335 4901 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Feb 02 10:42:33 crc kubenswrapper[4901]: I0202 10:42:33.802793 4901 status_manager.go:851] "Failed to get status for pod" podUID="47f5fbec-a621-4a18-94c2-86e646bcc88a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Feb 02 10:42:33 crc kubenswrapper[4901]: I0202 10:42:33.803051 4901 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Feb 02 10:42:33 crc kubenswrapper[4901]: I0202 10:42:33.873349 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 02 10:42:33 crc kubenswrapper[4901]: I0202 10:42:33.873425 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 02 10:42:33 crc kubenswrapper[4901]: I0202 10:42:33.873458 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:42:33 crc kubenswrapper[4901]: I0202 10:42:33.873493 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 02 10:42:33 crc kubenswrapper[4901]: I0202 10:42:33.873498 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:42:33 crc kubenswrapper[4901]: I0202 10:42:33.873602 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:42:33 crc kubenswrapper[4901]: I0202 10:42:33.873783 4901 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 02 10:42:33 crc kubenswrapper[4901]: I0202 10:42:33.873797 4901 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 02 10:42:33 crc kubenswrapper[4901]: I0202 10:42:33.873805 4901 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 02 10:42:34 crc kubenswrapper[4901]: I0202 10:42:34.051850 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 02 10:42:34 crc kubenswrapper[4901]: I0202 10:42:34.051853 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"47f5fbec-a621-4a18-94c2-86e646bcc88a","Type":"ContainerDied","Data":"95006e8bf3835bed01b5533e215f04200be74eac9b4cdec23711879a69c905ae"} Feb 02 10:42:34 crc kubenswrapper[4901]: I0202 10:42:34.051945 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95006e8bf3835bed01b5533e215f04200be74eac9b4cdec23711879a69c905ae" Feb 02 10:42:34 crc kubenswrapper[4901]: I0202 10:42:34.056702 4901 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Feb 02 10:42:34 crc kubenswrapper[4901]: I0202 10:42:34.057179 4901 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Feb 02 10:42:34 crc kubenswrapper[4901]: I0202 10:42:34.057699 4901 status_manager.go:851] "Failed to get status for pod" podUID="47f5fbec-a621-4a18-94c2-86e646bcc88a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Feb 02 10:42:34 crc kubenswrapper[4901]: I0202 10:42:34.058442 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 02 10:42:34 crc kubenswrapper[4901]: I0202 10:42:34.059338 4901 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b" exitCode=0 Feb 02 10:42:34 crc kubenswrapper[4901]: I0202 10:42:34.059444 4901 scope.go:117] "RemoveContainer" containerID="e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096" Feb 02 10:42:34 crc kubenswrapper[4901]: I0202 10:42:34.059760 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:42:34 crc kubenswrapper[4901]: I0202 10:42:34.079284 4901 scope.go:117] "RemoveContainer" containerID="136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4" Feb 02 10:42:34 crc kubenswrapper[4901]: I0202 10:42:34.094516 4901 status_manager.go:851] "Failed to get status for pod" podUID="47f5fbec-a621-4a18-94c2-86e646bcc88a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Feb 02 10:42:34 crc kubenswrapper[4901]: I0202 10:42:34.094858 4901 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Feb 02 10:42:34 crc kubenswrapper[4901]: I0202 10:42:34.095169 4901 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Feb 02 10:42:34 crc kubenswrapper[4901]: E0202 10:42:34.100136 4901 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.227:6443: connect: connection refused" interval="800ms" Feb 02 10:42:34 crc kubenswrapper[4901]: I0202 10:42:34.105973 4901 scope.go:117] "RemoveContainer" containerID="d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e" Feb 02 10:42:34 crc kubenswrapper[4901]: I0202 10:42:34.120855 4901 scope.go:117] "RemoveContainer" containerID="7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0" Feb 02 10:42:34 crc kubenswrapper[4901]: I0202 10:42:34.148583 4901 scope.go:117] "RemoveContainer" containerID="acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b" Feb 02 10:42:34 crc kubenswrapper[4901]: I0202 10:42:34.166293 4901 scope.go:117] "RemoveContainer" containerID="fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7" Feb 02 10:42:34 crc kubenswrapper[4901]: I0202 10:42:34.189823 4901 scope.go:117] "RemoveContainer" containerID="e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096" Feb 02 10:42:34 crc kubenswrapper[4901]: E0202 10:42:34.190324 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096\": container with ID starting with e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096 not found: ID does not exist" containerID="e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096" Feb 02 10:42:34 crc kubenswrapper[4901]: I0202 10:42:34.190388 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096"} err="failed to get container status \"e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096\": rpc error: code = NotFound desc = could not find container 
\"e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096\": container with ID starting with e272bcf010df3ea76723036d6a0af40acd55c73d368befa33c43ea0c2985c096 not found: ID does not exist" Feb 02 10:42:34 crc kubenswrapper[4901]: I0202 10:42:34.190418 4901 scope.go:117] "RemoveContainer" containerID="136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4" Feb 02 10:42:34 crc kubenswrapper[4901]: E0202 10:42:34.190884 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\": container with ID starting with 136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4 not found: ID does not exist" containerID="136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4" Feb 02 10:42:34 crc kubenswrapper[4901]: I0202 10:42:34.190923 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4"} err="failed to get container status \"136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\": rpc error: code = NotFound desc = could not find container \"136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4\": container with ID starting with 136b624ddd2b8b8863e1661b80adbf21bec37586739ab44e84a91de7545fcda4 not found: ID does not exist" Feb 02 10:42:34 crc kubenswrapper[4901]: I0202 10:42:34.190950 4901 scope.go:117] "RemoveContainer" containerID="d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e" Feb 02 10:42:34 crc kubenswrapper[4901]: E0202 10:42:34.191337 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\": container with ID starting with d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e not found: ID does not exist" containerID="d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e" Feb 02 10:42:34 crc kubenswrapper[4901]: I0202 10:42:34.191375 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e"} err="failed to get container status \"d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\": rpc error: code = NotFound desc = could not find container \"d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e\": container with ID starting with d9171816db2dbdcc7e0e27cdb5c07680fbbdf3829ed50a0f07fec74e60cc192e not found: ID does not exist" Feb 02 10:42:34 crc kubenswrapper[4901]: I0202 10:42:34.191389 4901 scope.go:117] "RemoveContainer" containerID="7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0" Feb 02 10:42:34 crc kubenswrapper[4901]: E0202 10:42:34.194367 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0\": container with ID starting with 7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0 not found: ID does not exist" containerID="7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0" Feb 02 10:42:34 crc kubenswrapper[4901]: I0202 10:42:34.194414 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0"} 
err="failed to get container status \"7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0\": rpc error: code = NotFound desc = could not find container \"7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0\": container with ID starting with 7f36c12056892626dc879f2c9e6953fb49c19bf5b560a3b73ec62fa4040612d0 not found: ID does not exist" Feb 02 10:42:34 crc kubenswrapper[4901]: I0202 10:42:34.194431 4901 scope.go:117] "RemoveContainer" containerID="acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b" Feb 02 10:42:34 crc kubenswrapper[4901]: E0202 10:42:34.195072 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\": container with ID starting with acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b not found: ID does not exist" containerID="acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b" Feb 02 10:42:34 crc kubenswrapper[4901]: I0202 10:42:34.195111 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b"} err="failed to get container status \"acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\": rpc error: code = NotFound desc = could not find container \"acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b\": container with ID starting with acf3778615c274568105cea7da893b5514fa6ab1f05dc3848f8d42219363d86b not found: ID does not exist" Feb 02 10:42:34 crc kubenswrapper[4901]: I0202 10:42:34.195127 4901 scope.go:117] "RemoveContainer" containerID="fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7" Feb 02 10:42:34 crc kubenswrapper[4901]: E0202 10:42:34.195552 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\": container with ID starting with fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7 not found: ID does not exist" containerID="fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7" Feb 02 10:42:34 crc kubenswrapper[4901]: I0202 10:42:34.195645 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7"} err="failed to get container status \"fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\": rpc error: code = NotFound desc = could not find container \"fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7\": container with ID starting with fab71ba7a59542c87d4a28f2d9ed7ddc94660e5537e7e9475922038f1489f7b7 not found: ID does not exist" Feb 02 10:42:34 crc kubenswrapper[4901]: E0202 10:42:34.900748 4901 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.227:6443: connect: connection refused" interval="1.6s" Feb 02 10:42:35 crc kubenswrapper[4901]: I0202 10:42:35.689057 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 02 10:42:36 crc kubenswrapper[4901]: E0202 10:42:36.501333 4901 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.227:6443: connect: connection refused" interval="3.2s" Feb 02 10:42:39 crc kubenswrapper[4901]: E0202 10:42:39.704380 4901 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.227:6443: connect: connection refused" interval="6.4s" Feb 02 10:42:43 crc kubenswrapper[4901]: E0202 10:42:43.604512 4901 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.227:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189067f421e879bb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-02 10:42:31.795169723 +0000 UTC m=+238.813509819,LastTimestamp:2026-02-02 10:42:31.795169723 +0000 UTC m=+238.813509819,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 02 10:42:43 crc kubenswrapper[4901]: I0202 10:42:43.684213 4901 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Feb 02 10:42:43 crc kubenswrapper[4901]: I0202 10:42:43.684859 4901 status_manager.go:851] "Failed to get status for pod" podUID="47f5fbec-a621-4a18-94c2-86e646bcc88a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Feb 02 10:42:44 crc kubenswrapper[4901]: I0202 10:42:44.133843 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 02 10:42:44 crc kubenswrapper[4901]: I0202 10:42:44.134741 4901 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a" exitCode=1 Feb 02 10:42:44 crc kubenswrapper[4901]: I0202 10:42:44.134853 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a"} Feb 02 10:42:44 crc kubenswrapper[4901]: I0202 10:42:44.135833 4901 scope.go:117] "RemoveContainer" containerID="0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a" Feb 02 
10:42:44 crc kubenswrapper[4901]: I0202 10:42:44.136219 4901 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Feb 02 10:42:44 crc kubenswrapper[4901]: I0202 10:42:44.136853 4901 status_manager.go:851] "Failed to get status for pod" podUID="47f5fbec-a621-4a18-94c2-86e646bcc88a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Feb 02 10:42:44 crc kubenswrapper[4901]: I0202 10:42:44.137554 4901 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Feb 02 10:42:45 crc kubenswrapper[4901]: I0202 10:42:45.156130 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 02 10:42:45 crc kubenswrapper[4901]: I0202 10:42:45.156257 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"99d645644c1a238c6c9d95798e86c864d2db3934455a3866bc6175099a6fce54"} Feb 02 10:42:45 crc kubenswrapper[4901]: I0202 10:42:45.157970 4901 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Feb 02 10:42:45 crc kubenswrapper[4901]: I0202 10:42:45.158504 4901 status_manager.go:851] "Failed to get status for pod" podUID="47f5fbec-a621-4a18-94c2-86e646bcc88a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Feb 02 10:42:45 crc kubenswrapper[4901]: I0202 10:42:45.159132 4901 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Feb 02 10:42:45 crc kubenswrapper[4901]: I0202 10:42:45.676355 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:42:45 crc kubenswrapper[4901]: I0202 10:42:45.680280 4901 status_manager.go:851] "Failed to get status for pod" podUID="47f5fbec-a621-4a18-94c2-86e646bcc88a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Feb 02 10:42:45 crc kubenswrapper[4901]: I0202 10:42:45.680959 4901 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Feb 02 10:42:45 crc kubenswrapper[4901]: I0202 10:42:45.681719 4901 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Feb 02 10:42:45 crc kubenswrapper[4901]: I0202 10:42:45.697373 4901 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9a1308eb-aead-41f7-b2b1-c92d80970304" Feb 02 10:42:45 crc kubenswrapper[4901]: I0202 10:42:45.697414 4901 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9a1308eb-aead-41f7-b2b1-c92d80970304" Feb 02 10:42:45 crc kubenswrapper[4901]: E0202 10:42:45.697930 4901 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:42:45 crc kubenswrapper[4901]: I0202 10:42:45.698546 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:42:45 crc kubenswrapper[4901]: W0202 10:42:45.726304 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-fc40ac183443f8a781c11cb74b845480a2925e16309522f18a90ad6f50d6ee1b WatchSource:0}: Error finding container fc40ac183443f8a781c11cb74b845480a2925e16309522f18a90ad6f50d6ee1b: Status 404 returned error can't find the container with id fc40ac183443f8a781c11cb74b845480a2925e16309522f18a90ad6f50d6ee1b Feb 02 10:42:46 crc kubenswrapper[4901]: E0202 10:42:46.105032 4901 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.227:6443: connect: connection refused" interval="7s" Feb 02 10:42:46 crc kubenswrapper[4901]: I0202 10:42:46.164663 4901 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="4a945c9654a46ca876e9966c8639930cbea6710119a09df2816e373434c36b13" exitCode=0 Feb 02 10:42:46 crc kubenswrapper[4901]: I0202 10:42:46.164718 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"4a945c9654a46ca876e9966c8639930cbea6710119a09df2816e373434c36b13"} Feb 02 10:42:46 crc kubenswrapper[4901]: I0202 10:42:46.164761 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"fc40ac183443f8a781c11cb74b845480a2925e16309522f18a90ad6f50d6ee1b"} Feb 02 10:42:46 crc kubenswrapper[4901]: I0202 10:42:46.165052 4901 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9a1308eb-aead-41f7-b2b1-c92d80970304" Feb 02 10:42:46 crc kubenswrapper[4901]: I0202 10:42:46.165068 4901 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9a1308eb-aead-41f7-b2b1-c92d80970304" Feb 02 10:42:46 crc kubenswrapper[4901]: I0202 10:42:46.165679 4901 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Feb 02 10:42:46 crc kubenswrapper[4901]: E0202 10:42:46.165690 4901 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:42:46 crc kubenswrapper[4901]: I0202 10:42:46.165855 4901 status_manager.go:851] "Failed to get status for pod" podUID="47f5fbec-a621-4a18-94c2-86e646bcc88a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Feb 02 10:42:46 crc kubenswrapper[4901]: I0202 10:42:46.166025 4901 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.227:6443: connect: connection refused" Feb 02 10:42:47 crc kubenswrapper[4901]: I0202 10:42:47.172663 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c4646d586d72ee2a4604fe4a2527c1ecb1a0fc15b20af2bca2f3041f7dfe58e7"} Feb 02 10:42:47 crc kubenswrapper[4901]: I0202 10:42:47.172702 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c07131059671bc1372eaa165238d3acc4db9df944819249b177909ee1516559f"} Feb 02 10:42:47 crc kubenswrapper[4901]: I0202 10:42:47.172713 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0efecae7bfd9a2ffb6c1ee3d6d9d9b41c70a74579fbfc10ae9153fe6c54f13d2"} Feb 02 10:42:47 crc kubenswrapper[4901]: I0202 10:42:47.172722 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6e94643117161a626c41e614c6ee65623fa7000bbed9bdda5e70485d9e0be59c"} Feb 02 10:42:48 crc kubenswrapper[4901]: I0202 10:42:48.180889 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3539c790f55cff08cbe3993f9fe4a8c5a7ff60f9955ff16588a80789743a76fd"} Feb 02 10:42:48 crc kubenswrapper[4901]: I0202 10:42:48.181252 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:42:48 crc kubenswrapper[4901]: I0202 10:42:48.181152 4901 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9a1308eb-aead-41f7-b2b1-c92d80970304" Feb 02 10:42:48 crc kubenswrapper[4901]: I0202 10:42:48.181277 4901 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9a1308eb-aead-41f7-b2b1-c92d80970304" Feb 02 10:42:50 crc kubenswrapper[4901]: I0202 10:42:50.284533 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:42:50 crc kubenswrapper[4901]: I0202 10:42:50.284763 4901 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 02 10:42:50 crc kubenswrapper[4901]: I0202 10:42:50.286066 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 02 10:42:50 crc kubenswrapper[4901]: I0202 10:42:50.699139 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:42:50 crc kubenswrapper[4901]: I0202 10:42:50.699226 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:42:50 crc kubenswrapper[4901]: I0202 10:42:50.703803 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:42:52 crc kubenswrapper[4901]: I0202 10:42:52.277758 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:42:53 crc kubenswrapper[4901]: I0202 10:42:53.190246 4901 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:42:53 crc kubenswrapper[4901]: I0202 10:42:53.211950 4901 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9a1308eb-aead-41f7-b2b1-c92d80970304" Feb 02 10:42:53 crc kubenswrapper[4901]: I0202 10:42:53.211980 4901 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9a1308eb-aead-41f7-b2b1-c92d80970304" Feb 02 10:42:53 crc kubenswrapper[4901]: I0202 10:42:53.215537 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:42:53 crc kubenswrapper[4901]: I0202 10:42:53.694475 4901 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="79fa1a9b-f6a8-40eb-a542-af9631314316" Feb 02 10:42:54 crc kubenswrapper[4901]: I0202 10:42:54.216886 4901 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9a1308eb-aead-41f7-b2b1-c92d80970304" Feb 02 10:42:54 crc kubenswrapper[4901]: I0202 10:42:54.216916 4901 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="9a1308eb-aead-41f7-b2b1-c92d80970304" Feb 02 10:42:54 crc kubenswrapper[4901]: I0202 10:42:54.220499 4901 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="79fa1a9b-f6a8-40eb-a542-af9631314316" Feb 02 10:43:00 crc kubenswrapper[4901]: I0202 10:43:00.284858 4901 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 02 10:43:00 crc kubenswrapper[4901]: I0202 10:43:00.285460 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 02 10:43:03 crc kubenswrapper[4901]: I0202 10:43:03.209969 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 02 10:43:03 crc kubenswrapper[4901]: I0202 10:43:03.518524 4901 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 02 10:43:03 crc 
kubenswrapper[4901]: I0202 10:43:03.899762 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 02 10:43:04 crc kubenswrapper[4901]: I0202 10:43:04.189768 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 02 10:43:04 crc kubenswrapper[4901]: I0202 10:43:04.224103 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 02 10:43:04 crc kubenswrapper[4901]: I0202 10:43:04.256409 4901 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 02 10:43:04 crc kubenswrapper[4901]: I0202 10:43:04.258695 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 02 10:43:04 crc kubenswrapper[4901]: I0202 10:43:04.285878 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 02 10:43:04 crc kubenswrapper[4901]: I0202 10:43:04.433169 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 02 10:43:04 crc kubenswrapper[4901]: I0202 10:43:04.468122 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 02 10:43:04 crc kubenswrapper[4901]: I0202 10:43:04.672394 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 02 10:43:04 crc kubenswrapper[4901]: I0202 10:43:04.676626 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 02 10:43:04 crc kubenswrapper[4901]: I0202 10:43:04.688423 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 02 10:43:04 crc kubenswrapper[4901]: I0202 10:43:04.725273 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 02 10:43:04 crc kubenswrapper[4901]: I0202 10:43:04.899642 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 02 10:43:04 crc kubenswrapper[4901]: I0202 10:43:04.928364 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 02 10:43:04 crc kubenswrapper[4901]: I0202 10:43:04.959801 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 02 10:43:04 crc kubenswrapper[4901]: I0202 10:43:04.975262 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 02 10:43:05 crc kubenswrapper[4901]: I0202 10:43:05.030513 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 02 10:43:05 crc kubenswrapper[4901]: I0202 10:43:05.056703 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 02 10:43:05 crc kubenswrapper[4901]: I0202 10:43:05.085622 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 02 10:43:05 crc kubenswrapper[4901]: I0202 
10:43:05.133138 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 02 10:43:05 crc kubenswrapper[4901]: I0202 10:43:05.219591 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 02 10:43:05 crc kubenswrapper[4901]: I0202 10:43:05.224797 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 02 10:43:05 crc kubenswrapper[4901]: I0202 10:43:05.309530 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 02 10:43:05 crc kubenswrapper[4901]: I0202 10:43:05.513757 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 02 10:43:05 crc kubenswrapper[4901]: I0202 10:43:05.553860 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 02 10:43:05 crc kubenswrapper[4901]: I0202 10:43:05.613286 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 02 10:43:05 crc kubenswrapper[4901]: I0202 10:43:05.987393 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 02 10:43:06 crc kubenswrapper[4901]: I0202 10:43:06.048352 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 02 10:43:06 crc kubenswrapper[4901]: I0202 10:43:06.078880 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 02 10:43:06 crc kubenswrapper[4901]: I0202 10:43:06.109815 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 02 10:43:06 crc kubenswrapper[4901]: I0202 10:43:06.114480 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 02 10:43:06 crc kubenswrapper[4901]: I0202 10:43:06.184078 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 02 10:43:06 crc kubenswrapper[4901]: I0202 10:43:06.232118 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 02 10:43:06 crc kubenswrapper[4901]: I0202 10:43:06.301985 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 02 10:43:06 crc kubenswrapper[4901]: I0202 10:43:06.420936 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 02 10:43:06 crc kubenswrapper[4901]: I0202 10:43:06.504946 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 02 10:43:06 crc kubenswrapper[4901]: I0202 10:43:06.556352 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 02 10:43:06 crc kubenswrapper[4901]: I0202 10:43:06.597837 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 02 10:43:06 crc kubenswrapper[4901]: 
I0202 10:43:06.609053 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 02 10:43:06 crc kubenswrapper[4901]: I0202 10:43:06.693231 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 02 10:43:06 crc kubenswrapper[4901]: I0202 10:43:06.764801 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 02 10:43:06 crc kubenswrapper[4901]: I0202 10:43:06.766041 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 02 10:43:06 crc kubenswrapper[4901]: I0202 10:43:06.807720 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 02 10:43:06 crc kubenswrapper[4901]: I0202 10:43:06.831621 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 02 10:43:06 crc kubenswrapper[4901]: I0202 10:43:06.848753 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 02 10:43:06 crc kubenswrapper[4901]: I0202 10:43:06.859047 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 02 10:43:06 crc kubenswrapper[4901]: I0202 10:43:06.945318 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 02 10:43:07 crc kubenswrapper[4901]: I0202 10:43:07.002200 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 02 10:43:07 crc kubenswrapper[4901]: I0202 10:43:07.006340 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 02 10:43:07 crc kubenswrapper[4901]: I0202 10:43:07.025335 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 02 10:43:07 crc kubenswrapper[4901]: I0202 10:43:07.053376 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 02 10:43:07 crc kubenswrapper[4901]: I0202 10:43:07.110342 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 02 10:43:07 crc kubenswrapper[4901]: I0202 10:43:07.158706 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 02 10:43:07 crc kubenswrapper[4901]: I0202 10:43:07.183687 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 02 10:43:07 crc kubenswrapper[4901]: I0202 10:43:07.277103 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 02 10:43:07 crc kubenswrapper[4901]: I0202 10:43:07.340025 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 02 10:43:07 crc kubenswrapper[4901]: I0202 10:43:07.502842 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 02 10:43:07 crc 
kubenswrapper[4901]: I0202 10:43:07.535724 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 02 10:43:07 crc kubenswrapper[4901]: I0202 10:43:07.544483 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 02 10:43:07 crc kubenswrapper[4901]: I0202 10:43:07.570691 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 02 10:43:07 crc kubenswrapper[4901]: I0202 10:43:07.661893 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 02 10:43:07 crc kubenswrapper[4901]: I0202 10:43:07.663886 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 02 10:43:07 crc kubenswrapper[4901]: I0202 10:43:07.677347 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 02 10:43:07 crc kubenswrapper[4901]: I0202 10:43:07.741191 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 02 10:43:07 crc kubenswrapper[4901]: I0202 10:43:07.742220 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 02 10:43:07 crc kubenswrapper[4901]: I0202 10:43:07.768375 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 02 10:43:07 crc kubenswrapper[4901]: I0202 10:43:07.796642 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 02 10:43:07 crc kubenswrapper[4901]: I0202 10:43:07.812404 4901 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 02 10:43:07 crc kubenswrapper[4901]: I0202 10:43:07.817288 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=36.817261773 podStartE2EDuration="36.817261773s" podCreationTimestamp="2026-02-02 10:42:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:42:53.205186844 +0000 UTC m=+260.223526940" watchObservedRunningTime="2026-02-02 10:43:07.817261773 +0000 UTC m=+274.835601879" Feb 02 10:43:07 crc kubenswrapper[4901]: I0202 10:43:07.819547 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 02 10:43:07 crc kubenswrapper[4901]: I0202 10:43:07.819655 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 02 10:43:07 crc kubenswrapper[4901]: I0202 10:43:07.826299 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:43:07 crc kubenswrapper[4901]: I0202 10:43:07.849754 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=14.849730744 podStartE2EDuration="14.849730744s" podCreationTimestamp="2026-02-02 10:42:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-02-02 10:43:07.847132943 +0000 UTC m=+274.865473049" watchObservedRunningTime="2026-02-02 10:43:07.849730744 +0000 UTC m=+274.868070840" Feb 02 10:43:07 crc kubenswrapper[4901]: I0202 10:43:07.953243 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 02 10:43:07 crc kubenswrapper[4901]: I0202 10:43:07.963845 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 02 10:43:07 crc kubenswrapper[4901]: I0202 10:43:07.988962 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 02 10:43:08 crc kubenswrapper[4901]: I0202 10:43:08.172738 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 02 10:43:08 crc kubenswrapper[4901]: I0202 10:43:08.262436 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 02 10:43:08 crc kubenswrapper[4901]: I0202 10:43:08.282144 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 02 10:43:08 crc kubenswrapper[4901]: I0202 10:43:08.293261 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 02 10:43:08 crc kubenswrapper[4901]: I0202 10:43:08.312444 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 02 10:43:08 crc kubenswrapper[4901]: I0202 10:43:08.336970 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 02 10:43:08 crc kubenswrapper[4901]: I0202 10:43:08.362057 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 02 10:43:08 crc kubenswrapper[4901]: I0202 10:43:08.375004 4901 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 02 10:43:08 crc kubenswrapper[4901]: I0202 10:43:08.410230 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 02 10:43:08 crc kubenswrapper[4901]: I0202 10:43:08.429924 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 02 10:43:08 crc kubenswrapper[4901]: I0202 10:43:08.493871 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 02 10:43:08 crc kubenswrapper[4901]: I0202 10:43:08.731093 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 02 10:43:08 crc kubenswrapper[4901]: I0202 10:43:08.782815 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 02 10:43:08 crc kubenswrapper[4901]: I0202 10:43:08.862819 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 02 10:43:08 crc kubenswrapper[4901]: I0202 10:43:08.887549 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 02 10:43:08 crc kubenswrapper[4901]: I0202 
10:43:08.925836 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 02 10:43:08 crc kubenswrapper[4901]: I0202 10:43:08.931301 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 02 10:43:09 crc kubenswrapper[4901]: I0202 10:43:09.007879 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 02 10:43:09 crc kubenswrapper[4901]: I0202 10:43:09.056381 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 02 10:43:09 crc kubenswrapper[4901]: I0202 10:43:09.117052 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 02 10:43:09 crc kubenswrapper[4901]: I0202 10:43:09.196219 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 02 10:43:09 crc kubenswrapper[4901]: I0202 10:43:09.327946 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 02 10:43:09 crc kubenswrapper[4901]: I0202 10:43:09.365203 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 02 10:43:09 crc kubenswrapper[4901]: I0202 10:43:09.391938 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 02 10:43:09 crc kubenswrapper[4901]: I0202 10:43:09.480090 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 02 10:43:09 crc kubenswrapper[4901]: I0202 10:43:09.484054 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 02 10:43:09 crc kubenswrapper[4901]: I0202 10:43:09.528885 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 02 10:43:09 crc kubenswrapper[4901]: I0202 10:43:09.663869 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 02 10:43:09 crc kubenswrapper[4901]: I0202 10:43:09.719109 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 02 10:43:09 crc kubenswrapper[4901]: I0202 10:43:09.762121 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 02 10:43:09 crc kubenswrapper[4901]: I0202 10:43:09.822506 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 02 10:43:09 crc kubenswrapper[4901]: I0202 10:43:09.912732 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 02 10:43:09 crc kubenswrapper[4901]: I0202 10:43:09.975120 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 02 10:43:09 crc kubenswrapper[4901]: I0202 10:43:09.980902 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 02 10:43:10 crc 
kubenswrapper[4901]: I0202 10:43:10.020294 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 02 10:43:10 crc kubenswrapper[4901]: I0202 10:43:10.147696 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 02 10:43:10 crc kubenswrapper[4901]: I0202 10:43:10.163929 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 02 10:43:10 crc kubenswrapper[4901]: I0202 10:43:10.165491 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 02 10:43:10 crc kubenswrapper[4901]: I0202 10:43:10.189823 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 02 10:43:10 crc kubenswrapper[4901]: I0202 10:43:10.196376 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 02 10:43:10 crc kubenswrapper[4901]: I0202 10:43:10.285244 4901 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 02 10:43:10 crc kubenswrapper[4901]: I0202 10:43:10.285334 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 02 10:43:10 crc kubenswrapper[4901]: I0202 10:43:10.285423 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:43:10 crc kubenswrapper[4901]: I0202 10:43:10.286344 4901 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"99d645644c1a238c6c9d95798e86c864d2db3934455a3866bc6175099a6fce54"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Feb 02 10:43:10 crc kubenswrapper[4901]: I0202 10:43:10.286540 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://99d645644c1a238c6c9d95798e86c864d2db3934455a3866bc6175099a6fce54" gracePeriod=30 Feb 02 10:43:10 crc kubenswrapper[4901]: I0202 10:43:10.423686 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 02 10:43:10 crc kubenswrapper[4901]: I0202 10:43:10.435999 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 02 10:43:10 crc kubenswrapper[4901]: I0202 10:43:10.481189 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 02 10:43:10 crc kubenswrapper[4901]: I0202 10:43:10.515548 4901 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 02 10:43:10 crc kubenswrapper[4901]: I0202 10:43:10.516150 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 02 10:43:10 crc kubenswrapper[4901]: I0202 10:43:10.575907 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 02 10:43:10 crc kubenswrapper[4901]: I0202 10:43:10.644172 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 02 10:43:10 crc kubenswrapper[4901]: I0202 10:43:10.689696 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 02 10:43:10 crc kubenswrapper[4901]: I0202 10:43:10.750375 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 02 10:43:10 crc kubenswrapper[4901]: I0202 10:43:10.867349 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 02 10:43:10 crc kubenswrapper[4901]: I0202 10:43:10.920044 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 02 10:43:10 crc kubenswrapper[4901]: I0202 10:43:10.937458 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 02 10:43:10 crc kubenswrapper[4901]: I0202 10:43:10.960133 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 02 10:43:10 crc kubenswrapper[4901]: I0202 10:43:10.985716 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 02 10:43:11 crc kubenswrapper[4901]: I0202 10:43:11.050064 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 02 10:43:11 crc kubenswrapper[4901]: I0202 10:43:11.097024 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 02 10:43:11 crc kubenswrapper[4901]: I0202 10:43:11.169060 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 02 10:43:11 crc kubenswrapper[4901]: I0202 10:43:11.201150 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 02 10:43:11 crc kubenswrapper[4901]: I0202 10:43:11.205505 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 02 10:43:11 crc kubenswrapper[4901]: I0202 10:43:11.262895 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 02 10:43:11 crc kubenswrapper[4901]: I0202 10:43:11.270193 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 02 10:43:11 crc kubenswrapper[4901]: I0202 10:43:11.314013 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 02 10:43:11 crc kubenswrapper[4901]: I0202 10:43:11.353203 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 02 10:43:11 crc kubenswrapper[4901]: 
I0202 10:43:11.358803 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 02 10:43:11 crc kubenswrapper[4901]: I0202 10:43:11.397525 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 02 10:43:11 crc kubenswrapper[4901]: I0202 10:43:11.401237 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 02 10:43:11 crc kubenswrapper[4901]: I0202 10:43:11.483507 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 02 10:43:11 crc kubenswrapper[4901]: I0202 10:43:11.484713 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 02 10:43:11 crc kubenswrapper[4901]: I0202 10:43:11.577112 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 02 10:43:11 crc kubenswrapper[4901]: I0202 10:43:11.659950 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 02 10:43:11 crc kubenswrapper[4901]: I0202 10:43:11.667496 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 02 10:43:11 crc kubenswrapper[4901]: I0202 10:43:11.752795 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 02 10:43:11 crc kubenswrapper[4901]: I0202 10:43:11.863590 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 02 10:43:12 crc kubenswrapper[4901]: I0202 10:43:12.079896 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 02 10:43:12 crc kubenswrapper[4901]: I0202 10:43:12.254488 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 02 10:43:12 crc kubenswrapper[4901]: I0202 10:43:12.294930 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 02 10:43:12 crc kubenswrapper[4901]: I0202 10:43:12.303508 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 02 10:43:12 crc kubenswrapper[4901]: I0202 10:43:12.312336 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 02 10:43:12 crc kubenswrapper[4901]: I0202 10:43:12.451461 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 02 10:43:12 crc kubenswrapper[4901]: I0202 10:43:12.497339 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 02 10:43:12 crc kubenswrapper[4901]: I0202 10:43:12.524389 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 02 10:43:12 crc kubenswrapper[4901]: I0202 10:43:12.620704 4901 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 02 10:43:12 crc kubenswrapper[4901]: I0202 10:43:12.695299 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 02 10:43:12 crc kubenswrapper[4901]: I0202 10:43:12.819763 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 02 10:43:12 crc kubenswrapper[4901]: I0202 10:43:12.867346 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 02 10:43:12 crc kubenswrapper[4901]: I0202 10:43:12.872467 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 02 10:43:12 crc kubenswrapper[4901]: I0202 10:43:12.895405 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 02 10:43:13 crc kubenswrapper[4901]: I0202 10:43:13.059101 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 02 10:43:13 crc kubenswrapper[4901]: I0202 10:43:13.114706 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 02 10:43:13 crc kubenswrapper[4901]: I0202 10:43:13.209340 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 02 10:43:13 crc kubenswrapper[4901]: I0202 10:43:13.248550 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 02 10:43:13 crc kubenswrapper[4901]: I0202 10:43:13.324119 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 02 10:43:13 crc kubenswrapper[4901]: I0202 10:43:13.327348 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 02 10:43:13 crc kubenswrapper[4901]: I0202 10:43:13.343609 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 02 10:43:13 crc kubenswrapper[4901]: I0202 10:43:13.356553 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 02 10:43:13 crc kubenswrapper[4901]: I0202 10:43:13.400262 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 02 10:43:13 crc kubenswrapper[4901]: I0202 10:43:13.435205 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 02 10:43:13 crc kubenswrapper[4901]: I0202 10:43:13.833766 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 02 10:43:13 crc kubenswrapper[4901]: I0202 10:43:13.934152 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 02 10:43:13 crc kubenswrapper[4901]: I0202 10:43:13.949778 4901 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 02 10:43:13 crc kubenswrapper[4901]: I0202 10:43:13.999798 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 02 10:43:14 crc kubenswrapper[4901]: I0202 10:43:14.023289 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 02 10:43:14 crc kubenswrapper[4901]: I0202 10:43:14.070484 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 02 10:43:14 crc kubenswrapper[4901]: I0202 10:43:14.111181 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 02 10:43:14 crc kubenswrapper[4901]: I0202 10:43:14.124754 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 02 10:43:14 crc kubenswrapper[4901]: I0202 10:43:14.174032 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 02 10:43:14 crc kubenswrapper[4901]: I0202 10:43:14.218641 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 02 10:43:14 crc kubenswrapper[4901]: I0202 10:43:14.229667 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 02 10:43:14 crc kubenswrapper[4901]: I0202 10:43:14.273295 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 02 10:43:14 crc kubenswrapper[4901]: I0202 10:43:14.403301 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 02 10:43:14 crc kubenswrapper[4901]: I0202 10:43:14.473297 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 02 10:43:14 crc kubenswrapper[4901]: I0202 10:43:14.582162 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 02 10:43:14 crc kubenswrapper[4901]: I0202 10:43:14.606396 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 02 10:43:14 crc kubenswrapper[4901]: I0202 10:43:14.641928 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 02 10:43:14 crc kubenswrapper[4901]: I0202 10:43:14.695218 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 02 10:43:14 crc kubenswrapper[4901]: I0202 10:43:14.698261 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 02 10:43:14 crc kubenswrapper[4901]: I0202 10:43:14.698483 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 02 10:43:14 crc kubenswrapper[4901]: I0202 10:43:14.699817 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 02 10:43:14 crc kubenswrapper[4901]: I0202 10:43:14.748169 4901 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 02 10:43:14 crc kubenswrapper[4901]: I0202 10:43:14.756386 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 02 10:43:14 crc kubenswrapper[4901]: I0202 10:43:14.778987 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 02 10:43:14 crc kubenswrapper[4901]: I0202 10:43:14.861204 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 02 10:43:14 crc kubenswrapper[4901]: I0202 10:43:14.925190 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 02 10:43:15 crc kubenswrapper[4901]: I0202 10:43:15.003473 4901 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 02 10:43:15 crc kubenswrapper[4901]: I0202 10:43:15.057410 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 02 10:43:15 crc kubenswrapper[4901]: I0202 10:43:15.068182 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 02 10:43:15 crc kubenswrapper[4901]: I0202 10:43:15.091101 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 02 10:43:15 crc kubenswrapper[4901]: I0202 10:43:15.168927 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 02 10:43:15 crc kubenswrapper[4901]: I0202 10:43:15.176603 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 02 10:43:15 crc kubenswrapper[4901]: I0202 10:43:15.183191 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 02 10:43:15 crc kubenswrapper[4901]: I0202 10:43:15.206345 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 02 10:43:15 crc kubenswrapper[4901]: I0202 10:43:15.220866 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 02 10:43:15 crc kubenswrapper[4901]: I0202 10:43:15.229129 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 02 10:43:15 crc kubenswrapper[4901]: I0202 10:43:15.422659 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 02 10:43:15 crc kubenswrapper[4901]: I0202 10:43:15.432354 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 02 10:43:15 crc kubenswrapper[4901]: I0202 10:43:15.500288 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 02 10:43:15 crc kubenswrapper[4901]: I0202 10:43:15.595874 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 02 10:43:15 crc kubenswrapper[4901]: I0202 10:43:15.794797 4901 kubelet.go:2431] "SyncLoop REMOVE" source="file" 
pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 02 10:43:15 crc kubenswrapper[4901]: I0202 10:43:15.795371 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://f7350befb8da6c113e415c485278b4cd1e760a8daa95bcc96d6b590979a3ab61" gracePeriod=5 Feb 02 10:43:15 crc kubenswrapper[4901]: I0202 10:43:15.827917 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 02 10:43:15 crc kubenswrapper[4901]: I0202 10:43:15.916938 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 02 10:43:15 crc kubenswrapper[4901]: I0202 10:43:15.983475 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 02 10:43:16 crc kubenswrapper[4901]: I0202 10:43:16.054069 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 02 10:43:16 crc kubenswrapper[4901]: I0202 10:43:16.054980 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 02 10:43:16 crc kubenswrapper[4901]: I0202 10:43:16.074890 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 02 10:43:16 crc kubenswrapper[4901]: I0202 10:43:16.136786 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 02 10:43:16 crc kubenswrapper[4901]: I0202 10:43:16.189331 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 02 10:43:16 crc kubenswrapper[4901]: I0202 10:43:16.301918 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 02 10:43:16 crc kubenswrapper[4901]: I0202 10:43:16.430103 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 02 10:43:16 crc kubenswrapper[4901]: I0202 10:43:16.545925 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 02 10:43:16 crc kubenswrapper[4901]: I0202 10:43:16.557608 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 02 10:43:16 crc kubenswrapper[4901]: I0202 10:43:16.604249 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 02 10:43:16 crc kubenswrapper[4901]: I0202 10:43:16.751101 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 02 10:43:16 crc kubenswrapper[4901]: I0202 10:43:16.822431 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 02 10:43:17 crc kubenswrapper[4901]: I0202 10:43:17.047332 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 02 10:43:17 crc kubenswrapper[4901]: I0202 10:43:17.051355 4901 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 02 10:43:17 crc kubenswrapper[4901]: I0202 10:43:17.141917 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 02 10:43:17 crc kubenswrapper[4901]: I0202 10:43:17.200893 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 02 10:43:17 crc kubenswrapper[4901]: I0202 10:43:17.306129 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 02 10:43:17 crc kubenswrapper[4901]: I0202 10:43:17.320815 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 02 10:43:17 crc kubenswrapper[4901]: I0202 10:43:17.382812 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 02 10:43:17 crc kubenswrapper[4901]: I0202 10:43:17.633993 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 02 10:43:17 crc kubenswrapper[4901]: I0202 10:43:17.653370 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 02 10:43:17 crc kubenswrapper[4901]: I0202 10:43:17.767978 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 02 10:43:18 crc kubenswrapper[4901]: I0202 10:43:18.026148 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 02 10:43:18 crc kubenswrapper[4901]: I0202 10:43:18.101123 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 02 10:43:18 crc kubenswrapper[4901]: I0202 10:43:18.258982 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 02 10:43:18 crc kubenswrapper[4901]: I0202 10:43:18.367140 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 02 10:43:18 crc kubenswrapper[4901]: I0202 10:43:18.388193 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 02 10:43:18 crc kubenswrapper[4901]: I0202 10:43:18.471993 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 02 10:43:18 crc kubenswrapper[4901]: I0202 10:43:18.545821 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 02 10:43:18 crc kubenswrapper[4901]: I0202 10:43:18.566689 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 02 10:43:18 crc kubenswrapper[4901]: I0202 10:43:18.574190 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 02 10:43:18 crc kubenswrapper[4901]: I0202 10:43:18.587669 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 02 10:43:18 crc kubenswrapper[4901]: I0202 10:43:18.606679 4901 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"kube-rbac-proxy" Feb 02 10:43:18 crc kubenswrapper[4901]: I0202 10:43:18.734901 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 02 10:43:18 crc kubenswrapper[4901]: I0202 10:43:18.976626 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-79vj8"] Feb 02 10:43:18 crc kubenswrapper[4901]: I0202 10:43:18.976998 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-79vj8" podUID="ebacceb9-418b-4af4-9511-007595694dc2" containerName="registry-server" containerID="cri-o://7a1ecb03aaab570752b91e14ea4c08e25d8e4073ede9d2f2fe123ecc2667c4a2" gracePeriod=30 Feb 02 10:43:18 crc kubenswrapper[4901]: I0202 10:43:18.994100 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b9p28"] Feb 02 10:43:18 crc kubenswrapper[4901]: I0202 10:43:18.994811 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xnv4q"] Feb 02 10:43:18 crc kubenswrapper[4901]: I0202 10:43:18.995134 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b9p28" podUID="da3848e5-a20f-4124-856b-d860bea45325" containerName="registry-server" containerID="cri-o://796fc86dd36d60d1a20e51f112a5112017a1475f79832965ca8696200e306532" gracePeriod=30 Feb 02 10:43:18 crc kubenswrapper[4901]: I0202 10:43:18.996162 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-xnv4q" podUID="4697c668-acff-4c8d-b562-e6491a9cbdd0" containerName="marketplace-operator" containerID="cri-o://f4a90d2563fdd75501b02d64e77fce3d73a2c3f793f2de5d873bb0cf568d680a" gracePeriod=30 Feb 02 10:43:18 crc kubenswrapper[4901]: I0202 10:43:18.998148 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f96jl"] Feb 02 10:43:18 crc kubenswrapper[4901]: I0202 10:43:18.998452 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-f96jl" podUID="e26efe11-5a79-428e-9f34-ac7e0af2b5df" containerName="registry-server" containerID="cri-o://2652b5a9f0106eb0bd65a5493b1d10dc016955dd44fbec418c946b233fd2c9e1" gracePeriod=30 Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.000963 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t4pph"] Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.001216 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-t4pph" podUID="f9e94b03-bd47-46b7-8ae7-addbb2b58bb7" containerName="registry-server" containerID="cri-o://3b2bf40a48963a3adc5bf873c4fda0486f3f68e62e3e6d5a7956aa5b4bd8d466" gracePeriod=30 Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.153350 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.385838 4901 generic.go:334] "Generic (PLEG): container finished" podID="e26efe11-5a79-428e-9f34-ac7e0af2b5df" containerID="2652b5a9f0106eb0bd65a5493b1d10dc016955dd44fbec418c946b233fd2c9e1" exitCode=0 Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.386074 4901 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f96jl" event={"ID":"e26efe11-5a79-428e-9f34-ac7e0af2b5df","Type":"ContainerDied","Data":"2652b5a9f0106eb0bd65a5493b1d10dc016955dd44fbec418c946b233fd2c9e1"} Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.388017 4901 generic.go:334] "Generic (PLEG): container finished" podID="4697c668-acff-4c8d-b562-e6491a9cbdd0" containerID="f4a90d2563fdd75501b02d64e77fce3d73a2c3f793f2de5d873bb0cf568d680a" exitCode=0 Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.388071 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xnv4q" event={"ID":"4697c668-acff-4c8d-b562-e6491a9cbdd0","Type":"ContainerDied","Data":"f4a90d2563fdd75501b02d64e77fce3d73a2c3f793f2de5d873bb0cf568d680a"} Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.399884 4901 generic.go:334] "Generic (PLEG): container finished" podID="ebacceb9-418b-4af4-9511-007595694dc2" containerID="7a1ecb03aaab570752b91e14ea4c08e25d8e4073ede9d2f2fe123ecc2667c4a2" exitCode=0 Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.399985 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-79vj8" event={"ID":"ebacceb9-418b-4af4-9511-007595694dc2","Type":"ContainerDied","Data":"7a1ecb03aaab570752b91e14ea4c08e25d8e4073ede9d2f2fe123ecc2667c4a2"} Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.402380 4901 generic.go:334] "Generic (PLEG): container finished" podID="da3848e5-a20f-4124-856b-d860bea45325" containerID="796fc86dd36d60d1a20e51f112a5112017a1475f79832965ca8696200e306532" exitCode=0 Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.402437 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9p28" event={"ID":"da3848e5-a20f-4124-856b-d860bea45325","Type":"ContainerDied","Data":"796fc86dd36d60d1a20e51f112a5112017a1475f79832965ca8696200e306532"} Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.404975 4901 generic.go:334] "Generic (PLEG): container finished" podID="f9e94b03-bd47-46b7-8ae7-addbb2b58bb7" containerID="3b2bf40a48963a3adc5bf873c4fda0486f3f68e62e3e6d5a7956aa5b4bd8d466" exitCode=0 Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.405041 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t4pph" event={"ID":"f9e94b03-bd47-46b7-8ae7-addbb2b58bb7","Type":"ContainerDied","Data":"3b2bf40a48963a3adc5bf873c4fda0486f3f68e62e3e6d5a7956aa5b4bd8d466"} Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.419625 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-79vj8" Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.463980 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebacceb9-418b-4af4-9511-007595694dc2-utilities\") pod \"ebacceb9-418b-4af4-9511-007595694dc2\" (UID: \"ebacceb9-418b-4af4-9511-007595694dc2\") " Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.464047 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcrt8\" (UniqueName: \"kubernetes.io/projected/ebacceb9-418b-4af4-9511-007595694dc2-kube-api-access-tcrt8\") pod \"ebacceb9-418b-4af4-9511-007595694dc2\" (UID: \"ebacceb9-418b-4af4-9511-007595694dc2\") " Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.464085 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebacceb9-418b-4af4-9511-007595694dc2-catalog-content\") pod \"ebacceb9-418b-4af4-9511-007595694dc2\" (UID: \"ebacceb9-418b-4af4-9511-007595694dc2\") " Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.465354 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebacceb9-418b-4af4-9511-007595694dc2-utilities" (OuterVolumeSpecName: "utilities") pod "ebacceb9-418b-4af4-9511-007595694dc2" (UID: "ebacceb9-418b-4af4-9511-007595694dc2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.490951 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebacceb9-418b-4af4-9511-007595694dc2-kube-api-access-tcrt8" (OuterVolumeSpecName: "kube-api-access-tcrt8") pod "ebacceb9-418b-4af4-9511-007595694dc2" (UID: "ebacceb9-418b-4af4-9511-007595694dc2"). InnerVolumeSpecName "kube-api-access-tcrt8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.516899 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebacceb9-418b-4af4-9511-007595694dc2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ebacceb9-418b-4af4-9511-007595694dc2" (UID: "ebacceb9-418b-4af4-9511-007595694dc2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.529115 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.531667 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xnv4q" Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.538001 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f96jl" Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.564982 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hm6c7\" (UniqueName: \"kubernetes.io/projected/4697c668-acff-4c8d-b562-e6491a9cbdd0-kube-api-access-hm6c7\") pod \"4697c668-acff-4c8d-b562-e6491a9cbdd0\" (UID: \"4697c668-acff-4c8d-b562-e6491a9cbdd0\") " Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.565017 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbts4\" (UniqueName: \"kubernetes.io/projected/e26efe11-5a79-428e-9f34-ac7e0af2b5df-kube-api-access-tbts4\") pod \"e26efe11-5a79-428e-9f34-ac7e0af2b5df\" (UID: \"e26efe11-5a79-428e-9f34-ac7e0af2b5df\") " Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.565082 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4697c668-acff-4c8d-b562-e6491a9cbdd0-marketplace-operator-metrics\") pod \"4697c668-acff-4c8d-b562-e6491a9cbdd0\" (UID: \"4697c668-acff-4c8d-b562-e6491a9cbdd0\") " Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.565101 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e26efe11-5a79-428e-9f34-ac7e0af2b5df-catalog-content\") pod \"e26efe11-5a79-428e-9f34-ac7e0af2b5df\" (UID: \"e26efe11-5a79-428e-9f34-ac7e0af2b5df\") " Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.565116 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e26efe11-5a79-428e-9f34-ac7e0af2b5df-utilities\") pod \"e26efe11-5a79-428e-9f34-ac7e0af2b5df\" (UID: \"e26efe11-5a79-428e-9f34-ac7e0af2b5df\") " Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.565162 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4697c668-acff-4c8d-b562-e6491a9cbdd0-marketplace-trusted-ca\") pod \"4697c668-acff-4c8d-b562-e6491a9cbdd0\" (UID: \"4697c668-acff-4c8d-b562-e6491a9cbdd0\") " Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.565343 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebacceb9-418b-4af4-9511-007595694dc2-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.565366 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcrt8\" (UniqueName: \"kubernetes.io/projected/ebacceb9-418b-4af4-9511-007595694dc2-kube-api-access-tcrt8\") on node \"crc\" DevicePath \"\"" Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.565376 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebacceb9-418b-4af4-9511-007595694dc2-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.566371 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e26efe11-5a79-428e-9f34-ac7e0af2b5df-utilities" (OuterVolumeSpecName: "utilities") pod "e26efe11-5a79-428e-9f34-ac7e0af2b5df" (UID: "e26efe11-5a79-428e-9f34-ac7e0af2b5df"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.568483 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4697c668-acff-4c8d-b562-e6491a9cbdd0-kube-api-access-hm6c7" (OuterVolumeSpecName: "kube-api-access-hm6c7") pod "4697c668-acff-4c8d-b562-e6491a9cbdd0" (UID: "4697c668-acff-4c8d-b562-e6491a9cbdd0"). InnerVolumeSpecName "kube-api-access-hm6c7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.568624 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e26efe11-5a79-428e-9f34-ac7e0af2b5df-kube-api-access-tbts4" (OuterVolumeSpecName: "kube-api-access-tbts4") pod "e26efe11-5a79-428e-9f34-ac7e0af2b5df" (UID: "e26efe11-5a79-428e-9f34-ac7e0af2b5df"). InnerVolumeSpecName "kube-api-access-tbts4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.569781 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4697c668-acff-4c8d-b562-e6491a9cbdd0-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "4697c668-acff-4c8d-b562-e6491a9cbdd0" (UID: "4697c668-acff-4c8d-b562-e6491a9cbdd0"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.570254 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4697c668-acff-4c8d-b562-e6491a9cbdd0-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "4697c668-acff-4c8d-b562-e6491a9cbdd0" (UID: "4697c668-acff-4c8d-b562-e6491a9cbdd0"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.572164 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t4pph" Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.575455 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b9p28" Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.591331 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e26efe11-5a79-428e-9f34-ac7e0af2b5df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e26efe11-5a79-428e-9f34-ac7e0af2b5df" (UID: "e26efe11-5a79-428e-9f34-ac7e0af2b5df"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.596528 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.666542 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da3848e5-a20f-4124-856b-d860bea45325-utilities\") pod \"da3848e5-a20f-4124-856b-d860bea45325\" (UID: \"da3848e5-a20f-4124-856b-d860bea45325\") " Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.666632 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56wgf\" (UniqueName: \"kubernetes.io/projected/da3848e5-a20f-4124-856b-d860bea45325-kube-api-access-56wgf\") pod \"da3848e5-a20f-4124-856b-d860bea45325\" (UID: \"da3848e5-a20f-4124-856b-d860bea45325\") " Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.666677 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da3848e5-a20f-4124-856b-d860bea45325-catalog-content\") pod \"da3848e5-a20f-4124-856b-d860bea45325\" (UID: \"da3848e5-a20f-4124-856b-d860bea45325\") " Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.666708 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgw8p\" (UniqueName: \"kubernetes.io/projected/f9e94b03-bd47-46b7-8ae7-addbb2b58bb7-kube-api-access-cgw8p\") pod \"f9e94b03-bd47-46b7-8ae7-addbb2b58bb7\" (UID: \"f9e94b03-bd47-46b7-8ae7-addbb2b58bb7\") " Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.666752 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9e94b03-bd47-46b7-8ae7-addbb2b58bb7-catalog-content\") pod \"f9e94b03-bd47-46b7-8ae7-addbb2b58bb7\" (UID: \"f9e94b03-bd47-46b7-8ae7-addbb2b58bb7\") " Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.666774 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9e94b03-bd47-46b7-8ae7-addbb2b58bb7-utilities\") pod \"f9e94b03-bd47-46b7-8ae7-addbb2b58bb7\" (UID: \"f9e94b03-bd47-46b7-8ae7-addbb2b58bb7\") " Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.666933 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hm6c7\" (UniqueName: \"kubernetes.io/projected/4697c668-acff-4c8d-b562-e6491a9cbdd0-kube-api-access-hm6c7\") on node \"crc\" DevicePath \"\"" Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.666948 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbts4\" (UniqueName: \"kubernetes.io/projected/e26efe11-5a79-428e-9f34-ac7e0af2b5df-kube-api-access-tbts4\") on node \"crc\" DevicePath \"\"" Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.666960 4901 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4697c668-acff-4c8d-b562-e6491a9cbdd0-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.666974 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e26efe11-5a79-428e-9f34-ac7e0af2b5df-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:43:19 
crc kubenswrapper[4901]: I0202 10:43:19.666985 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e26efe11-5a79-428e-9f34-ac7e0af2b5df-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.666996 4901 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4697c668-acff-4c8d-b562-e6491a9cbdd0-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.667727 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da3848e5-a20f-4124-856b-d860bea45325-utilities" (OuterVolumeSpecName: "utilities") pod "da3848e5-a20f-4124-856b-d860bea45325" (UID: "da3848e5-a20f-4124-856b-d860bea45325"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.667794 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9e94b03-bd47-46b7-8ae7-addbb2b58bb7-utilities" (OuterVolumeSpecName: "utilities") pod "f9e94b03-bd47-46b7-8ae7-addbb2b58bb7" (UID: "f9e94b03-bd47-46b7-8ae7-addbb2b58bb7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.670057 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9e94b03-bd47-46b7-8ae7-addbb2b58bb7-kube-api-access-cgw8p" (OuterVolumeSpecName: "kube-api-access-cgw8p") pod "f9e94b03-bd47-46b7-8ae7-addbb2b58bb7" (UID: "f9e94b03-bd47-46b7-8ae7-addbb2b58bb7"). InnerVolumeSpecName "kube-api-access-cgw8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.670168 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da3848e5-a20f-4124-856b-d860bea45325-kube-api-access-56wgf" (OuterVolumeSpecName: "kube-api-access-56wgf") pod "da3848e5-a20f-4124-856b-d860bea45325" (UID: "da3848e5-a20f-4124-856b-d860bea45325"). InnerVolumeSpecName "kube-api-access-56wgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.738301 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da3848e5-a20f-4124-856b-d860bea45325-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "da3848e5-a20f-4124-856b-d860bea45325" (UID: "da3848e5-a20f-4124-856b-d860bea45325"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.769476 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da3848e5-a20f-4124-856b-d860bea45325-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.769588 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgw8p\" (UniqueName: \"kubernetes.io/projected/f9e94b03-bd47-46b7-8ae7-addbb2b58bb7-kube-api-access-cgw8p\") on node \"crc\" DevicePath \"\"" Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.769631 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9e94b03-bd47-46b7-8ae7-addbb2b58bb7-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.769646 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da3848e5-a20f-4124-856b-d860bea45325-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.769660 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56wgf\" (UniqueName: \"kubernetes.io/projected/da3848e5-a20f-4124-856b-d860bea45325-kube-api-access-56wgf\") on node \"crc\" DevicePath \"\"" Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.810448 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9e94b03-bd47-46b7-8ae7-addbb2b58bb7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f9e94b03-bd47-46b7-8ae7-addbb2b58bb7" (UID: "f9e94b03-bd47-46b7-8ae7-addbb2b58bb7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:43:19 crc kubenswrapper[4901]: I0202 10:43:19.870539 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9e94b03-bd47-46b7-8ae7-addbb2b58bb7-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:43:20 crc kubenswrapper[4901]: I0202 10:43:20.093324 4901 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 02 10:43:20 crc kubenswrapper[4901]: I0202 10:43:20.419027 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-79vj8" event={"ID":"ebacceb9-418b-4af4-9511-007595694dc2","Type":"ContainerDied","Data":"53be6de38fff7f905a38af64cf46a8058746290eab9a2fe65039ef8d7dcab75c"} Feb 02 10:43:20 crc kubenswrapper[4901]: I0202 10:43:20.419094 4901 scope.go:117] "RemoveContainer" containerID="7a1ecb03aaab570752b91e14ea4c08e25d8e4073ede9d2f2fe123ecc2667c4a2" Feb 02 10:43:20 crc kubenswrapper[4901]: I0202 10:43:20.419095 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-79vj8" Feb 02 10:43:20 crc kubenswrapper[4901]: I0202 10:43:20.423332 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9p28" event={"ID":"da3848e5-a20f-4124-856b-d860bea45325","Type":"ContainerDied","Data":"163212c7f0c81016eef0d0f8d18f9e60e6794b4afe27eb9308ccf440596f2df9"} Feb 02 10:43:20 crc kubenswrapper[4901]: I0202 10:43:20.423441 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b9p28" Feb 02 10:43:20 crc kubenswrapper[4901]: I0202 10:43:20.448220 4901 scope.go:117] "RemoveContainer" containerID="78af4e1c30c97a229c5d281315a4e529f55e912cac2ddf97107a3fe1e91b3c0b" Feb 02 10:43:20 crc kubenswrapper[4901]: I0202 10:43:20.449900 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t4pph" event={"ID":"f9e94b03-bd47-46b7-8ae7-addbb2b58bb7","Type":"ContainerDied","Data":"bdf3274d91f67f210c00c115822cb120623cb2173322896c248f74791fefe929"} Feb 02 10:43:20 crc kubenswrapper[4901]: I0202 10:43:20.450086 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t4pph" Feb 02 10:43:20 crc kubenswrapper[4901]: I0202 10:43:20.453085 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-79vj8"] Feb 02 10:43:20 crc kubenswrapper[4901]: I0202 10:43:20.455846 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f96jl" event={"ID":"e26efe11-5a79-428e-9f34-ac7e0af2b5df","Type":"ContainerDied","Data":"db0687830c52170fc811cba0bc1beac7e759b262bf33aef219c8b9d470f0a4ef"} Feb 02 10:43:20 crc kubenswrapper[4901]: I0202 10:43:20.456075 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f96jl" Feb 02 10:43:20 crc kubenswrapper[4901]: I0202 10:43:20.458619 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xnv4q" event={"ID":"4697c668-acff-4c8d-b562-e6491a9cbdd0","Type":"ContainerDied","Data":"bb6368ba91564b0943b7f7762b12f4c9b95cbce65424727e8fc6c6ae672ad326"} Feb 02 10:43:20 crc kubenswrapper[4901]: I0202 10:43:20.458860 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xnv4q" Feb 02 10:43:20 crc kubenswrapper[4901]: I0202 10:43:20.461367 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-79vj8"] Feb 02 10:43:20 crc kubenswrapper[4901]: I0202 10:43:20.465261 4901 scope.go:117] "RemoveContainer" containerID="e58405f3c5f037c020c15bdf8125eeb272a81339343153720f17b3c34ec4a4e4" Feb 02 10:43:20 crc kubenswrapper[4901]: I0202 10:43:20.500181 4901 scope.go:117] "RemoveContainer" containerID="796fc86dd36d60d1a20e51f112a5112017a1475f79832965ca8696200e306532" Feb 02 10:43:20 crc kubenswrapper[4901]: I0202 10:43:20.534369 4901 scope.go:117] "RemoveContainer" containerID="a73ec724f1d02e6444cd4ef65c870ceff9476bafb7c3a2361b74d51159a074d0" Feb 02 10:43:20 crc kubenswrapper[4901]: I0202 10:43:20.545065 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xnv4q"] Feb 02 10:43:20 crc kubenswrapper[4901]: I0202 10:43:20.550659 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xnv4q"] Feb 02 10:43:20 crc kubenswrapper[4901]: I0202 10:43:20.559835 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f96jl"] Feb 02 10:43:20 crc kubenswrapper[4901]: I0202 10:43:20.572397 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-f96jl"] Feb 02 10:43:20 crc kubenswrapper[4901]: I0202 10:43:20.576996 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b9p28"] Feb 02 10:43:20 crc kubenswrapper[4901]: I0202 10:43:20.578537 4901 scope.go:117] "RemoveContainer" containerID="d536775a9b671cb7b2ed51f77b81b3b401bf6a1cb9da261acc1efcdae8d6a3e6" Feb 02 10:43:20 crc kubenswrapper[4901]: I0202 10:43:20.581544 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b9p28"] Feb 02 10:43:20 crc kubenswrapper[4901]: I0202 10:43:20.584419 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t4pph"] Feb 02 10:43:20 crc kubenswrapper[4901]: I0202 10:43:20.587393 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-t4pph"] Feb 02 10:43:20 crc kubenswrapper[4901]: I0202 10:43:20.600604 4901 scope.go:117] "RemoveContainer" containerID="3b2bf40a48963a3adc5bf873c4fda0486f3f68e62e3e6d5a7956aa5b4bd8d466" Feb 02 10:43:20 crc kubenswrapper[4901]: I0202 10:43:20.613996 4901 scope.go:117] "RemoveContainer" containerID="b26f62ae71bf37a11edab5c4a0b381ba51e287ecb1b21c74e355920dab74a911" Feb 02 10:43:20 crc kubenswrapper[4901]: I0202 10:43:20.631199 4901 scope.go:117] "RemoveContainer" containerID="69bda9554f9e491869fe2711ed8f7fb16351840c5507b1b56c310f574973d7cf" Feb 02 10:43:20 crc kubenswrapper[4901]: I0202 10:43:20.646784 4901 scope.go:117] "RemoveContainer" containerID="2652b5a9f0106eb0bd65a5493b1d10dc016955dd44fbec418c946b233fd2c9e1" Feb 02 10:43:20 crc kubenswrapper[4901]: I0202 10:43:20.669546 4901 scope.go:117] "RemoveContainer" containerID="b5683d6c6bd8fd96dc449e255314a6b1232ab2fc99c0a70b13545e29dfc61677" Feb 02 10:43:20 crc kubenswrapper[4901]: I0202 10:43:20.688390 4901 scope.go:117] "RemoveContainer" containerID="633038a05b0461e44462303b109638286fe6328751042ea97137bedb4da38a79" Feb 02 10:43:20 crc kubenswrapper[4901]: I0202 10:43:20.705335 4901 scope.go:117] "RemoveContainer" 
containerID="f4a90d2563fdd75501b02d64e77fce3d73a2c3f793f2de5d873bb0cf568d680a" Feb 02 10:43:20 crc kubenswrapper[4901]: I0202 10:43:20.728110 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 02 10:43:21 crc kubenswrapper[4901]: I0202 10:43:21.387395 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 02 10:43:21 crc kubenswrapper[4901]: I0202 10:43:21.387464 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:43:21 crc kubenswrapper[4901]: I0202 10:43:21.466665 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 02 10:43:21 crc kubenswrapper[4901]: I0202 10:43:21.466979 4901 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="f7350befb8da6c113e415c485278b4cd1e760a8daa95bcc96d6b590979a3ab61" exitCode=137 Feb 02 10:43:21 crc kubenswrapper[4901]: I0202 10:43:21.467029 4901 scope.go:117] "RemoveContainer" containerID="f7350befb8da6c113e415c485278b4cd1e760a8daa95bcc96d6b590979a3ab61" Feb 02 10:43:21 crc kubenswrapper[4901]: I0202 10:43:21.467102 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:43:21 crc kubenswrapper[4901]: I0202 10:43:21.481617 4901 scope.go:117] "RemoveContainer" containerID="f7350befb8da6c113e415c485278b4cd1e760a8daa95bcc96d6b590979a3ab61" Feb 02 10:43:21 crc kubenswrapper[4901]: E0202 10:43:21.481944 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7350befb8da6c113e415c485278b4cd1e760a8daa95bcc96d6b590979a3ab61\": container with ID starting with f7350befb8da6c113e415c485278b4cd1e760a8daa95bcc96d6b590979a3ab61 not found: ID does not exist" containerID="f7350befb8da6c113e415c485278b4cd1e760a8daa95bcc96d6b590979a3ab61" Feb 02 10:43:21 crc kubenswrapper[4901]: I0202 10:43:21.481974 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7350befb8da6c113e415c485278b4cd1e760a8daa95bcc96d6b590979a3ab61"} err="failed to get container status \"f7350befb8da6c113e415c485278b4cd1e760a8daa95bcc96d6b590979a3ab61\": rpc error: code = NotFound desc = could not find container \"f7350befb8da6c113e415c485278b4cd1e760a8daa95bcc96d6b590979a3ab61\": container with ID starting with f7350befb8da6c113e415c485278b4cd1e760a8daa95bcc96d6b590979a3ab61 not found: ID does not exist" Feb 02 10:43:21 crc kubenswrapper[4901]: I0202 10:43:21.494387 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 02 10:43:21 crc kubenswrapper[4901]: I0202 10:43:21.494411 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 02 10:43:21 crc 
kubenswrapper[4901]: I0202 10:43:21.494429 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 02 10:43:21 crc kubenswrapper[4901]: I0202 10:43:21.494463 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 02 10:43:21 crc kubenswrapper[4901]: I0202 10:43:21.494500 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 02 10:43:21 crc kubenswrapper[4901]: I0202 10:43:21.494659 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:43:21 crc kubenswrapper[4901]: I0202 10:43:21.494724 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:43:21 crc kubenswrapper[4901]: I0202 10:43:21.494748 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:43:21 crc kubenswrapper[4901]: I0202 10:43:21.494840 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:43:21 crc kubenswrapper[4901]: I0202 10:43:21.501849 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:43:21 crc kubenswrapper[4901]: I0202 10:43:21.595695 4901 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 02 10:43:21 crc kubenswrapper[4901]: I0202 10:43:21.595731 4901 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 02 10:43:21 crc kubenswrapper[4901]: I0202 10:43:21.595741 4901 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 02 10:43:21 crc kubenswrapper[4901]: I0202 10:43:21.595749 4901 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 02 10:43:21 crc kubenswrapper[4901]: I0202 10:43:21.595759 4901 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 02 10:43:21 crc kubenswrapper[4901]: I0202 10:43:21.689590 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4697c668-acff-4c8d-b562-e6491a9cbdd0" path="/var/lib/kubelet/pods/4697c668-acff-4c8d-b562-e6491a9cbdd0/volumes" Feb 02 10:43:21 crc kubenswrapper[4901]: I0202 10:43:21.690292 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da3848e5-a20f-4124-856b-d860bea45325" path="/var/lib/kubelet/pods/da3848e5-a20f-4124-856b-d860bea45325/volumes" Feb 02 10:43:21 crc kubenswrapper[4901]: I0202 10:43:21.690962 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e26efe11-5a79-428e-9f34-ac7e0af2b5df" path="/var/lib/kubelet/pods/e26efe11-5a79-428e-9f34-ac7e0af2b5df/volumes" Feb 02 10:43:21 crc kubenswrapper[4901]: I0202 10:43:21.692185 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebacceb9-418b-4af4-9511-007595694dc2" path="/var/lib/kubelet/pods/ebacceb9-418b-4af4-9511-007595694dc2/volumes" Feb 02 10:43:21 crc kubenswrapper[4901]: I0202 10:43:21.692736 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 02 10:43:21 crc kubenswrapper[4901]: I0202 10:43:21.693195 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9e94b03-bd47-46b7-8ae7-addbb2b58bb7" path="/var/lib/kubelet/pods/f9e94b03-bd47-46b7-8ae7-addbb2b58bb7/volumes" Feb 02 10:43:21 crc kubenswrapper[4901]: I0202 10:43:21.694072 4901 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Feb 02 10:43:21 crc kubenswrapper[4901]: I0202 10:43:21.712613 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 02 10:43:21 crc kubenswrapper[4901]: I0202 10:43:21.712692 4901 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="811edc7e-0d0d-4f30-91c3-86d3d9650e6c" Feb 02 10:43:21 crc kubenswrapper[4901]: I0202 10:43:21.717212 4901 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 02 10:43:21 crc kubenswrapper[4901]: I0202 10:43:21.717257 4901 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="811edc7e-0d0d-4f30-91c3-86d3d9650e6c" Feb 02 10:43:33 crc kubenswrapper[4901]: I0202 10:43:33.416696 4901 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Feb 02 10:43:40 crc kubenswrapper[4901]: I0202 10:43:40.588309 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Feb 02 10:43:40 crc kubenswrapper[4901]: I0202 10:43:40.592205 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 02 10:43:40 crc kubenswrapper[4901]: I0202 10:43:40.592261 4901 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="99d645644c1a238c6c9d95798e86c864d2db3934455a3866bc6175099a6fce54" exitCode=137 Feb 02 10:43:40 crc kubenswrapper[4901]: I0202 10:43:40.592296 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"99d645644c1a238c6c9d95798e86c864d2db3934455a3866bc6175099a6fce54"} Feb 02 10:43:40 crc kubenswrapper[4901]: I0202 10:43:40.592334 4901 scope.go:117] "RemoveContainer" containerID="0ca71b145bba06e046005ed8f559f9e44628618d42f32aa0c129ce544714350a" Feb 02 10:43:41 crc kubenswrapper[4901]: I0202 10:43:41.598930 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Feb 02 10:43:41 crc kubenswrapper[4901]: I0202 10:43:41.600280 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f0fcd9c2a1ce91bd7007eb5cbf1f9f12dacc3dc5051def5f1f13e874d728eb49"} Feb 02 10:43:42 crc kubenswrapper[4901]: I0202 10:43:42.277940 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:43:50 crc kubenswrapper[4901]: I0202 10:43:50.285002 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:43:50 crc kubenswrapper[4901]: I0202 10:43:50.291023 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:43:52 crc kubenswrapper[4901]: I0202 10:43:52.282483 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.371088 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-b9c2g"] Feb 02 10:43:57 crc kubenswrapper[4901]: E0202 10:43:57.372021 4901 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="da3848e5-a20f-4124-856b-d860bea45325" containerName="extract-utilities" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.372037 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="da3848e5-a20f-4124-856b-d860bea45325" containerName="extract-utilities" Feb 02 10:43:57 crc kubenswrapper[4901]: E0202 10:43:57.372049 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebacceb9-418b-4af4-9511-007595694dc2" containerName="extract-content" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.372057 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebacceb9-418b-4af4-9511-007595694dc2" containerName="extract-content" Feb 02 10:43:57 crc kubenswrapper[4901]: E0202 10:43:57.372065 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4697c668-acff-4c8d-b562-e6491a9cbdd0" containerName="marketplace-operator" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.372072 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="4697c668-acff-4c8d-b562-e6491a9cbdd0" containerName="marketplace-operator" Feb 02 10:43:57 crc kubenswrapper[4901]: E0202 10:43:57.372082 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e26efe11-5a79-428e-9f34-ac7e0af2b5df" containerName="extract-content" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.372089 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="e26efe11-5a79-428e-9f34-ac7e0af2b5df" containerName="extract-content" Feb 02 10:43:57 crc kubenswrapper[4901]: E0202 10:43:57.372101 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e26efe11-5a79-428e-9f34-ac7e0af2b5df" containerName="registry-server" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.372107 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="e26efe11-5a79-428e-9f34-ac7e0af2b5df" containerName="registry-server" Feb 02 10:43:57 crc kubenswrapper[4901]: E0202 10:43:57.372115 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da3848e5-a20f-4124-856b-d860bea45325" containerName="registry-server" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.372122 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="da3848e5-a20f-4124-856b-d860bea45325" containerName="registry-server" Feb 02 10:43:57 crc kubenswrapper[4901]: E0202 10:43:57.372132 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.372138 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 02 10:43:57 crc kubenswrapper[4901]: E0202 10:43:57.372148 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebacceb9-418b-4af4-9511-007595694dc2" containerName="extract-utilities" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.372154 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebacceb9-418b-4af4-9511-007595694dc2" containerName="extract-utilities" Feb 02 10:43:57 crc kubenswrapper[4901]: E0202 10:43:57.372166 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebacceb9-418b-4af4-9511-007595694dc2" containerName="registry-server" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.372173 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebacceb9-418b-4af4-9511-007595694dc2" containerName="registry-server" Feb 02 10:43:57 crc kubenswrapper[4901]: E0202 10:43:57.372197 4901 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="da3848e5-a20f-4124-856b-d860bea45325" containerName="extract-content" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.372205 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="da3848e5-a20f-4124-856b-d860bea45325" containerName="extract-content" Feb 02 10:43:57 crc kubenswrapper[4901]: E0202 10:43:57.372214 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47f5fbec-a621-4a18-94c2-86e646bcc88a" containerName="installer" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.372220 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="47f5fbec-a621-4a18-94c2-86e646bcc88a" containerName="installer" Feb 02 10:43:57 crc kubenswrapper[4901]: E0202 10:43:57.372231 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e94b03-bd47-46b7-8ae7-addbb2b58bb7" containerName="registry-server" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.372238 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e94b03-bd47-46b7-8ae7-addbb2b58bb7" containerName="registry-server" Feb 02 10:43:57 crc kubenswrapper[4901]: E0202 10:43:57.372248 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e94b03-bd47-46b7-8ae7-addbb2b58bb7" containerName="extract-content" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.372256 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e94b03-bd47-46b7-8ae7-addbb2b58bb7" containerName="extract-content" Feb 02 10:43:57 crc kubenswrapper[4901]: E0202 10:43:57.372271 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e94b03-bd47-46b7-8ae7-addbb2b58bb7" containerName="extract-utilities" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.372278 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e94b03-bd47-46b7-8ae7-addbb2b58bb7" containerName="extract-utilities" Feb 02 10:43:57 crc kubenswrapper[4901]: E0202 10:43:57.372289 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e26efe11-5a79-428e-9f34-ac7e0af2b5df" containerName="extract-utilities" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.372297 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="e26efe11-5a79-428e-9f34-ac7e0af2b5df" containerName="extract-utilities" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.372395 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="4697c668-acff-4c8d-b562-e6491a9cbdd0" containerName="marketplace-operator" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.372408 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="da3848e5-a20f-4124-856b-d860bea45325" containerName="registry-server" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.372418 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="e26efe11-5a79-428e-9f34-ac7e0af2b5df" containerName="registry-server" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.372427 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebacceb9-418b-4af4-9511-007595694dc2" containerName="registry-server" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.372440 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.372449 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="47f5fbec-a621-4a18-94c2-86e646bcc88a" containerName="installer" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 
10:43:57.372459 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e94b03-bd47-46b7-8ae7-addbb2b58bb7" containerName="registry-server" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.372905 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-b9c2g" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.375521 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.375746 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.375796 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.375993 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mkx5h"] Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.376046 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.376704 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-mkx5h" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.382301 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-b9c2g"] Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.387303 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.405718 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mkx5h"] Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.556547 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0c7379cd-5663-4416-b263-898f6a9ef954-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mkx5h\" (UID: \"0c7379cd-5663-4416-b263-898f6a9ef954\") " pod="openshift-image-registry/image-registry-66df7c8f76-mkx5h" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.556629 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-mkx5h\" (UID: \"0c7379cd-5663-4416-b263-898f6a9ef954\") " pod="openshift-image-registry/image-registry-66df7c8f76-mkx5h" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.556659 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cznl\" (UniqueName: \"kubernetes.io/projected/0c7379cd-5663-4416-b263-898f6a9ef954-kube-api-access-6cznl\") pod \"image-registry-66df7c8f76-mkx5h\" (UID: \"0c7379cd-5663-4416-b263-898f6a9ef954\") " pod="openshift-image-registry/image-registry-66df7c8f76-mkx5h" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.556677 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eaf0b215-2c7f-4cf7-9682-983acfa5ccb3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-b9c2g\" (UID: \"eaf0b215-2c7f-4cf7-9682-983acfa5ccb3\") " pod="openshift-marketplace/marketplace-operator-79b997595-b9c2g" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.556696 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm4zq\" (UniqueName: \"kubernetes.io/projected/eaf0b215-2c7f-4cf7-9682-983acfa5ccb3-kube-api-access-wm4zq\") pod \"marketplace-operator-79b997595-b9c2g\" (UID: \"eaf0b215-2c7f-4cf7-9682-983acfa5ccb3\") " pod="openshift-marketplace/marketplace-operator-79b997595-b9c2g" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.556727 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0c7379cd-5663-4416-b263-898f6a9ef954-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mkx5h\" (UID: \"0c7379cd-5663-4416-b263-898f6a9ef954\") " pod="openshift-image-registry/image-registry-66df7c8f76-mkx5h" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.556817 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0c7379cd-5663-4416-b263-898f6a9ef954-registry-tls\") pod \"image-registry-66df7c8f76-mkx5h\" (UID: \"0c7379cd-5663-4416-b263-898f6a9ef954\") " pod="openshift-image-registry/image-registry-66df7c8f76-mkx5h" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.556903 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0c7379cd-5663-4416-b263-898f6a9ef954-bound-sa-token\") pod \"image-registry-66df7c8f76-mkx5h\" (UID: \"0c7379cd-5663-4416-b263-898f6a9ef954\") " pod="openshift-image-registry/image-registry-66df7c8f76-mkx5h" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.556967 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c7379cd-5663-4416-b263-898f6a9ef954-trusted-ca\") pod \"image-registry-66df7c8f76-mkx5h\" (UID: \"0c7379cd-5663-4416-b263-898f6a9ef954\") " pod="openshift-image-registry/image-registry-66df7c8f76-mkx5h" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.557027 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0c7379cd-5663-4416-b263-898f6a9ef954-registry-certificates\") pod \"image-registry-66df7c8f76-mkx5h\" (UID: \"0c7379cd-5663-4416-b263-898f6a9ef954\") " pod="openshift-image-registry/image-registry-66df7c8f76-mkx5h" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.557093 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eaf0b215-2c7f-4cf7-9682-983acfa5ccb3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-b9c2g\" (UID: \"eaf0b215-2c7f-4cf7-9682-983acfa5ccb3\") " pod="openshift-marketplace/marketplace-operator-79b997595-b9c2g" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.581363 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-mkx5h\" (UID: \"0c7379cd-5663-4416-b263-898f6a9ef954\") " pod="openshift-image-registry/image-registry-66df7c8f76-mkx5h" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.658606 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0c7379cd-5663-4416-b263-898f6a9ef954-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mkx5h\" (UID: \"0c7379cd-5663-4416-b263-898f6a9ef954\") " pod="openshift-image-registry/image-registry-66df7c8f76-mkx5h" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.658674 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0c7379cd-5663-4416-b263-898f6a9ef954-registry-tls\") pod \"image-registry-66df7c8f76-mkx5h\" (UID: \"0c7379cd-5663-4416-b263-898f6a9ef954\") " pod="openshift-image-registry/image-registry-66df7c8f76-mkx5h" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.658701 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0c7379cd-5663-4416-b263-898f6a9ef954-bound-sa-token\") pod \"image-registry-66df7c8f76-mkx5h\" (UID: \"0c7379cd-5663-4416-b263-898f6a9ef954\") " pod="openshift-image-registry/image-registry-66df7c8f76-mkx5h" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.658726 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c7379cd-5663-4416-b263-898f6a9ef954-trusted-ca\") pod \"image-registry-66df7c8f76-mkx5h\" (UID: \"0c7379cd-5663-4416-b263-898f6a9ef954\") " pod="openshift-image-registry/image-registry-66df7c8f76-mkx5h" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.658760 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0c7379cd-5663-4416-b263-898f6a9ef954-registry-certificates\") pod \"image-registry-66df7c8f76-mkx5h\" (UID: \"0c7379cd-5663-4416-b263-898f6a9ef954\") " pod="openshift-image-registry/image-registry-66df7c8f76-mkx5h" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.658796 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eaf0b215-2c7f-4cf7-9682-983acfa5ccb3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-b9c2g\" (UID: \"eaf0b215-2c7f-4cf7-9682-983acfa5ccb3\") " pod="openshift-marketplace/marketplace-operator-79b997595-b9c2g" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.659028 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0c7379cd-5663-4416-b263-898f6a9ef954-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mkx5h\" (UID: \"0c7379cd-5663-4416-b263-898f6a9ef954\") " pod="openshift-image-registry/image-registry-66df7c8f76-mkx5h" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.659159 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cznl\" (UniqueName: \"kubernetes.io/projected/0c7379cd-5663-4416-b263-898f6a9ef954-kube-api-access-6cznl\") pod \"image-registry-66df7c8f76-mkx5h\" (UID: \"0c7379cd-5663-4416-b263-898f6a9ef954\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-mkx5h" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.659179 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eaf0b215-2c7f-4cf7-9682-983acfa5ccb3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-b9c2g\" (UID: \"eaf0b215-2c7f-4cf7-9682-983acfa5ccb3\") " pod="openshift-marketplace/marketplace-operator-79b997595-b9c2g" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.659207 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm4zq\" (UniqueName: \"kubernetes.io/projected/eaf0b215-2c7f-4cf7-9682-983acfa5ccb3-kube-api-access-wm4zq\") pod \"marketplace-operator-79b997595-b9c2g\" (UID: \"eaf0b215-2c7f-4cf7-9682-983acfa5ccb3\") " pod="openshift-marketplace/marketplace-operator-79b997595-b9c2g" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.660197 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0c7379cd-5663-4416-b263-898f6a9ef954-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mkx5h\" (UID: \"0c7379cd-5663-4416-b263-898f6a9ef954\") " pod="openshift-image-registry/image-registry-66df7c8f76-mkx5h" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.660509 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0c7379cd-5663-4416-b263-898f6a9ef954-registry-certificates\") pod \"image-registry-66df7c8f76-mkx5h\" (UID: \"0c7379cd-5663-4416-b263-898f6a9ef954\") " pod="openshift-image-registry/image-registry-66df7c8f76-mkx5h" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.660656 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c7379cd-5663-4416-b263-898f6a9ef954-trusted-ca\") pod \"image-registry-66df7c8f76-mkx5h\" (UID: \"0c7379cd-5663-4416-b263-898f6a9ef954\") " pod="openshift-image-registry/image-registry-66df7c8f76-mkx5h" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.660979 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eaf0b215-2c7f-4cf7-9682-983acfa5ccb3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-b9c2g\" (UID: \"eaf0b215-2c7f-4cf7-9682-983acfa5ccb3\") " pod="openshift-marketplace/marketplace-operator-79b997595-b9c2g" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.666100 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0c7379cd-5663-4416-b263-898f6a9ef954-registry-tls\") pod \"image-registry-66df7c8f76-mkx5h\" (UID: \"0c7379cd-5663-4416-b263-898f6a9ef954\") " pod="openshift-image-registry/image-registry-66df7c8f76-mkx5h" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.666152 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eaf0b215-2c7f-4cf7-9682-983acfa5ccb3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-b9c2g\" (UID: \"eaf0b215-2c7f-4cf7-9682-983acfa5ccb3\") " pod="openshift-marketplace/marketplace-operator-79b997595-b9c2g" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.667317 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0c7379cd-5663-4416-b263-898f6a9ef954-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mkx5h\" (UID: \"0c7379cd-5663-4416-b263-898f6a9ef954\") " pod="openshift-image-registry/image-registry-66df7c8f76-mkx5h" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.675289 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0c7379cd-5663-4416-b263-898f6a9ef954-bound-sa-token\") pod \"image-registry-66df7c8f76-mkx5h\" (UID: \"0c7379cd-5663-4416-b263-898f6a9ef954\") " pod="openshift-image-registry/image-registry-66df7c8f76-mkx5h" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.675593 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cznl\" (UniqueName: \"kubernetes.io/projected/0c7379cd-5663-4416-b263-898f6a9ef954-kube-api-access-6cznl\") pod \"image-registry-66df7c8f76-mkx5h\" (UID: \"0c7379cd-5663-4416-b263-898f6a9ef954\") " pod="openshift-image-registry/image-registry-66df7c8f76-mkx5h" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.685331 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm4zq\" (UniqueName: \"kubernetes.io/projected/eaf0b215-2c7f-4cf7-9682-983acfa5ccb3-kube-api-access-wm4zq\") pod \"marketplace-operator-79b997595-b9c2g\" (UID: \"eaf0b215-2c7f-4cf7-9682-983acfa5ccb3\") " pod="openshift-marketplace/marketplace-operator-79b997595-b9c2g" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.692275 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-b9c2g" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.699254 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-mkx5h" Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.965196 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-b9c2g"] Feb 02 10:43:57 crc kubenswrapper[4901]: I0202 10:43:57.998738 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mkx5h"] Feb 02 10:43:58 crc kubenswrapper[4901]: W0202 10:43:58.008104 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c7379cd_5663_4416_b263_898f6a9ef954.slice/crio-e2aea7ed52b529d9b0c44a0802199f244285e43d828d16cc846088a03cbdbe20 WatchSource:0}: Error finding container e2aea7ed52b529d9b0c44a0802199f244285e43d828d16cc846088a03cbdbe20: Status 404 returned error can't find the container with id e2aea7ed52b529d9b0c44a0802199f244285e43d828d16cc846088a03cbdbe20 Feb 02 10:43:58 crc kubenswrapper[4901]: I0202 10:43:58.707585 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-mkx5h" event={"ID":"0c7379cd-5663-4416-b263-898f6a9ef954","Type":"ContainerStarted","Data":"7571aa3c00a334c3a8f79b6eca3c66860cdcd7343509079a71107518d46d56b0"} Feb 02 10:43:58 crc kubenswrapper[4901]: I0202 10:43:58.708307 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-mkx5h" event={"ID":"0c7379cd-5663-4416-b263-898f6a9ef954","Type":"ContainerStarted","Data":"e2aea7ed52b529d9b0c44a0802199f244285e43d828d16cc846088a03cbdbe20"} Feb 02 10:43:58 crc kubenswrapper[4901]: I0202 10:43:58.708323 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-mkx5h" Feb 02 10:43:58 crc kubenswrapper[4901]: I0202 10:43:58.710152 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-b9c2g" event={"ID":"eaf0b215-2c7f-4cf7-9682-983acfa5ccb3","Type":"ContainerStarted","Data":"8c23700ea4a831d7474ea1035933092edb2a0b17772f869dfe5dae4bef033a9c"} Feb 02 10:43:58 crc kubenswrapper[4901]: I0202 10:43:58.710193 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-b9c2g" event={"ID":"eaf0b215-2c7f-4cf7-9682-983acfa5ccb3","Type":"ContainerStarted","Data":"680e0cf1f23a1949ef936863f59cb12bd5b62c9f6a1b37eb7779745ae280fb37"} Feb 02 10:43:58 crc kubenswrapper[4901]: I0202 10:43:58.711067 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-b9c2g" Feb 02 10:43:58 crc kubenswrapper[4901]: I0202 10:43:58.716839 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-b9c2g" Feb 02 10:43:58 crc kubenswrapper[4901]: I0202 10:43:58.734741 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-mkx5h" podStartSLOduration=1.7347177889999998 podStartE2EDuration="1.734717789s" podCreationTimestamp="2026-02-02 10:43:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:43:58.730794016 +0000 UTC m=+325.749134112" watchObservedRunningTime="2026-02-02 10:43:58.734717789 +0000 UTC m=+325.753057895" Feb 02 10:43:58 crc 
Feb 02 10:43:58 crc kubenswrapper[4901]: I0202 10:43:58.746475 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-b9c2g" podStartSLOduration=1.7464552580000001 podStartE2EDuration="1.746455258s" podCreationTimestamp="2026-02-02 10:43:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:43:58.745127307 +0000 UTC m=+325.763467423" watchObservedRunningTime="2026-02-02 10:43:58.746455258 +0000 UTC m=+325.764795354"
Feb 02 10:44:17 crc kubenswrapper[4901]: I0202 10:44:17.709836 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-mkx5h"
Feb 02 10:44:17 crc kubenswrapper[4901]: I0202 10:44:17.790612 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xvfbq"]
Feb 02 10:44:37 crc kubenswrapper[4901]: I0202 10:44:37.837190 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 02 10:44:37 crc kubenswrapper[4901]: I0202 10:44:37.838070 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 02 10:44:37 crc kubenswrapper[4901]: I0202 10:44:37.977201 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-g7pnc"]
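
The two probe entries above (patch_prober.go, prober.go) record a liveness HTTP check against 127.0.0.1:8798/health being refused outright, i.e. nothing is listening. The check is easy to reproduce from the node while debugging. A sketch under the assumption that matching kubelet semantics (any status in [200,400) counts as success) is what you want; this is not the kubelet's actual prober, which lives in k8s.io/kubernetes/pkg/probe/http:

```go
// Reproduce, outside the kubelet, the HTTP liveness check the prober entries
// above report on. Endpoint taken from the log output; semantics approximated.
package main

import (
	"fmt"
	"io"
	"net/http"
	"os"
	"time"
)

func main() {
	url := "http://127.0.0.1:8798/health" // from the failure output above
	client := &http.Client{Timeout: 1 * time.Second}
	resp, err := client.Get(url)
	if err != nil {
		// "connect: connection refused" surfaces here, matching probeResult="failure".
		fmt.Fprintf(os.Stderr, "Probe failed: Get %q: %v\n", url, err)
		os.Exit(1)
	}
	defer resp.Body.Close()
	body, _ := io.ReadAll(io.LimitReader(resp.Body, 1024)) // start-of-body, truncated
	if resp.StatusCode >= 200 && resp.StatusCode < 400 {
		fmt.Printf("Probe succeeded: %d %s\n", resp.StatusCode, body)
		return
	}
	fmt.Fprintf(os.Stderr, "Probe failed: status %d %s\n", resp.StatusCode, body)
	os.Exit(1)
}
```

The same failure recurs at 10:45:07 and 10:45:37 further down, exactly 30 s apart, which is the probe period for this container.
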
Need to start a new one" pod="openshift-marketplace/certified-operators-g7pnc" Feb 02 10:44:37 crc kubenswrapper[4901]: I0202 10:44:37.985630 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 02 10:44:37 crc kubenswrapper[4901]: I0202 10:44:37.988917 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g7pnc"] Feb 02 10:44:38 crc kubenswrapper[4901]: I0202 10:44:38.078023 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86b5a97c-571b-418c-af97-26b04dec66ac-utilities\") pod \"certified-operators-g7pnc\" (UID: \"86b5a97c-571b-418c-af97-26b04dec66ac\") " pod="openshift-marketplace/certified-operators-g7pnc" Feb 02 10:44:38 crc kubenswrapper[4901]: I0202 10:44:38.078077 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc7zm\" (UniqueName: \"kubernetes.io/projected/86b5a97c-571b-418c-af97-26b04dec66ac-kube-api-access-vc7zm\") pod \"certified-operators-g7pnc\" (UID: \"86b5a97c-571b-418c-af97-26b04dec66ac\") " pod="openshift-marketplace/certified-operators-g7pnc" Feb 02 10:44:38 crc kubenswrapper[4901]: I0202 10:44:38.078411 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86b5a97c-571b-418c-af97-26b04dec66ac-catalog-content\") pod \"certified-operators-g7pnc\" (UID: \"86b5a97c-571b-418c-af97-26b04dec66ac\") " pod="openshift-marketplace/certified-operators-g7pnc" Feb 02 10:44:38 crc kubenswrapper[4901]: I0202 10:44:38.162168 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j4fbj"] Feb 02 10:44:38 crc kubenswrapper[4901]: I0202 10:44:38.165208 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j4fbj" Feb 02 10:44:38 crc kubenswrapper[4901]: I0202 10:44:38.169603 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 02 10:44:38 crc kubenswrapper[4901]: I0202 10:44:38.173158 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j4fbj"] Feb 02 10:44:38 crc kubenswrapper[4901]: I0202 10:44:38.180214 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86b5a97c-571b-418c-af97-26b04dec66ac-catalog-content\") pod \"certified-operators-g7pnc\" (UID: \"86b5a97c-571b-418c-af97-26b04dec66ac\") " pod="openshift-marketplace/certified-operators-g7pnc" Feb 02 10:44:38 crc kubenswrapper[4901]: I0202 10:44:38.180254 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86b5a97c-571b-418c-af97-26b04dec66ac-utilities\") pod \"certified-operators-g7pnc\" (UID: \"86b5a97c-571b-418c-af97-26b04dec66ac\") " pod="openshift-marketplace/certified-operators-g7pnc" Feb 02 10:44:38 crc kubenswrapper[4901]: I0202 10:44:38.180277 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc7zm\" (UniqueName: \"kubernetes.io/projected/86b5a97c-571b-418c-af97-26b04dec66ac-kube-api-access-vc7zm\") pod \"certified-operators-g7pnc\" (UID: \"86b5a97c-571b-418c-af97-26b04dec66ac\") " pod="openshift-marketplace/certified-operators-g7pnc" Feb 02 10:44:38 crc kubenswrapper[4901]: I0202 10:44:38.180753 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86b5a97c-571b-418c-af97-26b04dec66ac-catalog-content\") pod \"certified-operators-g7pnc\" (UID: \"86b5a97c-571b-418c-af97-26b04dec66ac\") " pod="openshift-marketplace/certified-operators-g7pnc" Feb 02 10:44:38 crc kubenswrapper[4901]: I0202 10:44:38.180831 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86b5a97c-571b-418c-af97-26b04dec66ac-utilities\") pod \"certified-operators-g7pnc\" (UID: \"86b5a97c-571b-418c-af97-26b04dec66ac\") " pod="openshift-marketplace/certified-operators-g7pnc" Feb 02 10:44:38 crc kubenswrapper[4901]: I0202 10:44:38.220731 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc7zm\" (UniqueName: \"kubernetes.io/projected/86b5a97c-571b-418c-af97-26b04dec66ac-kube-api-access-vc7zm\") pod \"certified-operators-g7pnc\" (UID: \"86b5a97c-571b-418c-af97-26b04dec66ac\") " pod="openshift-marketplace/certified-operators-g7pnc" Feb 02 10:44:38 crc kubenswrapper[4901]: I0202 10:44:38.282395 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3aac041-08a3-4d4e-aada-69ada2387b41-catalog-content\") pod \"redhat-marketplace-j4fbj\" (UID: \"a3aac041-08a3-4d4e-aada-69ada2387b41\") " pod="openshift-marketplace/redhat-marketplace-j4fbj" Feb 02 10:44:38 crc kubenswrapper[4901]: I0202 10:44:38.282466 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3aac041-08a3-4d4e-aada-69ada2387b41-utilities\") pod \"redhat-marketplace-j4fbj\" (UID: 
\"a3aac041-08a3-4d4e-aada-69ada2387b41\") " pod="openshift-marketplace/redhat-marketplace-j4fbj" Feb 02 10:44:38 crc kubenswrapper[4901]: I0202 10:44:38.282490 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbz88\" (UniqueName: \"kubernetes.io/projected/a3aac041-08a3-4d4e-aada-69ada2387b41-kube-api-access-tbz88\") pod \"redhat-marketplace-j4fbj\" (UID: \"a3aac041-08a3-4d4e-aada-69ada2387b41\") " pod="openshift-marketplace/redhat-marketplace-j4fbj" Feb 02 10:44:38 crc kubenswrapper[4901]: I0202 10:44:38.305052 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g7pnc" Feb 02 10:44:38 crc kubenswrapper[4901]: I0202 10:44:38.384268 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3aac041-08a3-4d4e-aada-69ada2387b41-catalog-content\") pod \"redhat-marketplace-j4fbj\" (UID: \"a3aac041-08a3-4d4e-aada-69ada2387b41\") " pod="openshift-marketplace/redhat-marketplace-j4fbj" Feb 02 10:44:38 crc kubenswrapper[4901]: I0202 10:44:38.384365 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3aac041-08a3-4d4e-aada-69ada2387b41-utilities\") pod \"redhat-marketplace-j4fbj\" (UID: \"a3aac041-08a3-4d4e-aada-69ada2387b41\") " pod="openshift-marketplace/redhat-marketplace-j4fbj" Feb 02 10:44:38 crc kubenswrapper[4901]: I0202 10:44:38.384416 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbz88\" (UniqueName: \"kubernetes.io/projected/a3aac041-08a3-4d4e-aada-69ada2387b41-kube-api-access-tbz88\") pod \"redhat-marketplace-j4fbj\" (UID: \"a3aac041-08a3-4d4e-aada-69ada2387b41\") " pod="openshift-marketplace/redhat-marketplace-j4fbj" Feb 02 10:44:38 crc kubenswrapper[4901]: I0202 10:44:38.384848 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3aac041-08a3-4d4e-aada-69ada2387b41-catalog-content\") pod \"redhat-marketplace-j4fbj\" (UID: \"a3aac041-08a3-4d4e-aada-69ada2387b41\") " pod="openshift-marketplace/redhat-marketplace-j4fbj" Feb 02 10:44:38 crc kubenswrapper[4901]: I0202 10:44:38.385249 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3aac041-08a3-4d4e-aada-69ada2387b41-utilities\") pod \"redhat-marketplace-j4fbj\" (UID: \"a3aac041-08a3-4d4e-aada-69ada2387b41\") " pod="openshift-marketplace/redhat-marketplace-j4fbj" Feb 02 10:44:38 crc kubenswrapper[4901]: I0202 10:44:38.420610 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbz88\" (UniqueName: \"kubernetes.io/projected/a3aac041-08a3-4d4e-aada-69ada2387b41-kube-api-access-tbz88\") pod \"redhat-marketplace-j4fbj\" (UID: \"a3aac041-08a3-4d4e-aada-69ada2387b41\") " pod="openshift-marketplace/redhat-marketplace-j4fbj" Feb 02 10:44:38 crc kubenswrapper[4901]: I0202 10:44:38.504886 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j4fbj" Feb 02 10:44:38 crc kubenswrapper[4901]: I0202 10:44:38.675827 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j4fbj"] Feb 02 10:44:38 crc kubenswrapper[4901]: W0202 10:44:38.679532 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3aac041_08a3_4d4e_aada_69ada2387b41.slice/crio-74ab12dc98a7e9b794a683dd9e0752f2ed24edf3476ec102ab5fb5153e23686e WatchSource:0}: Error finding container 74ab12dc98a7e9b794a683dd9e0752f2ed24edf3476ec102ab5fb5153e23686e: Status 404 returned error can't find the container with id 74ab12dc98a7e9b794a683dd9e0752f2ed24edf3476ec102ab5fb5153e23686e Feb 02 10:44:38 crc kubenswrapper[4901]: I0202 10:44:38.722147 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g7pnc"] Feb 02 10:44:38 crc kubenswrapper[4901]: W0202 10:44:38.735611 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86b5a97c_571b_418c_af97_26b04dec66ac.slice/crio-6db0573d52adc8a542031ebbd6509ca30e246519b6c7bbb4c7760e1bf1b205b5 WatchSource:0}: Error finding container 6db0573d52adc8a542031ebbd6509ca30e246519b6c7bbb4c7760e1bf1b205b5: Status 404 returned error can't find the container with id 6db0573d52adc8a542031ebbd6509ca30e246519b6c7bbb4c7760e1bf1b205b5 Feb 02 10:44:38 crc kubenswrapper[4901]: I0202 10:44:38.971879 4901 generic.go:334] "Generic (PLEG): container finished" podID="86b5a97c-571b-418c-af97-26b04dec66ac" containerID="fea2c28f357c5c6f878f2bac0bd9f27dfd959765903f19b2d186ddb7f4103821" exitCode=0 Feb 02 10:44:38 crc kubenswrapper[4901]: I0202 10:44:38.972026 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g7pnc" event={"ID":"86b5a97c-571b-418c-af97-26b04dec66ac","Type":"ContainerDied","Data":"fea2c28f357c5c6f878f2bac0bd9f27dfd959765903f19b2d186ddb7f4103821"} Feb 02 10:44:38 crc kubenswrapper[4901]: I0202 10:44:38.972518 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g7pnc" event={"ID":"86b5a97c-571b-418c-af97-26b04dec66ac","Type":"ContainerStarted","Data":"6db0573d52adc8a542031ebbd6509ca30e246519b6c7bbb4c7760e1bf1b205b5"} Feb 02 10:44:38 crc kubenswrapper[4901]: I0202 10:44:38.975791 4901 generic.go:334] "Generic (PLEG): container finished" podID="a3aac041-08a3-4d4e-aada-69ada2387b41" containerID="17fe71f77385744c4079e239cc982de8930d16f41a69c4cccc49902c5a730a26" exitCode=0 Feb 02 10:44:38 crc kubenswrapper[4901]: I0202 10:44:38.975844 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j4fbj" event={"ID":"a3aac041-08a3-4d4e-aada-69ada2387b41","Type":"ContainerDied","Data":"17fe71f77385744c4079e239cc982de8930d16f41a69c4cccc49902c5a730a26"} Feb 02 10:44:38 crc kubenswrapper[4901]: I0202 10:44:38.975873 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j4fbj" event={"ID":"a3aac041-08a3-4d4e-aada-69ada2387b41","Type":"ContainerStarted","Data":"74ab12dc98a7e9b794a683dd9e0752f2ed24edf3476ec102ab5fb5153e23686e"} Feb 02 10:44:39 crc kubenswrapper[4901]: I0202 10:44:39.759554 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-95qfs"] Feb 02 10:44:39 crc kubenswrapper[4901]: I0202 10:44:39.761751 4901 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-95qfs" Feb 02 10:44:39 crc kubenswrapper[4901]: I0202 10:44:39.765676 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 02 10:44:39 crc kubenswrapper[4901]: I0202 10:44:39.795276 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-95qfs"] Feb 02 10:44:39 crc kubenswrapper[4901]: I0202 10:44:39.907633 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcptj\" (UniqueName: \"kubernetes.io/projected/7a5d5ce8-6c11-4a6a-8e35-1b4809458b8e-kube-api-access-zcptj\") pod \"redhat-operators-95qfs\" (UID: \"7a5d5ce8-6c11-4a6a-8e35-1b4809458b8e\") " pod="openshift-marketplace/redhat-operators-95qfs" Feb 02 10:44:39 crc kubenswrapper[4901]: I0202 10:44:39.907843 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a5d5ce8-6c11-4a6a-8e35-1b4809458b8e-catalog-content\") pod \"redhat-operators-95qfs\" (UID: \"7a5d5ce8-6c11-4a6a-8e35-1b4809458b8e\") " pod="openshift-marketplace/redhat-operators-95qfs" Feb 02 10:44:39 crc kubenswrapper[4901]: I0202 10:44:39.907960 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a5d5ce8-6c11-4a6a-8e35-1b4809458b8e-utilities\") pod \"redhat-operators-95qfs\" (UID: \"7a5d5ce8-6c11-4a6a-8e35-1b4809458b8e\") " pod="openshift-marketplace/redhat-operators-95qfs" Feb 02 10:44:39 crc kubenswrapper[4901]: I0202 10:44:39.983174 4901 generic.go:334] "Generic (PLEG): container finished" podID="a3aac041-08a3-4d4e-aada-69ada2387b41" containerID="e90ffabcee6daae079e4fe02d2e922b19f3e911c8be0d19eca26541b2a911163" exitCode=0 Feb 02 10:44:39 crc kubenswrapper[4901]: I0202 10:44:39.983260 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j4fbj" event={"ID":"a3aac041-08a3-4d4e-aada-69ada2387b41","Type":"ContainerDied","Data":"e90ffabcee6daae079e4fe02d2e922b19f3e911c8be0d19eca26541b2a911163"} Feb 02 10:44:39 crc kubenswrapper[4901]: I0202 10:44:39.988745 4901 generic.go:334] "Generic (PLEG): container finished" podID="86b5a97c-571b-418c-af97-26b04dec66ac" containerID="3b030cc5af3018dd51e6f37c563007d216b56472106bf4d7eec4b2040ef63aa2" exitCode=0 Feb 02 10:44:39 crc kubenswrapper[4901]: I0202 10:44:39.988798 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g7pnc" event={"ID":"86b5a97c-571b-418c-af97-26b04dec66ac","Type":"ContainerDied","Data":"3b030cc5af3018dd51e6f37c563007d216b56472106bf4d7eec4b2040ef63aa2"} Feb 02 10:44:40 crc kubenswrapper[4901]: I0202 10:44:40.008539 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a5d5ce8-6c11-4a6a-8e35-1b4809458b8e-catalog-content\") pod \"redhat-operators-95qfs\" (UID: \"7a5d5ce8-6c11-4a6a-8e35-1b4809458b8e\") " pod="openshift-marketplace/redhat-operators-95qfs" Feb 02 10:44:40 crc kubenswrapper[4901]: I0202 10:44:40.008627 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a5d5ce8-6c11-4a6a-8e35-1b4809458b8e-utilities\") pod \"redhat-operators-95qfs\" (UID: 
\"7a5d5ce8-6c11-4a6a-8e35-1b4809458b8e\") " pod="openshift-marketplace/redhat-operators-95qfs" Feb 02 10:44:40 crc kubenswrapper[4901]: I0202 10:44:40.008654 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcptj\" (UniqueName: \"kubernetes.io/projected/7a5d5ce8-6c11-4a6a-8e35-1b4809458b8e-kube-api-access-zcptj\") pod \"redhat-operators-95qfs\" (UID: \"7a5d5ce8-6c11-4a6a-8e35-1b4809458b8e\") " pod="openshift-marketplace/redhat-operators-95qfs" Feb 02 10:44:40 crc kubenswrapper[4901]: I0202 10:44:40.009092 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a5d5ce8-6c11-4a6a-8e35-1b4809458b8e-catalog-content\") pod \"redhat-operators-95qfs\" (UID: \"7a5d5ce8-6c11-4a6a-8e35-1b4809458b8e\") " pod="openshift-marketplace/redhat-operators-95qfs" Feb 02 10:44:40 crc kubenswrapper[4901]: I0202 10:44:40.009124 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a5d5ce8-6c11-4a6a-8e35-1b4809458b8e-utilities\") pod \"redhat-operators-95qfs\" (UID: \"7a5d5ce8-6c11-4a6a-8e35-1b4809458b8e\") " pod="openshift-marketplace/redhat-operators-95qfs" Feb 02 10:44:40 crc kubenswrapper[4901]: I0202 10:44:40.028867 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcptj\" (UniqueName: \"kubernetes.io/projected/7a5d5ce8-6c11-4a6a-8e35-1b4809458b8e-kube-api-access-zcptj\") pod \"redhat-operators-95qfs\" (UID: \"7a5d5ce8-6c11-4a6a-8e35-1b4809458b8e\") " pod="openshift-marketplace/redhat-operators-95qfs" Feb 02 10:44:40 crc kubenswrapper[4901]: I0202 10:44:40.095922 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-95qfs" Feb 02 10:44:40 crc kubenswrapper[4901]: I0202 10:44:40.313046 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-95qfs"] Feb 02 10:44:40 crc kubenswrapper[4901]: W0202 10:44:40.322292 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a5d5ce8_6c11_4a6a_8e35_1b4809458b8e.slice/crio-8e7782cd930248340fc01d4701fea96fffa102491d11defe2fb3d99c036443c2 WatchSource:0}: Error finding container 8e7782cd930248340fc01d4701fea96fffa102491d11defe2fb3d99c036443c2: Status 404 returned error can't find the container with id 8e7782cd930248340fc01d4701fea96fffa102491d11defe2fb3d99c036443c2 Feb 02 10:44:40 crc kubenswrapper[4901]: I0202 10:44:40.756755 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nmgnj"] Feb 02 10:44:40 crc kubenswrapper[4901]: I0202 10:44:40.758440 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nmgnj" Feb 02 10:44:40 crc kubenswrapper[4901]: I0202 10:44:40.760971 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 02 10:44:40 crc kubenswrapper[4901]: I0202 10:44:40.765994 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nmgnj"] Feb 02 10:44:40 crc kubenswrapper[4901]: I0202 10:44:40.920963 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stdch\" (UniqueName: \"kubernetes.io/projected/4d1986be-5828-4d64-9da9-ffe87c0eb7ff-kube-api-access-stdch\") pod \"community-operators-nmgnj\" (UID: \"4d1986be-5828-4d64-9da9-ffe87c0eb7ff\") " pod="openshift-marketplace/community-operators-nmgnj" Feb 02 10:44:40 crc kubenswrapper[4901]: I0202 10:44:40.921160 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d1986be-5828-4d64-9da9-ffe87c0eb7ff-utilities\") pod \"community-operators-nmgnj\" (UID: \"4d1986be-5828-4d64-9da9-ffe87c0eb7ff\") " pod="openshift-marketplace/community-operators-nmgnj" Feb 02 10:44:40 crc kubenswrapper[4901]: I0202 10:44:40.921238 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d1986be-5828-4d64-9da9-ffe87c0eb7ff-catalog-content\") pod \"community-operators-nmgnj\" (UID: \"4d1986be-5828-4d64-9da9-ffe87c0eb7ff\") " pod="openshift-marketplace/community-operators-nmgnj" Feb 02 10:44:40 crc kubenswrapper[4901]: I0202 10:44:40.995515 4901 generic.go:334] "Generic (PLEG): container finished" podID="7a5d5ce8-6c11-4a6a-8e35-1b4809458b8e" containerID="cc74119ad33e28461519f8b97e227d253b62157d54362f645282b37eb0170afa" exitCode=0 Feb 02 10:44:40 crc kubenswrapper[4901]: I0202 10:44:40.995603 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95qfs" event={"ID":"7a5d5ce8-6c11-4a6a-8e35-1b4809458b8e","Type":"ContainerDied","Data":"cc74119ad33e28461519f8b97e227d253b62157d54362f645282b37eb0170afa"} Feb 02 10:44:40 crc kubenswrapper[4901]: I0202 10:44:40.995630 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95qfs" event={"ID":"7a5d5ce8-6c11-4a6a-8e35-1b4809458b8e","Type":"ContainerStarted","Data":"8e7782cd930248340fc01d4701fea96fffa102491d11defe2fb3d99c036443c2"} Feb 02 10:44:41 crc kubenswrapper[4901]: I0202 10:44:41.002728 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j4fbj" event={"ID":"a3aac041-08a3-4d4e-aada-69ada2387b41","Type":"ContainerStarted","Data":"7e84b8c769c97f16f680d7fab7095be9eb39a08fd5606b57ccd18f95f2deffac"} Feb 02 10:44:41 crc kubenswrapper[4901]: I0202 10:44:41.006466 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g7pnc" event={"ID":"86b5a97c-571b-418c-af97-26b04dec66ac","Type":"ContainerStarted","Data":"7682e86716541b30c74afb2d05845853e61ac1d874c06b7fb3661016c8cad4bf"} Feb 02 10:44:41 crc kubenswrapper[4901]: I0202 10:44:41.022423 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stdch\" (UniqueName: \"kubernetes.io/projected/4d1986be-5828-4d64-9da9-ffe87c0eb7ff-kube-api-access-stdch\") pod \"community-operators-nmgnj\" (UID: 
\"4d1986be-5828-4d64-9da9-ffe87c0eb7ff\") " pod="openshift-marketplace/community-operators-nmgnj" Feb 02 10:44:41 crc kubenswrapper[4901]: I0202 10:44:41.022519 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d1986be-5828-4d64-9da9-ffe87c0eb7ff-utilities\") pod \"community-operators-nmgnj\" (UID: \"4d1986be-5828-4d64-9da9-ffe87c0eb7ff\") " pod="openshift-marketplace/community-operators-nmgnj" Feb 02 10:44:41 crc kubenswrapper[4901]: I0202 10:44:41.022582 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d1986be-5828-4d64-9da9-ffe87c0eb7ff-catalog-content\") pod \"community-operators-nmgnj\" (UID: \"4d1986be-5828-4d64-9da9-ffe87c0eb7ff\") " pod="openshift-marketplace/community-operators-nmgnj" Feb 02 10:44:41 crc kubenswrapper[4901]: I0202 10:44:41.023144 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d1986be-5828-4d64-9da9-ffe87c0eb7ff-catalog-content\") pod \"community-operators-nmgnj\" (UID: \"4d1986be-5828-4d64-9da9-ffe87c0eb7ff\") " pod="openshift-marketplace/community-operators-nmgnj" Feb 02 10:44:41 crc kubenswrapper[4901]: I0202 10:44:41.023469 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d1986be-5828-4d64-9da9-ffe87c0eb7ff-utilities\") pod \"community-operators-nmgnj\" (UID: \"4d1986be-5828-4d64-9da9-ffe87c0eb7ff\") " pod="openshift-marketplace/community-operators-nmgnj" Feb 02 10:44:41 crc kubenswrapper[4901]: I0202 10:44:41.057808 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stdch\" (UniqueName: \"kubernetes.io/projected/4d1986be-5828-4d64-9da9-ffe87c0eb7ff-kube-api-access-stdch\") pod \"community-operators-nmgnj\" (UID: \"4d1986be-5828-4d64-9da9-ffe87c0eb7ff\") " pod="openshift-marketplace/community-operators-nmgnj" Feb 02 10:44:41 crc kubenswrapper[4901]: I0202 10:44:41.071758 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-g7pnc" podStartSLOduration=2.552425026 podStartE2EDuration="4.071740805s" podCreationTimestamp="2026-02-02 10:44:37 +0000 UTC" firstStartedPulling="2026-02-02 10:44:38.973507552 +0000 UTC m=+365.991847648" lastFinishedPulling="2026-02-02 10:44:40.492823331 +0000 UTC m=+367.511163427" observedRunningTime="2026-02-02 10:44:41.050370436 +0000 UTC m=+368.068710552" watchObservedRunningTime="2026-02-02 10:44:41.071740805 +0000 UTC m=+368.090080901" Feb 02 10:44:41 crc kubenswrapper[4901]: I0202 10:44:41.072039 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j4fbj" podStartSLOduration=1.680279644 podStartE2EDuration="3.072035842s" podCreationTimestamp="2026-02-02 10:44:38 +0000 UTC" firstStartedPulling="2026-02-02 10:44:38.978203613 +0000 UTC m=+365.996543719" lastFinishedPulling="2026-02-02 10:44:40.369959821 +0000 UTC m=+367.388299917" observedRunningTime="2026-02-02 10:44:41.068330113 +0000 UTC m=+368.086670219" watchObservedRunningTime="2026-02-02 10:44:41.072035842 +0000 UTC m=+368.090375928" Feb 02 10:44:41 crc kubenswrapper[4901]: I0202 10:44:41.073738 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nmgnj" Feb 02 10:44:41 crc kubenswrapper[4901]: I0202 10:44:41.336536 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nmgnj"] Feb 02 10:44:41 crc kubenswrapper[4901]: W0202 10:44:41.344776 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d1986be_5828_4d64_9da9_ffe87c0eb7ff.slice/crio-146b428e1b1222cf40a2d32e76f62d3f6933ecbd967b33dce9c39925b7dfaa82 WatchSource:0}: Error finding container 146b428e1b1222cf40a2d32e76f62d3f6933ecbd967b33dce9c39925b7dfaa82: Status 404 returned error can't find the container with id 146b428e1b1222cf40a2d32e76f62d3f6933ecbd967b33dce9c39925b7dfaa82 Feb 02 10:44:42 crc kubenswrapper[4901]: I0202 10:44:42.013179 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95qfs" event={"ID":"7a5d5ce8-6c11-4a6a-8e35-1b4809458b8e","Type":"ContainerStarted","Data":"4c06825234d0109860c8b68b46e64dc39e240bb4b4dc6d4cec2495e36da3c738"} Feb 02 10:44:42 crc kubenswrapper[4901]: I0202 10:44:42.015163 4901 generic.go:334] "Generic (PLEG): container finished" podID="4d1986be-5828-4d64-9da9-ffe87c0eb7ff" containerID="c1b64180c78f1ac3c9560a9278448f2aaaadc7a542e58e618052d9b7d8d3eb69" exitCode=0 Feb 02 10:44:42 crc kubenswrapper[4901]: I0202 10:44:42.015288 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nmgnj" event={"ID":"4d1986be-5828-4d64-9da9-ffe87c0eb7ff","Type":"ContainerDied","Data":"c1b64180c78f1ac3c9560a9278448f2aaaadc7a542e58e618052d9b7d8d3eb69"} Feb 02 10:44:42 crc kubenswrapper[4901]: I0202 10:44:42.015342 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nmgnj" event={"ID":"4d1986be-5828-4d64-9da9-ffe87c0eb7ff","Type":"ContainerStarted","Data":"146b428e1b1222cf40a2d32e76f62d3f6933ecbd967b33dce9c39925b7dfaa82"} Feb 02 10:44:42 crc kubenswrapper[4901]: I0202 10:44:42.854174 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" podUID="dc8db928-4418-4963-892f-df5413ed2c76" containerName="registry" containerID="cri-o://53d2fc29aac69e45fc296a780716fa5bb9d138c810d11baa32aca32e822b9b5d" gracePeriod=30 Feb 02 10:44:43 crc kubenswrapper[4901]: I0202 10:44:43.026423 4901 generic.go:334] "Generic (PLEG): container finished" podID="dc8db928-4418-4963-892f-df5413ed2c76" containerID="53d2fc29aac69e45fc296a780716fa5bb9d138c810d11baa32aca32e822b9b5d" exitCode=0 Feb 02 10:44:43 crc kubenswrapper[4901]: I0202 10:44:43.026995 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" event={"ID":"dc8db928-4418-4963-892f-df5413ed2c76","Type":"ContainerDied","Data":"53d2fc29aac69e45fc296a780716fa5bb9d138c810d11baa32aca32e822b9b5d"} Feb 02 10:44:43 crc kubenswrapper[4901]: I0202 10:44:43.029712 4901 generic.go:334] "Generic (PLEG): container finished" podID="7a5d5ce8-6c11-4a6a-8e35-1b4809458b8e" containerID="4c06825234d0109860c8b68b46e64dc39e240bb4b4dc6d4cec2495e36da3c738" exitCode=0 Feb 02 10:44:43 crc kubenswrapper[4901]: I0202 10:44:43.029776 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95qfs" event={"ID":"7a5d5ce8-6c11-4a6a-8e35-1b4809458b8e","Type":"ContainerDied","Data":"4c06825234d0109860c8b68b46e64dc39e240bb4b4dc6d4cec2495e36da3c738"} Feb 02 
10:44:43 crc kubenswrapper[4901]: I0202 10:44:43.033731 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nmgnj" event={"ID":"4d1986be-5828-4d64-9da9-ffe87c0eb7ff","Type":"ContainerStarted","Data":"a9dbce043984b3c01da3d607100920f8fdb0686ebffe86782a631efa829ecd79"} Feb 02 10:44:43 crc kubenswrapper[4901]: I0202 10:44:43.271755 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:44:43 crc kubenswrapper[4901]: I0202 10:44:43.455856 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dc8db928-4418-4963-892f-df5413ed2c76-trusted-ca\") pod \"dc8db928-4418-4963-892f-df5413ed2c76\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " Feb 02 10:44:43 crc kubenswrapper[4901]: I0202 10:44:43.456247 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"dc8db928-4418-4963-892f-df5413ed2c76\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " Feb 02 10:44:43 crc kubenswrapper[4901]: I0202 10:44:43.456284 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/dc8db928-4418-4963-892f-df5413ed2c76-ca-trust-extracted\") pod \"dc8db928-4418-4963-892f-df5413ed2c76\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " Feb 02 10:44:43 crc kubenswrapper[4901]: I0202 10:44:43.456368 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/dc8db928-4418-4963-892f-df5413ed2c76-installation-pull-secrets\") pod \"dc8db928-4418-4963-892f-df5413ed2c76\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " Feb 02 10:44:43 crc kubenswrapper[4901]: I0202 10:44:43.456394 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqzrc\" (UniqueName: \"kubernetes.io/projected/dc8db928-4418-4963-892f-df5413ed2c76-kube-api-access-lqzrc\") pod \"dc8db928-4418-4963-892f-df5413ed2c76\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " Feb 02 10:44:43 crc kubenswrapper[4901]: I0202 10:44:43.456419 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/dc8db928-4418-4963-892f-df5413ed2c76-registry-tls\") pod \"dc8db928-4418-4963-892f-df5413ed2c76\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " Feb 02 10:44:43 crc kubenswrapper[4901]: I0202 10:44:43.456459 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dc8db928-4418-4963-892f-df5413ed2c76-bound-sa-token\") pod \"dc8db928-4418-4963-892f-df5413ed2c76\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " Feb 02 10:44:43 crc kubenswrapper[4901]: I0202 10:44:43.456497 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/dc8db928-4418-4963-892f-df5413ed2c76-registry-certificates\") pod \"dc8db928-4418-4963-892f-df5413ed2c76\" (UID: \"dc8db928-4418-4963-892f-df5413ed2c76\") " Feb 02 10:44:43 crc kubenswrapper[4901]: I0202 10:44:43.457045 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/dc8db928-4418-4963-892f-df5413ed2c76-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "dc8db928-4418-4963-892f-df5413ed2c76" (UID: "dc8db928-4418-4963-892f-df5413ed2c76"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:44:43 crc kubenswrapper[4901]: I0202 10:44:43.457541 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc8db928-4418-4963-892f-df5413ed2c76-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "dc8db928-4418-4963-892f-df5413ed2c76" (UID: "dc8db928-4418-4963-892f-df5413ed2c76"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:44:43 crc kubenswrapper[4901]: I0202 10:44:43.463237 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc8db928-4418-4963-892f-df5413ed2c76-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "dc8db928-4418-4963-892f-df5413ed2c76" (UID: "dc8db928-4418-4963-892f-df5413ed2c76"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:44:43 crc kubenswrapper[4901]: I0202 10:44:43.463851 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc8db928-4418-4963-892f-df5413ed2c76-kube-api-access-lqzrc" (OuterVolumeSpecName: "kube-api-access-lqzrc") pod "dc8db928-4418-4963-892f-df5413ed2c76" (UID: "dc8db928-4418-4963-892f-df5413ed2c76"). InnerVolumeSpecName "kube-api-access-lqzrc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:44:43 crc kubenswrapper[4901]: I0202 10:44:43.465778 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc8db928-4418-4963-892f-df5413ed2c76-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "dc8db928-4418-4963-892f-df5413ed2c76" (UID: "dc8db928-4418-4963-892f-df5413ed2c76"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:44:43 crc kubenswrapper[4901]: I0202 10:44:43.472770 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc8db928-4418-4963-892f-df5413ed2c76-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "dc8db928-4418-4963-892f-df5413ed2c76" (UID: "dc8db928-4418-4963-892f-df5413ed2c76"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:44:43 crc kubenswrapper[4901]: I0202 10:44:43.480486 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "dc8db928-4418-4963-892f-df5413ed2c76" (UID: "dc8db928-4418-4963-892f-df5413ed2c76"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 02 10:44:43 crc kubenswrapper[4901]: I0202 10:44:43.482227 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc8db928-4418-4963-892f-df5413ed2c76-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "dc8db928-4418-4963-892f-df5413ed2c76" (UID: "dc8db928-4418-4963-892f-df5413ed2c76"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:44:43 crc kubenswrapper[4901]: I0202 10:44:43.557764 4901 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/dc8db928-4418-4963-892f-df5413ed2c76-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 02 10:44:43 crc kubenswrapper[4901]: I0202 10:44:43.557787 4901 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/dc8db928-4418-4963-892f-df5413ed2c76-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 02 10:44:43 crc kubenswrapper[4901]: I0202 10:44:43.557799 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqzrc\" (UniqueName: \"kubernetes.io/projected/dc8db928-4418-4963-892f-df5413ed2c76-kube-api-access-lqzrc\") on node \"crc\" DevicePath \"\"" Feb 02 10:44:43 crc kubenswrapper[4901]: I0202 10:44:43.557809 4901 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/dc8db928-4418-4963-892f-df5413ed2c76-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 02 10:44:43 crc kubenswrapper[4901]: I0202 10:44:43.557818 4901 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dc8db928-4418-4963-892f-df5413ed2c76-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 02 10:44:43 crc kubenswrapper[4901]: I0202 10:44:43.557827 4901 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/dc8db928-4418-4963-892f-df5413ed2c76-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 02 10:44:43 crc kubenswrapper[4901]: I0202 10:44:43.557834 4901 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dc8db928-4418-4963-892f-df5413ed2c76-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:44:44 crc kubenswrapper[4901]: I0202 10:44:44.040327 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" Feb 02 10:44:44 crc kubenswrapper[4901]: I0202 10:44:44.040332 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-xvfbq" event={"ID":"dc8db928-4418-4963-892f-df5413ed2c76","Type":"ContainerDied","Data":"1574dbcf9e692cc765c3c2f83b2b4d9650fd9cd1394c09bd56a58ae679a1f9a6"} Feb 02 10:44:44 crc kubenswrapper[4901]: I0202 10:44:44.040939 4901 scope.go:117] "RemoveContainer" containerID="53d2fc29aac69e45fc296a780716fa5bb9d138c810d11baa32aca32e822b9b5d" Feb 02 10:44:44 crc kubenswrapper[4901]: I0202 10:44:44.043957 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95qfs" event={"ID":"7a5d5ce8-6c11-4a6a-8e35-1b4809458b8e","Type":"ContainerStarted","Data":"80f2fadab28160b368c396891a22383e59652aafd49f15c3644593f4f6a7f37e"} Feb 02 10:44:44 crc kubenswrapper[4901]: I0202 10:44:44.050183 4901 generic.go:334] "Generic (PLEG): container finished" podID="4d1986be-5828-4d64-9da9-ffe87c0eb7ff" containerID="a9dbce043984b3c01da3d607100920f8fdb0686ebffe86782a631efa829ecd79" exitCode=0 Feb 02 10:44:44 crc kubenswrapper[4901]: I0202 10:44:44.050231 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nmgnj" event={"ID":"4d1986be-5828-4d64-9da9-ffe87c0eb7ff","Type":"ContainerDied","Data":"a9dbce043984b3c01da3d607100920f8fdb0686ebffe86782a631efa829ecd79"} Feb 02 10:44:44 crc kubenswrapper[4901]: I0202 10:44:44.065002 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xvfbq"] Feb 02 10:44:44 crc kubenswrapper[4901]: I0202 10:44:44.073328 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xvfbq"] Feb 02 10:44:44 crc kubenswrapper[4901]: I0202 10:44:44.100160 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-95qfs" podStartSLOduration=2.60155721 podStartE2EDuration="5.10013074s" podCreationTimestamp="2026-02-02 10:44:39 +0000 UTC" firstStartedPulling="2026-02-02 10:44:40.997161611 +0000 UTC m=+368.015501707" lastFinishedPulling="2026-02-02 10:44:43.495735131 +0000 UTC m=+370.514075237" observedRunningTime="2026-02-02 10:44:44.091511574 +0000 UTC m=+371.109851690" watchObservedRunningTime="2026-02-02 10:44:44.10013074 +0000 UTC m=+371.118470836" Feb 02 10:44:45 crc kubenswrapper[4901]: I0202 10:44:45.060359 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nmgnj" event={"ID":"4d1986be-5828-4d64-9da9-ffe87c0eb7ff","Type":"ContainerStarted","Data":"c42779247e7b0130063dfaffc848e51b9aa09fc855117a1aab2e70aaf04c2e85"} Feb 02 10:44:45 crc kubenswrapper[4901]: I0202 10:44:45.082933 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nmgnj" podStartSLOduration=2.417802255 podStartE2EDuration="5.082901614s" podCreationTimestamp="2026-02-02 10:44:40 +0000 UTC" firstStartedPulling="2026-02-02 10:44:42.017904028 +0000 UTC m=+369.036244124" lastFinishedPulling="2026-02-02 10:44:44.683003387 +0000 UTC m=+371.701343483" observedRunningTime="2026-02-02 10:44:45.078253353 +0000 UTC m=+372.096593449" watchObservedRunningTime="2026-02-02 10:44:45.082901614 +0000 UTC m=+372.101241710" Feb 02 10:44:45 crc kubenswrapper[4901]: I0202 10:44:45.691332 4901 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="dc8db928-4418-4963-892f-df5413ed2c76" path="/var/lib/kubelet/pods/dc8db928-4418-4963-892f-df5413ed2c76/volumes" Feb 02 10:44:48 crc kubenswrapper[4901]: I0202 10:44:48.305300 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-g7pnc" Feb 02 10:44:48 crc kubenswrapper[4901]: I0202 10:44:48.308240 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-g7pnc" Feb 02 10:44:48 crc kubenswrapper[4901]: I0202 10:44:48.357724 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-g7pnc" Feb 02 10:44:48 crc kubenswrapper[4901]: I0202 10:44:48.505712 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j4fbj" Feb 02 10:44:48 crc kubenswrapper[4901]: I0202 10:44:48.506132 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j4fbj" Feb 02 10:44:48 crc kubenswrapper[4901]: I0202 10:44:48.548932 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j4fbj" Feb 02 10:44:49 crc kubenswrapper[4901]: I0202 10:44:49.129295 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-g7pnc" Feb 02 10:44:49 crc kubenswrapper[4901]: I0202 10:44:49.145557 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j4fbj" Feb 02 10:44:50 crc kubenswrapper[4901]: I0202 10:44:50.096929 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-95qfs" Feb 02 10:44:50 crc kubenswrapper[4901]: I0202 10:44:50.096994 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-95qfs" Feb 02 10:44:50 crc kubenswrapper[4901]: I0202 10:44:50.138218 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-95qfs" Feb 02 10:44:51 crc kubenswrapper[4901]: I0202 10:44:51.074019 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nmgnj" Feb 02 10:44:51 crc kubenswrapper[4901]: I0202 10:44:51.074232 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nmgnj" Feb 02 10:44:51 crc kubenswrapper[4901]: I0202 10:44:51.120180 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nmgnj" Feb 02 10:44:51 crc kubenswrapper[4901]: I0202 10:44:51.131800 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-95qfs" Feb 02 10:44:52 crc kubenswrapper[4901]: I0202 10:44:52.147387 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nmgnj" Feb 02 10:45:00 crc kubenswrapper[4901]: I0202 10:45:00.161082 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500485-4vf9s"] Feb 02 10:45:00 crc kubenswrapper[4901]: E0202 10:45:00.161940 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc8db928-4418-4963-892f-df5413ed2c76" containerName="registry" 
Feb 02 10:45:00 crc kubenswrapper[4901]: I0202 10:45:00.161952 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc8db928-4418-4963-892f-df5413ed2c76" containerName="registry" Feb 02 10:45:00 crc kubenswrapper[4901]: I0202 10:45:00.162053 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc8db928-4418-4963-892f-df5413ed2c76" containerName="registry" Feb 02 10:45:00 crc kubenswrapper[4901]: I0202 10:45:00.162495 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-4vf9s" Feb 02 10:45:00 crc kubenswrapper[4901]: I0202 10:45:00.165795 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 02 10:45:00 crc kubenswrapper[4901]: I0202 10:45:00.165894 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 02 10:45:00 crc kubenswrapper[4901]: I0202 10:45:00.173300 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500485-4vf9s"] Feb 02 10:45:00 crc kubenswrapper[4901]: I0202 10:45:00.298599 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/36a3c890-d6dc-4c8f-bcca-e0cc0461ea18-secret-volume\") pod \"collect-profiles-29500485-4vf9s\" (UID: \"36a3c890-d6dc-4c8f-bcca-e0cc0461ea18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-4vf9s" Feb 02 10:45:00 crc kubenswrapper[4901]: I0202 10:45:00.298686 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36a3c890-d6dc-4c8f-bcca-e0cc0461ea18-config-volume\") pod \"collect-profiles-29500485-4vf9s\" (UID: \"36a3c890-d6dc-4c8f-bcca-e0cc0461ea18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-4vf9s" Feb 02 10:45:00 crc kubenswrapper[4901]: I0202 10:45:00.298782 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9pdb\" (UniqueName: \"kubernetes.io/projected/36a3c890-d6dc-4c8f-bcca-e0cc0461ea18-kube-api-access-w9pdb\") pod \"collect-profiles-29500485-4vf9s\" (UID: \"36a3c890-d6dc-4c8f-bcca-e0cc0461ea18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-4vf9s" Feb 02 10:45:00 crc kubenswrapper[4901]: I0202 10:45:00.400123 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/36a3c890-d6dc-4c8f-bcca-e0cc0461ea18-secret-volume\") pod \"collect-profiles-29500485-4vf9s\" (UID: \"36a3c890-d6dc-4c8f-bcca-e0cc0461ea18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-4vf9s" Feb 02 10:45:00 crc kubenswrapper[4901]: I0202 10:45:00.400175 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36a3c890-d6dc-4c8f-bcca-e0cc0461ea18-config-volume\") pod \"collect-profiles-29500485-4vf9s\" (UID: \"36a3c890-d6dc-4c8f-bcca-e0cc0461ea18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-4vf9s" Feb 02 10:45:00 crc kubenswrapper[4901]: I0202 10:45:00.400244 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9pdb\" (UniqueName: 
\"kubernetes.io/projected/36a3c890-d6dc-4c8f-bcca-e0cc0461ea18-kube-api-access-w9pdb\") pod \"collect-profiles-29500485-4vf9s\" (UID: \"36a3c890-d6dc-4c8f-bcca-e0cc0461ea18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-4vf9s" Feb 02 10:45:00 crc kubenswrapper[4901]: I0202 10:45:00.401888 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36a3c890-d6dc-4c8f-bcca-e0cc0461ea18-config-volume\") pod \"collect-profiles-29500485-4vf9s\" (UID: \"36a3c890-d6dc-4c8f-bcca-e0cc0461ea18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-4vf9s" Feb 02 10:45:00 crc kubenswrapper[4901]: I0202 10:45:00.409126 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/36a3c890-d6dc-4c8f-bcca-e0cc0461ea18-secret-volume\") pod \"collect-profiles-29500485-4vf9s\" (UID: \"36a3c890-d6dc-4c8f-bcca-e0cc0461ea18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-4vf9s" Feb 02 10:45:00 crc kubenswrapper[4901]: I0202 10:45:00.418450 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9pdb\" (UniqueName: \"kubernetes.io/projected/36a3c890-d6dc-4c8f-bcca-e0cc0461ea18-kube-api-access-w9pdb\") pod \"collect-profiles-29500485-4vf9s\" (UID: \"36a3c890-d6dc-4c8f-bcca-e0cc0461ea18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-4vf9s" Feb 02 10:45:00 crc kubenswrapper[4901]: I0202 10:45:00.534020 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-4vf9s" Feb 02 10:45:00 crc kubenswrapper[4901]: I0202 10:45:00.735249 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500485-4vf9s"] Feb 02 10:45:01 crc kubenswrapper[4901]: I0202 10:45:01.152549 4901 generic.go:334] "Generic (PLEG): container finished" podID="36a3c890-d6dc-4c8f-bcca-e0cc0461ea18" containerID="afe07750fce0a9d7dac4de0d58069f2e8ad39c24ac336a19f72ff5b8b0cdacef" exitCode=0 Feb 02 10:45:01 crc kubenswrapper[4901]: I0202 10:45:01.152648 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-4vf9s" event={"ID":"36a3c890-d6dc-4c8f-bcca-e0cc0461ea18","Type":"ContainerDied","Data":"afe07750fce0a9d7dac4de0d58069f2e8ad39c24ac336a19f72ff5b8b0cdacef"} Feb 02 10:45:01 crc kubenswrapper[4901]: I0202 10:45:01.152720 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-4vf9s" event={"ID":"36a3c890-d6dc-4c8f-bcca-e0cc0461ea18","Type":"ContainerStarted","Data":"202f287a2c9837d30a63cbf15742ec11b69cdb289b9b567322894a993e0a99fd"} Feb 02 10:45:02 crc kubenswrapper[4901]: I0202 10:45:02.415416 4901 util.go:48] "No ready sandbox for pod can be found. 
Feb 02 10:45:02 crc kubenswrapper[4901]: I0202 10:45:02.529361 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9pdb\" (UniqueName: \"kubernetes.io/projected/36a3c890-d6dc-4c8f-bcca-e0cc0461ea18-kube-api-access-w9pdb\") pod \"36a3c890-d6dc-4c8f-bcca-e0cc0461ea18\" (UID: \"36a3c890-d6dc-4c8f-bcca-e0cc0461ea18\") " Feb 02 10:45:02 crc kubenswrapper[4901]: I0202 10:45:02.529467 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36a3c890-d6dc-4c8f-bcca-e0cc0461ea18-config-volume\") pod \"36a3c890-d6dc-4c8f-bcca-e0cc0461ea18\" (UID: \"36a3c890-d6dc-4c8f-bcca-e0cc0461ea18\") " Feb 02 10:45:02 crc kubenswrapper[4901]: I0202 10:45:02.529603 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/36a3c890-d6dc-4c8f-bcca-e0cc0461ea18-secret-volume\") pod \"36a3c890-d6dc-4c8f-bcca-e0cc0461ea18\" (UID: \"36a3c890-d6dc-4c8f-bcca-e0cc0461ea18\") " Feb 02 10:45:02 crc kubenswrapper[4901]: I0202 10:45:02.530948 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36a3c890-d6dc-4c8f-bcca-e0cc0461ea18-config-volume" (OuterVolumeSpecName: "config-volume") pod "36a3c890-d6dc-4c8f-bcca-e0cc0461ea18" (UID: "36a3c890-d6dc-4c8f-bcca-e0cc0461ea18"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:45:02 crc kubenswrapper[4901]: I0202 10:45:02.543034 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36a3c890-d6dc-4c8f-bcca-e0cc0461ea18-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "36a3c890-d6dc-4c8f-bcca-e0cc0461ea18" (UID: "36a3c890-d6dc-4c8f-bcca-e0cc0461ea18"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:45:02 crc kubenswrapper[4901]: I0202 10:45:02.543124 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36a3c890-d6dc-4c8f-bcca-e0cc0461ea18-kube-api-access-w9pdb" (OuterVolumeSpecName: "kube-api-access-w9pdb") pod "36a3c890-d6dc-4c8f-bcca-e0cc0461ea18" (UID: "36a3c890-d6dc-4c8f-bcca-e0cc0461ea18"). InnerVolumeSpecName "kube-api-access-w9pdb".
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:45:02 crc kubenswrapper[4901]: I0202 10:45:02.631442 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9pdb\" (UniqueName: \"kubernetes.io/projected/36a3c890-d6dc-4c8f-bcca-e0cc0461ea18-kube-api-access-w9pdb\") on node \"crc\" DevicePath \"\"" Feb 02 10:45:02 crc kubenswrapper[4901]: I0202 10:45:02.631521 4901 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36a3c890-d6dc-4c8f-bcca-e0cc0461ea18-config-volume\") on node \"crc\" DevicePath \"\"" Feb 02 10:45:02 crc kubenswrapper[4901]: I0202 10:45:02.631541 4901 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/36a3c890-d6dc-4c8f-bcca-e0cc0461ea18-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 02 10:45:03 crc kubenswrapper[4901]: I0202 10:45:03.178017 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-4vf9s" event={"ID":"36a3c890-d6dc-4c8f-bcca-e0cc0461ea18","Type":"ContainerDied","Data":"202f287a2c9837d30a63cbf15742ec11b69cdb289b9b567322894a993e0a99fd"} Feb 02 10:45:03 crc kubenswrapper[4901]: I0202 10:45:03.178089 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="202f287a2c9837d30a63cbf15742ec11b69cdb289b9b567322894a993e0a99fd" Feb 02 10:45:03 crc kubenswrapper[4901]: I0202 10:45:03.178105 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-4vf9s" Feb 02 10:45:07 crc kubenswrapper[4901]: I0202 10:45:07.837900 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:45:07 crc kubenswrapper[4901]: I0202 10:45:07.838781 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:45:37 crc kubenswrapper[4901]: I0202 10:45:37.837306 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:45:37 crc kubenswrapper[4901]: I0202 10:45:37.838370 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:45:37 crc kubenswrapper[4901]: I0202 10:45:37.838470 4901 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" Feb 02 10:45:37 crc kubenswrapper[4901]: I0202 10:45:37.839781 4901 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"cbeddfab711d3ba3246547960da089ade81e7d31bea07298d153502257c2da4a"} pod="openshift-machine-config-operator/machine-config-daemon-f29d8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 10:45:37 crc kubenswrapper[4901]: I0202 10:45:37.839917 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" containerID="cri-o://cbeddfab711d3ba3246547960da089ade81e7d31bea07298d153502257c2da4a" gracePeriod=600 Feb 02 10:45:38 crc kubenswrapper[4901]: I0202 10:45:38.409160 4901 generic.go:334] "Generic (PLEG): container finished" podID="756c113d-5d5e-424e-bdf5-494b7774def6" containerID="cbeddfab711d3ba3246547960da089ade81e7d31bea07298d153502257c2da4a" exitCode=0 Feb 02 10:45:38 crc kubenswrapper[4901]: I0202 10:45:38.409252 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" event={"ID":"756c113d-5d5e-424e-bdf5-494b7774def6","Type":"ContainerDied","Data":"cbeddfab711d3ba3246547960da089ade81e7d31bea07298d153502257c2da4a"} Feb 02 10:45:38 crc kubenswrapper[4901]: I0202 10:45:38.409779 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" event={"ID":"756c113d-5d5e-424e-bdf5-494b7774def6","Type":"ContainerStarted","Data":"1a36cd6d270a91931568fc1120bc367e89d666f758bf7db75038745e98b3a488"} Feb 02 10:45:38 crc kubenswrapper[4901]: I0202 10:45:38.409824 4901 scope.go:117] "RemoveContainer" containerID="4b772f8291c7ba36ce56d88dfbde16bf9344276419544da4c966fb2bbee6e04d" Feb 02 10:48:07 crc kubenswrapper[4901]: I0202 10:48:07.837587 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:48:07 crc kubenswrapper[4901]: I0202 10:48:07.838154 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:48:37 crc kubenswrapper[4901]: I0202 10:48:37.837436 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:48:37 crc kubenswrapper[4901]: I0202 10:48:37.838203 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:48:51 crc kubenswrapper[4901]: I0202 10:48:51.169725 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-xcbb8"] Feb 02 10:48:51 crc kubenswrapper[4901]: E0202 10:48:51.170549 4901 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="36a3c890-d6dc-4c8f-bcca-e0cc0461ea18" containerName="collect-profiles" Feb 02 10:48:51 crc kubenswrapper[4901]: I0202 10:48:51.170784 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="36a3c890-d6dc-4c8f-bcca-e0cc0461ea18" containerName="collect-profiles" Feb 02 10:48:51 crc kubenswrapper[4901]: I0202 10:48:51.170955 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="36a3c890-d6dc-4c8f-bcca-e0cc0461ea18" containerName="collect-profiles" Feb 02 10:48:51 crc kubenswrapper[4901]: I0202 10:48:51.171813 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-xcbb8" Feb 02 10:48:51 crc kubenswrapper[4901]: I0202 10:48:51.174046 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 02 10:48:51 crc kubenswrapper[4901]: I0202 10:48:51.181460 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 02 10:48:51 crc kubenswrapper[4901]: I0202 10:48:51.181955 4901 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-shrrn" Feb 02 10:48:51 crc kubenswrapper[4901]: I0202 10:48:51.190736 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-xcbb8"] Feb 02 10:48:51 crc kubenswrapper[4901]: I0202 10:48:51.220184 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-724k6"] Feb 02 10:48:51 crc kubenswrapper[4901]: I0202 10:48:51.221241 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-724k6" Feb 02 10:48:51 crc kubenswrapper[4901]: I0202 10:48:51.223103 4901 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-6rnf6" Feb 02 10:48:51 crc kubenswrapper[4901]: I0202 10:48:51.224727 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-qcqhh"] Feb 02 10:48:51 crc kubenswrapper[4901]: I0202 10:48:51.226325 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-qcqhh" Feb 02 10:48:51 crc kubenswrapper[4901]: I0202 10:48:51.228766 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-724k6"] Feb 02 10:48:51 crc kubenswrapper[4901]: I0202 10:48:51.229597 4901 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-bxwjb" Feb 02 10:48:51 crc kubenswrapper[4901]: I0202 10:48:51.239047 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-qcqhh"] Feb 02 10:48:51 crc kubenswrapper[4901]: I0202 10:48:51.356228 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tstb5\" (UniqueName: \"kubernetes.io/projected/69fa051d-cf95-4af2-8d67-990aece23a2c-kube-api-access-tstb5\") pod \"cert-manager-webhook-687f57d79b-qcqhh\" (UID: \"69fa051d-cf95-4af2-8d67-990aece23a2c\") " pod="cert-manager/cert-manager-webhook-687f57d79b-qcqhh" Feb 02 10:48:51 crc kubenswrapper[4901]: I0202 10:48:51.356348 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbqll\" (UniqueName: \"kubernetes.io/projected/535bd180-9d21-44be-be15-dbc0f6fa94cf-kube-api-access-jbqll\") pod \"cert-manager-858654f9db-724k6\" (UID: \"535bd180-9d21-44be-be15-dbc0f6fa94cf\") " pod="cert-manager/cert-manager-858654f9db-724k6" Feb 02 10:48:51 crc kubenswrapper[4901]: I0202 10:48:51.356380 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfdj9\" (UniqueName: \"kubernetes.io/projected/00e3fdef-9614-441e-b027-ad0d29e7f1a8-kube-api-access-mfdj9\") pod \"cert-manager-cainjector-cf98fcc89-xcbb8\" (UID: \"00e3fdef-9614-441e-b027-ad0d29e7f1a8\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-xcbb8" Feb 02 10:48:51 crc kubenswrapper[4901]: I0202 10:48:51.457805 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbqll\" (UniqueName: \"kubernetes.io/projected/535bd180-9d21-44be-be15-dbc0f6fa94cf-kube-api-access-jbqll\") pod \"cert-manager-858654f9db-724k6\" (UID: \"535bd180-9d21-44be-be15-dbc0f6fa94cf\") " pod="cert-manager/cert-manager-858654f9db-724k6" Feb 02 10:48:51 crc kubenswrapper[4901]: I0202 10:48:51.457892 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfdj9\" (UniqueName: \"kubernetes.io/projected/00e3fdef-9614-441e-b027-ad0d29e7f1a8-kube-api-access-mfdj9\") pod \"cert-manager-cainjector-cf98fcc89-xcbb8\" (UID: \"00e3fdef-9614-441e-b027-ad0d29e7f1a8\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-xcbb8" Feb 02 10:48:51 crc kubenswrapper[4901]: I0202 10:48:51.458012 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tstb5\" (UniqueName: \"kubernetes.io/projected/69fa051d-cf95-4af2-8d67-990aece23a2c-kube-api-access-tstb5\") pod \"cert-manager-webhook-687f57d79b-qcqhh\" (UID: \"69fa051d-cf95-4af2-8d67-990aece23a2c\") " pod="cert-manager/cert-manager-webhook-687f57d79b-qcqhh" Feb 02 10:48:51 crc kubenswrapper[4901]: I0202 10:48:51.484973 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tstb5\" (UniqueName: \"kubernetes.io/projected/69fa051d-cf95-4af2-8d67-990aece23a2c-kube-api-access-tstb5\") pod \"cert-manager-webhook-687f57d79b-qcqhh\" (UID: \"69fa051d-cf95-4af2-8d67-990aece23a2c\") " 
pod="cert-manager/cert-manager-webhook-687f57d79b-qcqhh" Feb 02 10:48:51 crc kubenswrapper[4901]: I0202 10:48:51.485233 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfdj9\" (UniqueName: \"kubernetes.io/projected/00e3fdef-9614-441e-b027-ad0d29e7f1a8-kube-api-access-mfdj9\") pod \"cert-manager-cainjector-cf98fcc89-xcbb8\" (UID: \"00e3fdef-9614-441e-b027-ad0d29e7f1a8\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-xcbb8" Feb 02 10:48:51 crc kubenswrapper[4901]: I0202 10:48:51.485348 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbqll\" (UniqueName: \"kubernetes.io/projected/535bd180-9d21-44be-be15-dbc0f6fa94cf-kube-api-access-jbqll\") pod \"cert-manager-858654f9db-724k6\" (UID: \"535bd180-9d21-44be-be15-dbc0f6fa94cf\") " pod="cert-manager/cert-manager-858654f9db-724k6" Feb 02 10:48:51 crc kubenswrapper[4901]: I0202 10:48:51.522045 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-xcbb8" Feb 02 10:48:51 crc kubenswrapper[4901]: I0202 10:48:51.533834 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-724k6" Feb 02 10:48:51 crc kubenswrapper[4901]: I0202 10:48:51.542893 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-qcqhh" Feb 02 10:48:51 crc kubenswrapper[4901]: I0202 10:48:51.733805 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-xcbb8"] Feb 02 10:48:51 crc kubenswrapper[4901]: I0202 10:48:51.747388 4901 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 10:48:51 crc kubenswrapper[4901]: I0202 10:48:51.990479 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-724k6"] Feb 02 10:48:51 crc kubenswrapper[4901]: W0202 10:48:51.994105 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod535bd180_9d21_44be_be15_dbc0f6fa94cf.slice/crio-8bcfc854ccca055938ff864f79e8038c986899bbe452b83dd9c81a916f2ecb95 WatchSource:0}: Error finding container 8bcfc854ccca055938ff864f79e8038c986899bbe452b83dd9c81a916f2ecb95: Status 404 returned error can't find the container with id 8bcfc854ccca055938ff864f79e8038c986899bbe452b83dd9c81a916f2ecb95 Feb 02 10:48:51 crc kubenswrapper[4901]: I0202 10:48:51.997851 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-qcqhh"] Feb 02 10:48:52 crc kubenswrapper[4901]: W0202 10:48:52.003670 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69fa051d_cf95_4af2_8d67_990aece23a2c.slice/crio-da526ee616cb1831b61cf4c4b2caa35836a64134d9f00f233a4a620207508122 WatchSource:0}: Error finding container da526ee616cb1831b61cf4c4b2caa35836a64134d9f00f233a4a620207508122: Status 404 returned error can't find the container with id da526ee616cb1831b61cf4c4b2caa35836a64134d9f00f233a4a620207508122 Feb 02 10:48:52 crc kubenswrapper[4901]: I0202 10:48:52.028838 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-724k6" event={"ID":"535bd180-9d21-44be-be15-dbc0f6fa94cf","Type":"ContainerStarted","Data":"8bcfc854ccca055938ff864f79e8038c986899bbe452b83dd9c81a916f2ecb95"} Feb 02 
10:48:52 crc kubenswrapper[4901]: I0202 10:48:52.030207 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-xcbb8" event={"ID":"00e3fdef-9614-441e-b027-ad0d29e7f1a8","Type":"ContainerStarted","Data":"9043b9aeb09901718d4f74ec57865c9b2d5ae15c0d20dddf007c3ed5e5555e55"} Feb 02 10:48:52 crc kubenswrapper[4901]: I0202 10:48:52.031112 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-qcqhh" event={"ID":"69fa051d-cf95-4af2-8d67-990aece23a2c","Type":"ContainerStarted","Data":"da526ee616cb1831b61cf4c4b2caa35836a64134d9f00f233a4a620207508122"} Feb 02 10:48:55 crc kubenswrapper[4901]: I0202 10:48:55.054387 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-xcbb8" event={"ID":"00e3fdef-9614-441e-b027-ad0d29e7f1a8","Type":"ContainerStarted","Data":"1cfa835e97052f9cd37ef028bd2b73f86d1c41539dc321ebda4d9ab8b4477097"} Feb 02 10:48:57 crc kubenswrapper[4901]: I0202 10:48:57.067258 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-724k6" event={"ID":"535bd180-9d21-44be-be15-dbc0f6fa94cf","Type":"ContainerStarted","Data":"892c1be1ffcc7c18c1d6565140193c21187e1eed1ac0bbc1d3a7c73de7d0c17f"} Feb 02 10:48:57 crc kubenswrapper[4901]: I0202 10:48:57.070365 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-qcqhh" event={"ID":"69fa051d-cf95-4af2-8d67-990aece23a2c","Type":"ContainerStarted","Data":"d79318e930c1bdf307f4aed9c229ec5c2b246bca777a2db0307b3b9e00c77955"} Feb 02 10:48:57 crc kubenswrapper[4901]: I0202 10:48:57.070837 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-qcqhh" Feb 02 10:48:57 crc kubenswrapper[4901]: I0202 10:48:57.082980 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-xcbb8" podStartSLOduration=3.539746837 podStartE2EDuration="6.082956151s" podCreationTimestamp="2026-02-02 10:48:51 +0000 UTC" firstStartedPulling="2026-02-02 10:48:51.747121906 +0000 UTC m=+618.765462002" lastFinishedPulling="2026-02-02 10:48:54.29033122 +0000 UTC m=+621.308671316" observedRunningTime="2026-02-02 10:48:55.078997403 +0000 UTC m=+622.097337499" watchObservedRunningTime="2026-02-02 10:48:57.082956151 +0000 UTC m=+624.101296277" Feb 02 10:48:57 crc kubenswrapper[4901]: I0202 10:48:57.085363 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-724k6" podStartSLOduration=1.838556947 podStartE2EDuration="6.085344198s" podCreationTimestamp="2026-02-02 10:48:51 +0000 UTC" firstStartedPulling="2026-02-02 10:48:51.996905522 +0000 UTC m=+619.015245618" lastFinishedPulling="2026-02-02 10:48:56.243692763 +0000 UTC m=+623.262032869" observedRunningTime="2026-02-02 10:48:57.080903422 +0000 UTC m=+624.099243518" watchObservedRunningTime="2026-02-02 10:48:57.085344198 +0000 UTC m=+624.103684334" Feb 02 10:48:57 crc kubenswrapper[4901]: I0202 10:48:57.103595 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-qcqhh" podStartSLOduration=1.8691809209999999 podStartE2EDuration="6.103551675s" podCreationTimestamp="2026-02-02 10:48:51 +0000 UTC" firstStartedPulling="2026-02-02 10:48:52.005607571 +0000 UTC m=+619.023947667" lastFinishedPulling="2026-02-02 10:48:56.239978315 +0000 UTC m=+623.258318421" observedRunningTime="2026-02-02 10:48:57.103184517 +0000 UTC m=+624.121524663" watchObservedRunningTime="2026-02-02 10:48:57.103551675 +0000 UTC m=+624.121891771"
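Note: the three "Observed pod startup duration" entries are internally consistent: podStartSLOduration equals podStartE2EDuration minus the image-pull window (lastFinishedPulling − firstStartedPulling), i.e. startup latency excluding image pulls. Checking the cainjector entry with the logged monotonic offsets (m=+..., in seconds):

```go
package main

import "fmt"

// Verify podStartSLOduration = podStartE2EDuration - image pull time for
// cert-manager-cainjector, using the m=+ offsets logged above.
func main() {
	firstStartedPulling := 618.765462002
	lastFinishedPulling := 621.308671316
	e2e := 6.082956151 // podStartE2EDuration, seconds

	pull := lastFinishedPulling - firstStartedPulling
	fmt.Printf("pull=%.9fs slo=%.9fs\n", pull, e2e-pull)
	// pull=2.543209314s slo=3.539746837s, matching podStartSLOduration
}
```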
observedRunningTime="2026-02-02 10:48:57.103184517 +0000 UTC m=+624.121524663" watchObservedRunningTime="2026-02-02 10:48:57.103551675 +0000 UTC m=+624.121891771" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.077410 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vm8h5"] Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.078033 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="ovn-controller" containerID="cri-o://3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3" gracePeriod=30 Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.078350 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="sbdb" containerID="cri-o://23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a" gracePeriod=30 Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.078387 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="nbdb" containerID="cri-o://b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81" gracePeriod=30 Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.078419 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="northd" containerID="cri-o://1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79" gracePeriod=30 Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.078449 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d" gracePeriod=30 Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.078477 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="kube-rbac-proxy-node" containerID="cri-o://61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa" gracePeriod=30 Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.078505 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="ovn-acl-logging" containerID="cri-o://18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3" gracePeriod=30 Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.118983 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="ovnkube-controller" containerID="cri-o://853d62254c62e5ca6a20572e291db7725dbe50919ef679111449fb8a40355619" gracePeriod=30 Feb 02 10:49:01 crc kubenswrapper[4901]: E0202 10:49:01.333684 4901 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3390481_846a_4742_9eae_0796b667897f.slice/crio-23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3390481_846a_4742_9eae_0796b667897f.slice/crio-b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3390481_846a_4742_9eae_0796b667897f.slice/crio-conmon-1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79.scope\": RecentStats: unable to find data in memory cache]" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.367936 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vm8h5_a3390481-846a-4742-9eae-0796b667897f/ovnkube-controller/3.log" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.369910 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vm8h5_a3390481-846a-4742-9eae-0796b667897f/ovn-acl-logging/0.log" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.370441 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vm8h5_a3390481-846a-4742-9eae-0796b667897f/ovn-controller/0.log" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.370845 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.408421 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-host-run-ovn-kubernetes\") pod \"a3390481-846a-4742-9eae-0796b667897f\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.408484 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-run-ovn\") pod \"a3390481-846a-4742-9eae-0796b667897f\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.408504 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-log-socket\") pod \"a3390481-846a-4742-9eae-0796b667897f\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.408531 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-etc-openvswitch\") pod \"a3390481-846a-4742-9eae-0796b667897f\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.408576 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-host-run-netns\") pod \"a3390481-846a-4742-9eae-0796b667897f\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.408602 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-host-slash\") pod \"a3390481-846a-4742-9eae-0796b667897f\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.408633 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwwwn\" (UniqueName: \"kubernetes.io/projected/a3390481-846a-4742-9eae-0796b667897f-kube-api-access-fwwwn\") pod \"a3390481-846a-4742-9eae-0796b667897f\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.408658 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-host-kubelet\") pod \"a3390481-846a-4742-9eae-0796b667897f\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.408690 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a3390481-846a-4742-9eae-0796b667897f-ovnkube-config\") pod \"a3390481-846a-4742-9eae-0796b667897f\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.408732 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a3390481-846a-4742-9eae-0796b667897f-ovnkube-script-lib\") pod \"a3390481-846a-4742-9eae-0796b667897f\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.408764 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-node-log\") pod \"a3390481-846a-4742-9eae-0796b667897f\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.408788 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-systemd-units\") pod \"a3390481-846a-4742-9eae-0796b667897f\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.408833 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"a3390481-846a-4742-9eae-0796b667897f\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.408862 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-host-cni-netd\") pod \"a3390481-846a-4742-9eae-0796b667897f\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.408893 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-run-systemd\") pod \"a3390481-846a-4742-9eae-0796b667897f\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.408971 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-host-cni-bin\") pod \"a3390481-846a-4742-9eae-0796b667897f\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.408998 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a3390481-846a-4742-9eae-0796b667897f-ovn-node-metrics-cert\") pod \"a3390481-846a-4742-9eae-0796b667897f\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.409018 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-var-lib-openvswitch\") pod \"a3390481-846a-4742-9eae-0796b667897f\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.409038 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-run-openvswitch\") pod \"a3390481-846a-4742-9eae-0796b667897f\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.409248 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a3390481-846a-4742-9eae-0796b667897f-env-overrides\") pod \"a3390481-846a-4742-9eae-0796b667897f\" (UID: \"a3390481-846a-4742-9eae-0796b667897f\") " Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.410286 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3390481-846a-4742-9eae-0796b667897f-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "a3390481-846a-4742-9eae-0796b667897f" (UID: "a3390481-846a-4742-9eae-0796b667897f"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.410393 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "a3390481-846a-4742-9eae-0796b667897f" (UID: "a3390481-846a-4742-9eae-0796b667897f"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.410389 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-node-log" (OuterVolumeSpecName: "node-log") pod "a3390481-846a-4742-9eae-0796b667897f" (UID: "a3390481-846a-4742-9eae-0796b667897f"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.410439 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "a3390481-846a-4742-9eae-0796b667897f" (UID: "a3390481-846a-4742-9eae-0796b667897f"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.410433 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "a3390481-846a-4742-9eae-0796b667897f" (UID: "a3390481-846a-4742-9eae-0796b667897f"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.410450 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "a3390481-846a-4742-9eae-0796b667897f" (UID: "a3390481-846a-4742-9eae-0796b667897f"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.410461 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "a3390481-846a-4742-9eae-0796b667897f" (UID: "a3390481-846a-4742-9eae-0796b667897f"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.410480 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "a3390481-846a-4742-9eae-0796b667897f" (UID: "a3390481-846a-4742-9eae-0796b667897f"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.410723 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3390481-846a-4742-9eae-0796b667897f-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "a3390481-846a-4742-9eae-0796b667897f" (UID: "a3390481-846a-4742-9eae-0796b667897f"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.410478 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "a3390481-846a-4742-9eae-0796b667897f" (UID: "a3390481-846a-4742-9eae-0796b667897f"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.410598 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-host-slash" (OuterVolumeSpecName: "host-slash") pod "a3390481-846a-4742-9eae-0796b667897f" (UID: "a3390481-846a-4742-9eae-0796b667897f"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.410481 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "a3390481-846a-4742-9eae-0796b667897f" (UID: "a3390481-846a-4742-9eae-0796b667897f"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.410502 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "a3390481-846a-4742-9eae-0796b667897f" (UID: "a3390481-846a-4742-9eae-0796b667897f"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.410619 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "a3390481-846a-4742-9eae-0796b667897f" (UID: "a3390481-846a-4742-9eae-0796b667897f"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.410622 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-log-socket" (OuterVolumeSpecName: "log-socket") pod "a3390481-846a-4742-9eae-0796b667897f" (UID: "a3390481-846a-4742-9eae-0796b667897f"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.411052 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3390481-846a-4742-9eae-0796b667897f-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "a3390481-846a-4742-9eae-0796b667897f" (UID: "a3390481-846a-4742-9eae-0796b667897f"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.410530 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "a3390481-846a-4742-9eae-0796b667897f" (UID: "a3390481-846a-4742-9eae-0796b667897f"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.415680 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3390481-846a-4742-9eae-0796b667897f-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "a3390481-846a-4742-9eae-0796b667897f" (UID: "a3390481-846a-4742-9eae-0796b667897f"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.416580 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3390481-846a-4742-9eae-0796b667897f-kube-api-access-fwwwn" (OuterVolumeSpecName: "kube-api-access-fwwwn") pod "a3390481-846a-4742-9eae-0796b667897f" (UID: "a3390481-846a-4742-9eae-0796b667897f"). InnerVolumeSpecName "kube-api-access-fwwwn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.423081 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nmpn5"] Feb 02 10:49:01 crc kubenswrapper[4901]: E0202 10:49:01.423352 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="kube-rbac-proxy-node" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.423369 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="kube-rbac-proxy-node" Feb 02 10:49:01 crc kubenswrapper[4901]: E0202 10:49:01.423383 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="kubecfg-setup" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.423390 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="kubecfg-setup" Feb 02 10:49:01 crc kubenswrapper[4901]: E0202 10:49:01.423397 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="ovnkube-controller" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.423433 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="ovnkube-controller" Feb 02 10:49:01 crc kubenswrapper[4901]: E0202 10:49:01.423442 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="ovnkube-controller" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.423449 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="ovnkube-controller" Feb 02 10:49:01 crc kubenswrapper[4901]: E0202 10:49:01.423461 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="kube-rbac-proxy-ovn-metrics" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.423468 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="kube-rbac-proxy-ovn-metrics" Feb 02 10:49:01 crc kubenswrapper[4901]: E0202 10:49:01.423477 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="sbdb" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.423484 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="sbdb" Feb 02 10:49:01 crc kubenswrapper[4901]: E0202 10:49:01.423513 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="northd" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.423520 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="northd" Feb 02 10:49:01 crc kubenswrapper[4901]: E0202 10:49:01.423526 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="nbdb" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.423531 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="nbdb" Feb 02 10:49:01 crc kubenswrapper[4901]: E0202 10:49:01.423547 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3390481-846a-4742-9eae-0796b667897f" 
containerName="ovn-acl-logging" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.423555 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="ovn-acl-logging" Feb 02 10:49:01 crc kubenswrapper[4901]: E0202 10:49:01.423685 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="ovnkube-controller" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.423694 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="ovnkube-controller" Feb 02 10:49:01 crc kubenswrapper[4901]: E0202 10:49:01.423715 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="ovn-controller" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.423720 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="ovn-controller" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.423862 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="ovnkube-controller" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.423877 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="northd" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.423886 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="ovnkube-controller" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.423901 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="kube-rbac-proxy-node" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.423917 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="nbdb" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.423926 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="kube-rbac-proxy-ovn-metrics" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.423934 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="ovn-acl-logging" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.423942 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="sbdb" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.423951 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="ovn-controller" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.423957 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="ovnkube-controller" Feb 02 10:49:01 crc kubenswrapper[4901]: E0202 10:49:01.424046 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="ovnkube-controller" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.424054 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="ovnkube-controller" Feb 02 10:49:01 crc kubenswrapper[4901]: E0202 10:49:01.424061 4901 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="ovnkube-controller" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.424067 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="ovnkube-controller" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.424116 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "a3390481-846a-4742-9eae-0796b667897f" (UID: "a3390481-846a-4742-9eae-0796b667897f"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.424146 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="ovnkube-controller" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.424637 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3390481-846a-4742-9eae-0796b667897f" containerName="ovnkube-controller" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.426484 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.510555 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-log-socket\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.510728 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-etc-openvswitch\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.510771 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-host-slash\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.510788 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-run-ovn\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.510810 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-node-log\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.510866 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/17e4fe64-c924-4858-8dd4-90fd3c95aebc-env-overrides\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.510960 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-host-run-ovn-kubernetes\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.511024 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-host-run-netns\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.511056 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/17e4fe64-c924-4858-8dd4-90fd3c95aebc-ovnkube-script-lib\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.511086 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-host-cni-netd\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.511151 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/17e4fe64-c924-4858-8dd4-90fd3c95aebc-ovn-node-metrics-cert\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.511185 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-var-lib-openvswitch\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.511211 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.511243 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-host-cni-bin\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc 
kubenswrapper[4901]: I0202 10:49:01.511266 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbql7\" (UniqueName: \"kubernetes.io/projected/17e4fe64-c924-4858-8dd4-90fd3c95aebc-kube-api-access-dbql7\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.511291 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/17e4fe64-c924-4858-8dd4-90fd3c95aebc-ovnkube-config\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.511313 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-systemd-units\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.511336 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-host-kubelet\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.511358 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-run-systemd\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.511408 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-run-openvswitch\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.511465 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwwwn\" (UniqueName: \"kubernetes.io/projected/a3390481-846a-4742-9eae-0796b667897f-kube-api-access-fwwwn\") on node \"crc\" DevicePath \"\"" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.511479 4901 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.511491 4901 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a3390481-846a-4742-9eae-0796b667897f-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.511505 4901 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a3390481-846a-4742-9eae-0796b667897f-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.511517 
4901 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-node-log\") on node \"crc\" DevicePath \"\"" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.511530 4901 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.511544 4901 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.511556 4901 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.511594 4901 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.511606 4901 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.511616 4901 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a3390481-846a-4742-9eae-0796b667897f-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.511628 4901 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.511638 4901 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.511649 4901 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a3390481-846a-4742-9eae-0796b667897f-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.511660 4901 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.511671 4901 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.511680 4901 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-log-socket\") on node \"crc\" DevicePath \"\"" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.511691 4901 
reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.511701 4901 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.511711 4901 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a3390481-846a-4742-9eae-0796b667897f-host-slash\") on node \"crc\" DevicePath \"\"" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.546279 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-qcqhh" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.612596 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/17e4fe64-c924-4858-8dd4-90fd3c95aebc-env-overrides\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.612664 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-host-run-ovn-kubernetes\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.612703 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-host-run-netns\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.612726 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/17e4fe64-c924-4858-8dd4-90fd3c95aebc-ovnkube-script-lib\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.612748 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-host-cni-netd\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.612772 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/17e4fe64-c924-4858-8dd4-90fd3c95aebc-ovn-node-metrics-cert\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.612795 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-var-lib-openvswitch\") pod \"ovnkube-node-nmpn5\" 
(UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.612854 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.612881 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-host-cni-bin\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.612904 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-var-lib-openvswitch\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.612934 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-host-cni-bin\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.612962 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-host-cni-netd\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.612911 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.612867 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-host-run-netns\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.612877 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-host-run-ovn-kubernetes\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.612911 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbql7\" (UniqueName: \"kubernetes.io/projected/17e4fe64-c924-4858-8dd4-90fd3c95aebc-kube-api-access-dbql7\") pod \"ovnkube-node-nmpn5\" (UID: 
\"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.613050 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/17e4fe64-c924-4858-8dd4-90fd3c95aebc-ovnkube-config\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.613091 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-systemd-units\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.613112 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-host-kubelet\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.613127 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-run-systemd\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.613154 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-run-openvswitch\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.613176 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-log-socket\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.613223 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-etc-openvswitch\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.613245 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-host-slash\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.613287 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-run-ovn\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: 
I0202 10:49:01.613308 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-node-log\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.613381 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-node-log\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.613402 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-run-systemd\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.613422 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-run-openvswitch\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.613443 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-log-socket\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.613461 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-etc-openvswitch\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.613484 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-host-slash\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.613513 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-run-ovn\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.613286 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-host-kubelet\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.613935 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/17e4fe64-c924-4858-8dd4-90fd3c95aebc-ovnkube-script-lib\") pod 
\"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.613989 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/17e4fe64-c924-4858-8dd4-90fd3c95aebc-systemd-units\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.614071 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/17e4fe64-c924-4858-8dd4-90fd3c95aebc-env-overrides\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.614083 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/17e4fe64-c924-4858-8dd4-90fd3c95aebc-ovnkube-config\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.616283 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/17e4fe64-c924-4858-8dd4-90fd3c95aebc-ovn-node-metrics-cert\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.627339 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbql7\" (UniqueName: \"kubernetes.io/projected/17e4fe64-c924-4858-8dd4-90fd3c95aebc-kube-api-access-dbql7\") pod \"ovnkube-node-nmpn5\" (UID: \"17e4fe64-c924-4858-8dd4-90fd3c95aebc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:01 crc kubenswrapper[4901]: I0202 10:49:01.757227 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.115392 4901 generic.go:334] "Generic (PLEG): container finished" podID="17e4fe64-c924-4858-8dd4-90fd3c95aebc" containerID="eef7225268ee0f9f240e990bb02b41fdae6d78bff5a03324c03780da6206fe54" exitCode=0 Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.115476 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" event={"ID":"17e4fe64-c924-4858-8dd4-90fd3c95aebc","Type":"ContainerDied","Data":"eef7225268ee0f9f240e990bb02b41fdae6d78bff5a03324c03780da6206fe54"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.115897 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" event={"ID":"17e4fe64-c924-4858-8dd4-90fd3c95aebc","Type":"ContainerStarted","Data":"eb1d590eed57ccd7b2fedf5715064dfec44eb71e94184b8fa071120cb3fae945"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.120231 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5q92h_19eb421a-49aa-4cde-ae5e-3aba70ee67f4/kube-multus/2.log" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.120840 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5q92h_19eb421a-49aa-4cde-ae5e-3aba70ee67f4/kube-multus/1.log" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.120884 4901 generic.go:334] "Generic (PLEG): container finished" podID="19eb421a-49aa-4cde-ae5e-3aba70ee67f4" containerID="6717e66ee49c9fe7f861650758568fc05bf46f663523f267fbfe55430970f177" exitCode=2 Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.120983 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5q92h" event={"ID":"19eb421a-49aa-4cde-ae5e-3aba70ee67f4","Type":"ContainerDied","Data":"6717e66ee49c9fe7f861650758568fc05bf46f663523f267fbfe55430970f177"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.121051 4901 scope.go:117] "RemoveContainer" containerID="15d92feb87ef4644f20d56395e4ec742bb94c251371c8198e0d7257c3d21a68b" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.121945 4901 scope.go:117] "RemoveContainer" containerID="6717e66ee49c9fe7f861650758568fc05bf46f663523f267fbfe55430970f177" Feb 02 10:49:02 crc kubenswrapper[4901]: E0202 10:49:02.122674 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-5q92h_openshift-multus(19eb421a-49aa-4cde-ae5e-3aba70ee67f4)\"" pod="openshift-multus/multus-5q92h" podUID="19eb421a-49aa-4cde-ae5e-3aba70ee67f4" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.127055 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vm8h5_a3390481-846a-4742-9eae-0796b667897f/ovnkube-controller/3.log" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.131599 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vm8h5_a3390481-846a-4742-9eae-0796b667897f/ovn-acl-logging/0.log" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.133161 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vm8h5_a3390481-846a-4742-9eae-0796b667897f/ovn-controller/0.log" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.134445 4901 generic.go:334] "Generic (PLEG): container finished" 
podID="a3390481-846a-4742-9eae-0796b667897f" containerID="853d62254c62e5ca6a20572e291db7725dbe50919ef679111449fb8a40355619" exitCode=0 Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.134521 4901 generic.go:334] "Generic (PLEG): container finished" podID="a3390481-846a-4742-9eae-0796b667897f" containerID="23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a" exitCode=0 Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.134579 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" event={"ID":"a3390481-846a-4742-9eae-0796b667897f","Type":"ContainerDied","Data":"853d62254c62e5ca6a20572e291db7725dbe50919ef679111449fb8a40355619"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.134607 4901 generic.go:334] "Generic (PLEG): container finished" podID="a3390481-846a-4742-9eae-0796b667897f" containerID="b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81" exitCode=0 Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.134668 4901 generic.go:334] "Generic (PLEG): container finished" podID="a3390481-846a-4742-9eae-0796b667897f" containerID="1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79" exitCode=0 Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.134683 4901 generic.go:334] "Generic (PLEG): container finished" podID="a3390481-846a-4742-9eae-0796b667897f" containerID="2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d" exitCode=0 Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.134693 4901 generic.go:334] "Generic (PLEG): container finished" podID="a3390481-846a-4742-9eae-0796b667897f" containerID="61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa" exitCode=0 Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.134702 4901 generic.go:334] "Generic (PLEG): container finished" podID="a3390481-846a-4742-9eae-0796b667897f" containerID="18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3" exitCode=143 Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.134632 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" event={"ID":"a3390481-846a-4742-9eae-0796b667897f","Type":"ContainerDied","Data":"23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.134794 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" event={"ID":"a3390481-846a-4742-9eae-0796b667897f","Type":"ContainerDied","Data":"b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.134849 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" event={"ID":"a3390481-846a-4742-9eae-0796b667897f","Type":"ContainerDied","Data":"1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.134544 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.134878 4901 generic.go:334] "Generic (PLEG): container finished" podID="a3390481-846a-4742-9eae-0796b667897f" containerID="3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3" exitCode=143 Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.134864 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" event={"ID":"a3390481-846a-4742-9eae-0796b667897f","Type":"ContainerDied","Data":"2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.134973 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" event={"ID":"a3390481-846a-4742-9eae-0796b667897f","Type":"ContainerDied","Data":"61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.134987 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"853d62254c62e5ca6a20572e291db7725dbe50919ef679111449fb8a40355619"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.134998 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"53303f4cfb222a1eb4b36c8d625a1470c2e40b5b064700fa8e0c7660666678c8"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.135004 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.135010 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.135016 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.135022 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.135028 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.135033 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.135038 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.135043 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.135051 4901 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" event={"ID":"a3390481-846a-4742-9eae-0796b667897f","Type":"ContainerDied","Data":"18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.135059 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"853d62254c62e5ca6a20572e291db7725dbe50919ef679111449fb8a40355619"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.135066 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"53303f4cfb222a1eb4b36c8d625a1470c2e40b5b064700fa8e0c7660666678c8"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.135071 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.135076 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.135082 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.135086 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.135091 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.135096 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.135101 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.135105 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.135112 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" event={"ID":"a3390481-846a-4742-9eae-0796b667897f","Type":"ContainerDied","Data":"3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.135121 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"853d62254c62e5ca6a20572e291db7725dbe50919ef679111449fb8a40355619"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.135127 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"53303f4cfb222a1eb4b36c8d625a1470c2e40b5b064700fa8e0c7660666678c8"} Feb 02 10:49:02 crc 
kubenswrapper[4901]: I0202 10:49:02.135133 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.135138 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.135143 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.135150 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.135155 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.135160 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.135166 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.135171 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.135178 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8h5" event={"ID":"a3390481-846a-4742-9eae-0796b667897f","Type":"ContainerDied","Data":"0054f47c122e18b40aa2c7c792663c19c5a9d2a27c75c0e3a58723113244d24a"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.135186 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"853d62254c62e5ca6a20572e291db7725dbe50919ef679111449fb8a40355619"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.135192 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"53303f4cfb222a1eb4b36c8d625a1470c2e40b5b064700fa8e0c7660666678c8"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.135197 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.135203 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.135208 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79"} Feb 02 10:49:02 crc 
kubenswrapper[4901]: I0202 10:49:02.135213 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.135218 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.135223 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.135228 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.135232 4901 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5"} Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.160329 4901 scope.go:117] "RemoveContainer" containerID="853d62254c62e5ca6a20572e291db7725dbe50919ef679111449fb8a40355619" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.199169 4901 scope.go:117] "RemoveContainer" containerID="53303f4cfb222a1eb4b36c8d625a1470c2e40b5b064700fa8e0c7660666678c8" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.207708 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vm8h5"] Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.223965 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vm8h5"] Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.249442 4901 scope.go:117] "RemoveContainer" containerID="23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.267555 4901 scope.go:117] "RemoveContainer" containerID="b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.289323 4901 scope.go:117] "RemoveContainer" containerID="1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.321524 4901 scope.go:117] "RemoveContainer" containerID="2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.345272 4901 scope.go:117] "RemoveContainer" containerID="61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.370067 4901 scope.go:117] "RemoveContainer" containerID="18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.385739 4901 scope.go:117] "RemoveContainer" containerID="3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.400590 4901 scope.go:117] "RemoveContainer" containerID="1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.418269 4901 scope.go:117] "RemoveContainer" containerID="853d62254c62e5ca6a20572e291db7725dbe50919ef679111449fb8a40355619" Feb 02 10:49:02 crc 
kubenswrapper[4901]: E0202 10:49:02.420189 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"853d62254c62e5ca6a20572e291db7725dbe50919ef679111449fb8a40355619\": container with ID starting with 853d62254c62e5ca6a20572e291db7725dbe50919ef679111449fb8a40355619 not found: ID does not exist" containerID="853d62254c62e5ca6a20572e291db7725dbe50919ef679111449fb8a40355619" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.420225 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"853d62254c62e5ca6a20572e291db7725dbe50919ef679111449fb8a40355619"} err="failed to get container status \"853d62254c62e5ca6a20572e291db7725dbe50919ef679111449fb8a40355619\": rpc error: code = NotFound desc = could not find container \"853d62254c62e5ca6a20572e291db7725dbe50919ef679111449fb8a40355619\": container with ID starting with 853d62254c62e5ca6a20572e291db7725dbe50919ef679111449fb8a40355619 not found: ID does not exist" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.420268 4901 scope.go:117] "RemoveContainer" containerID="53303f4cfb222a1eb4b36c8d625a1470c2e40b5b064700fa8e0c7660666678c8" Feb 02 10:49:02 crc kubenswrapper[4901]: E0202 10:49:02.420555 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53303f4cfb222a1eb4b36c8d625a1470c2e40b5b064700fa8e0c7660666678c8\": container with ID starting with 53303f4cfb222a1eb4b36c8d625a1470c2e40b5b064700fa8e0c7660666678c8 not found: ID does not exist" containerID="53303f4cfb222a1eb4b36c8d625a1470c2e40b5b064700fa8e0c7660666678c8" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.420590 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53303f4cfb222a1eb4b36c8d625a1470c2e40b5b064700fa8e0c7660666678c8"} err="failed to get container status \"53303f4cfb222a1eb4b36c8d625a1470c2e40b5b064700fa8e0c7660666678c8\": rpc error: code = NotFound desc = could not find container \"53303f4cfb222a1eb4b36c8d625a1470c2e40b5b064700fa8e0c7660666678c8\": container with ID starting with 53303f4cfb222a1eb4b36c8d625a1470c2e40b5b064700fa8e0c7660666678c8 not found: ID does not exist" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.420604 4901 scope.go:117] "RemoveContainer" containerID="23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a" Feb 02 10:49:02 crc kubenswrapper[4901]: E0202 10:49:02.420907 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a\": container with ID starting with 23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a not found: ID does not exist" containerID="23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.420938 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a"} err="failed to get container status \"23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a\": rpc error: code = NotFound desc = could not find container \"23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a\": container with ID starting with 23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a not found: ID does not exist" Feb 02 10:49:02 crc kubenswrapper[4901]: 
I0202 10:49:02.420956 4901 scope.go:117] "RemoveContainer" containerID="b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81" Feb 02 10:49:02 crc kubenswrapper[4901]: E0202 10:49:02.423002 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81\": container with ID starting with b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81 not found: ID does not exist" containerID="b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.423036 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81"} err="failed to get container status \"b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81\": rpc error: code = NotFound desc = could not find container \"b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81\": container with ID starting with b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81 not found: ID does not exist" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.423056 4901 scope.go:117] "RemoveContainer" containerID="1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79" Feb 02 10:49:02 crc kubenswrapper[4901]: E0202 10:49:02.423406 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79\": container with ID starting with 1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79 not found: ID does not exist" containerID="1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.423431 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79"} err="failed to get container status \"1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79\": rpc error: code = NotFound desc = could not find container \"1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79\": container with ID starting with 1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79 not found: ID does not exist" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.423450 4901 scope.go:117] "RemoveContainer" containerID="2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d" Feb 02 10:49:02 crc kubenswrapper[4901]: E0202 10:49:02.423884 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d\": container with ID starting with 2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d not found: ID does not exist" containerID="2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.423911 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d"} err="failed to get container status \"2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d\": rpc error: code = NotFound desc = could not find container \"2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d\": container 
with ID starting with 2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d not found: ID does not exist" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.423930 4901 scope.go:117] "RemoveContainer" containerID="61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa" Feb 02 10:49:02 crc kubenswrapper[4901]: E0202 10:49:02.424224 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa\": container with ID starting with 61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa not found: ID does not exist" containerID="61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.424249 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa"} err="failed to get container status \"61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa\": rpc error: code = NotFound desc = could not find container \"61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa\": container with ID starting with 61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa not found: ID does not exist" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.424268 4901 scope.go:117] "RemoveContainer" containerID="18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3" Feb 02 10:49:02 crc kubenswrapper[4901]: E0202 10:49:02.424526 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3\": container with ID starting with 18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3 not found: ID does not exist" containerID="18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.424552 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3"} err="failed to get container status \"18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3\": rpc error: code = NotFound desc = could not find container \"18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3\": container with ID starting with 18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3 not found: ID does not exist" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.424586 4901 scope.go:117] "RemoveContainer" containerID="3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3" Feb 02 10:49:02 crc kubenswrapper[4901]: E0202 10:49:02.425030 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3\": container with ID starting with 3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3 not found: ID does not exist" containerID="3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.425094 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3"} err="failed to get container status 
\"3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3\": rpc error: code = NotFound desc = could not find container \"3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3\": container with ID starting with 3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3 not found: ID does not exist" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.425122 4901 scope.go:117] "RemoveContainer" containerID="1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5" Feb 02 10:49:02 crc kubenswrapper[4901]: E0202 10:49:02.425440 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\": container with ID starting with 1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5 not found: ID does not exist" containerID="1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.425464 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5"} err="failed to get container status \"1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\": rpc error: code = NotFound desc = could not find container \"1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\": container with ID starting with 1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5 not found: ID does not exist" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.425482 4901 scope.go:117] "RemoveContainer" containerID="853d62254c62e5ca6a20572e291db7725dbe50919ef679111449fb8a40355619" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.425752 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"853d62254c62e5ca6a20572e291db7725dbe50919ef679111449fb8a40355619"} err="failed to get container status \"853d62254c62e5ca6a20572e291db7725dbe50919ef679111449fb8a40355619\": rpc error: code = NotFound desc = could not find container \"853d62254c62e5ca6a20572e291db7725dbe50919ef679111449fb8a40355619\": container with ID starting with 853d62254c62e5ca6a20572e291db7725dbe50919ef679111449fb8a40355619 not found: ID does not exist" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.425775 4901 scope.go:117] "RemoveContainer" containerID="53303f4cfb222a1eb4b36c8d625a1470c2e40b5b064700fa8e0c7660666678c8" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.426177 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53303f4cfb222a1eb4b36c8d625a1470c2e40b5b064700fa8e0c7660666678c8"} err="failed to get container status \"53303f4cfb222a1eb4b36c8d625a1470c2e40b5b064700fa8e0c7660666678c8\": rpc error: code = NotFound desc = could not find container \"53303f4cfb222a1eb4b36c8d625a1470c2e40b5b064700fa8e0c7660666678c8\": container with ID starting with 53303f4cfb222a1eb4b36c8d625a1470c2e40b5b064700fa8e0c7660666678c8 not found: ID does not exist" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.426201 4901 scope.go:117] "RemoveContainer" containerID="23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.426501 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a"} err="failed to get container status 
\"23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a\": rpc error: code = NotFound desc = could not find container \"23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a\": container with ID starting with 23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a not found: ID does not exist" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.426526 4901 scope.go:117] "RemoveContainer" containerID="b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.426933 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81"} err="failed to get container status \"b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81\": rpc error: code = NotFound desc = could not find container \"b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81\": container with ID starting with b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81 not found: ID does not exist" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.426950 4901 scope.go:117] "RemoveContainer" containerID="1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.427337 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79"} err="failed to get container status \"1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79\": rpc error: code = NotFound desc = could not find container \"1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79\": container with ID starting with 1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79 not found: ID does not exist" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.427365 4901 scope.go:117] "RemoveContainer" containerID="2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.427669 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d"} err="failed to get container status \"2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d\": rpc error: code = NotFound desc = could not find container \"2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d\": container with ID starting with 2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d not found: ID does not exist" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.427692 4901 scope.go:117] "RemoveContainer" containerID="61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.427971 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa"} err="failed to get container status \"61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa\": rpc error: code = NotFound desc = could not find container \"61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa\": container with ID starting with 61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa not found: ID does not exist" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.427993 4901 scope.go:117] "RemoveContainer" 
containerID="18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.428308 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3"} err="failed to get container status \"18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3\": rpc error: code = NotFound desc = could not find container \"18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3\": container with ID starting with 18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3 not found: ID does not exist" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.428331 4901 scope.go:117] "RemoveContainer" containerID="3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.428625 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3"} err="failed to get container status \"3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3\": rpc error: code = NotFound desc = could not find container \"3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3\": container with ID starting with 3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3 not found: ID does not exist" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.428647 4901 scope.go:117] "RemoveContainer" containerID="1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.429357 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5"} err="failed to get container status \"1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\": rpc error: code = NotFound desc = could not find container \"1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\": container with ID starting with 1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5 not found: ID does not exist" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.429413 4901 scope.go:117] "RemoveContainer" containerID="853d62254c62e5ca6a20572e291db7725dbe50919ef679111449fb8a40355619" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.429774 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"853d62254c62e5ca6a20572e291db7725dbe50919ef679111449fb8a40355619"} err="failed to get container status \"853d62254c62e5ca6a20572e291db7725dbe50919ef679111449fb8a40355619\": rpc error: code = NotFound desc = could not find container \"853d62254c62e5ca6a20572e291db7725dbe50919ef679111449fb8a40355619\": container with ID starting with 853d62254c62e5ca6a20572e291db7725dbe50919ef679111449fb8a40355619 not found: ID does not exist" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.429798 4901 scope.go:117] "RemoveContainer" containerID="53303f4cfb222a1eb4b36c8d625a1470c2e40b5b064700fa8e0c7660666678c8" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.430161 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53303f4cfb222a1eb4b36c8d625a1470c2e40b5b064700fa8e0c7660666678c8"} err="failed to get container status \"53303f4cfb222a1eb4b36c8d625a1470c2e40b5b064700fa8e0c7660666678c8\": rpc error: code = NotFound desc = could not find 
container \"53303f4cfb222a1eb4b36c8d625a1470c2e40b5b064700fa8e0c7660666678c8\": container with ID starting with 53303f4cfb222a1eb4b36c8d625a1470c2e40b5b064700fa8e0c7660666678c8 not found: ID does not exist" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.430185 4901 scope.go:117] "RemoveContainer" containerID="23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.430498 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a"} err="failed to get container status \"23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a\": rpc error: code = NotFound desc = could not find container \"23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a\": container with ID starting with 23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a not found: ID does not exist" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.430527 4901 scope.go:117] "RemoveContainer" containerID="b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.430978 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81"} err="failed to get container status \"b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81\": rpc error: code = NotFound desc = could not find container \"b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81\": container with ID starting with b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81 not found: ID does not exist" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.431000 4901 scope.go:117] "RemoveContainer" containerID="1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.431305 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79"} err="failed to get container status \"1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79\": rpc error: code = NotFound desc = could not find container \"1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79\": container with ID starting with 1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79 not found: ID does not exist" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.431331 4901 scope.go:117] "RemoveContainer" containerID="2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.431612 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d"} err="failed to get container status \"2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d\": rpc error: code = NotFound desc = could not find container \"2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d\": container with ID starting with 2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d not found: ID does not exist" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.431653 4901 scope.go:117] "RemoveContainer" containerID="61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.431898 4901 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa"} err="failed to get container status \"61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa\": rpc error: code = NotFound desc = could not find container \"61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa\": container with ID starting with 61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa not found: ID does not exist" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.431918 4901 scope.go:117] "RemoveContainer" containerID="18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.432178 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3"} err="failed to get container status \"18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3\": rpc error: code = NotFound desc = could not find container \"18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3\": container with ID starting with 18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3 not found: ID does not exist" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.432231 4901 scope.go:117] "RemoveContainer" containerID="3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.432539 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3"} err="failed to get container status \"3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3\": rpc error: code = NotFound desc = could not find container \"3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3\": container with ID starting with 3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3 not found: ID does not exist" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.432581 4901 scope.go:117] "RemoveContainer" containerID="1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.432891 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5"} err="failed to get container status \"1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\": rpc error: code = NotFound desc = could not find container \"1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\": container with ID starting with 1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5 not found: ID does not exist" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.432915 4901 scope.go:117] "RemoveContainer" containerID="853d62254c62e5ca6a20572e291db7725dbe50919ef679111449fb8a40355619" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.433270 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"853d62254c62e5ca6a20572e291db7725dbe50919ef679111449fb8a40355619"} err="failed to get container status \"853d62254c62e5ca6a20572e291db7725dbe50919ef679111449fb8a40355619\": rpc error: code = NotFound desc = could not find container \"853d62254c62e5ca6a20572e291db7725dbe50919ef679111449fb8a40355619\": container with ID starting with 
853d62254c62e5ca6a20572e291db7725dbe50919ef679111449fb8a40355619 not found: ID does not exist" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.433291 4901 scope.go:117] "RemoveContainer" containerID="53303f4cfb222a1eb4b36c8d625a1470c2e40b5b064700fa8e0c7660666678c8" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.433684 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53303f4cfb222a1eb4b36c8d625a1470c2e40b5b064700fa8e0c7660666678c8"} err="failed to get container status \"53303f4cfb222a1eb4b36c8d625a1470c2e40b5b064700fa8e0c7660666678c8\": rpc error: code = NotFound desc = could not find container \"53303f4cfb222a1eb4b36c8d625a1470c2e40b5b064700fa8e0c7660666678c8\": container with ID starting with 53303f4cfb222a1eb4b36c8d625a1470c2e40b5b064700fa8e0c7660666678c8 not found: ID does not exist" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.433715 4901 scope.go:117] "RemoveContainer" containerID="23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.434024 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a"} err="failed to get container status \"23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a\": rpc error: code = NotFound desc = could not find container \"23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a\": container with ID starting with 23f1fffe3e2dd771756eea50c6c8ebd7933ae7af33dd806d29c4d0dd414a5e4a not found: ID does not exist" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.434046 4901 scope.go:117] "RemoveContainer" containerID="b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.434397 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81"} err="failed to get container status \"b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81\": rpc error: code = NotFound desc = could not find container \"b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81\": container with ID starting with b39f875e93cc845e15348f1c0ef50cc6c717bd43de637faef4eaf7ca0d4dee81 not found: ID does not exist" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.434421 4901 scope.go:117] "RemoveContainer" containerID="1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.434817 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79"} err="failed to get container status \"1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79\": rpc error: code = NotFound desc = could not find container \"1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79\": container with ID starting with 1f569366bdb3a0c935c66ad3cae2d8a83c8c73497300de06f592435b3cb23b79 not found: ID does not exist" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.434843 4901 scope.go:117] "RemoveContainer" containerID="2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.435132 4901 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d"} err="failed to get container status \"2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d\": rpc error: code = NotFound desc = could not find container \"2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d\": container with ID starting with 2d4046ac61e3a055c598980167d3f81b9800bc738b12a704224cd22d8b6fd67d not found: ID does not exist" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.435154 4901 scope.go:117] "RemoveContainer" containerID="61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.435480 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa"} err="failed to get container status \"61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa\": rpc error: code = NotFound desc = could not find container \"61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa\": container with ID starting with 61cb132e8ef44974d56a9d00d5323df1f2687ae70ed5f4db83ba1b86ec6cccfa not found: ID does not exist" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.435509 4901 scope.go:117] "RemoveContainer" containerID="18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.435755 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3"} err="failed to get container status \"18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3\": rpc error: code = NotFound desc = could not find container \"18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3\": container with ID starting with 18d36fc4001f049fbc3c68eddda09d90b53825c716ee2d0c9af6811b655086a3 not found: ID does not exist" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.435775 4901 scope.go:117] "RemoveContainer" containerID="3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.436066 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3"} err="failed to get container status \"3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3\": rpc error: code = NotFound desc = could not find container \"3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3\": container with ID starting with 3a8ad188a1d92454554ed733e99aa862b7d4be329337040bf45a9f36c2c064c3 not found: ID does not exist" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.436088 4901 scope.go:117] "RemoveContainer" containerID="1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5" Feb 02 10:49:02 crc kubenswrapper[4901]: I0202 10:49:02.436335 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5"} err="failed to get container status \"1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\": rpc error: code = NotFound desc = could not find container \"1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5\": container with ID starting with 1d01d153c1c42a978683c3bc6e677130b8d72248a128e21955686524799f8dd5 not found: ID does not exist" Feb 
02 10:49:03 crc kubenswrapper[4901]: I0202 10:49:03.144722 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" event={"ID":"17e4fe64-c924-4858-8dd4-90fd3c95aebc","Type":"ContainerStarted","Data":"0161f7be4433b62f67c72ceac06ad3303aee4195b76fd59c253be9d353416fc7"} Feb 02 10:49:03 crc kubenswrapper[4901]: I0202 10:49:03.144784 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" event={"ID":"17e4fe64-c924-4858-8dd4-90fd3c95aebc","Type":"ContainerStarted","Data":"ed88d387f36dfc16b33d04c6292e3662c25badb61e29aab7df246a98db9aa406"} Feb 02 10:49:03 crc kubenswrapper[4901]: I0202 10:49:03.144805 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" event={"ID":"17e4fe64-c924-4858-8dd4-90fd3c95aebc","Type":"ContainerStarted","Data":"baca59d3170b0c0e9f7e445023bfe2fb2f092f6f2b94528349bef6dd63b3e6cf"} Feb 02 10:49:03 crc kubenswrapper[4901]: I0202 10:49:03.144819 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" event={"ID":"17e4fe64-c924-4858-8dd4-90fd3c95aebc","Type":"ContainerStarted","Data":"a641bc209d06884b214fe1f5b0f8d47e244e61ac3dbf31aeab457a5a07bfd611"} Feb 02 10:49:03 crc kubenswrapper[4901]: I0202 10:49:03.144835 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" event={"ID":"17e4fe64-c924-4858-8dd4-90fd3c95aebc","Type":"ContainerStarted","Data":"a338a2eb71a6a24979bafe56489e39d74e58d60e3b066a0ec13440cf92b00313"} Feb 02 10:49:03 crc kubenswrapper[4901]: I0202 10:49:03.144851 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" event={"ID":"17e4fe64-c924-4858-8dd4-90fd3c95aebc","Type":"ContainerStarted","Data":"25b47c186cdf256afff380294f8df3e9462b82e16e590cb49d309bd309d272e1"} Feb 02 10:49:03 crc kubenswrapper[4901]: I0202 10:49:03.146772 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5q92h_19eb421a-49aa-4cde-ae5e-3aba70ee67f4/kube-multus/2.log" Feb 02 10:49:03 crc kubenswrapper[4901]: I0202 10:49:03.688156 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3390481-846a-4742-9eae-0796b667897f" path="/var/lib/kubelet/pods/a3390481-846a-4742-9eae-0796b667897f/volumes" Feb 02 10:49:05 crc kubenswrapper[4901]: I0202 10:49:05.167787 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" event={"ID":"17e4fe64-c924-4858-8dd4-90fd3c95aebc","Type":"ContainerStarted","Data":"91544cc96c4a113b7fead25de838a52c7265431e7e1cff682ca89484ba7e90cd"} Feb 02 10:49:07 crc kubenswrapper[4901]: I0202 10:49:07.837828 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:49:07 crc kubenswrapper[4901]: I0202 10:49:07.838118 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:49:07 crc kubenswrapper[4901]: I0202 10:49:07.838171 4901 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" Feb 02 10:49:07 crc kubenswrapper[4901]: I0202 10:49:07.838843 4901 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1a36cd6d270a91931568fc1120bc367e89d666f758bf7db75038745e98b3a488"} pod="openshift-machine-config-operator/machine-config-daemon-f29d8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 10:49:07 crc kubenswrapper[4901]: I0202 10:49:07.838907 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" containerID="cri-o://1a36cd6d270a91931568fc1120bc367e89d666f758bf7db75038745e98b3a488" gracePeriod=600 Feb 02 10:49:08 crc kubenswrapper[4901]: I0202 10:49:08.190403 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" event={"ID":"17e4fe64-c924-4858-8dd4-90fd3c95aebc","Type":"ContainerStarted","Data":"81dadbbec1d8fa0a04a91a78e1896aea0ae740584b1bf886ba570b475b1d8f4e"} Feb 02 10:49:08 crc kubenswrapper[4901]: I0202 10:49:08.190688 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:08 crc kubenswrapper[4901]: I0202 10:49:08.190718 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:08 crc kubenswrapper[4901]: I0202 10:49:08.194810 4901 generic.go:334] "Generic (PLEG): container finished" podID="756c113d-5d5e-424e-bdf5-494b7774def6" containerID="1a36cd6d270a91931568fc1120bc367e89d666f758bf7db75038745e98b3a488" exitCode=0 Feb 02 10:49:08 crc kubenswrapper[4901]: I0202 10:49:08.194865 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" event={"ID":"756c113d-5d5e-424e-bdf5-494b7774def6","Type":"ContainerDied","Data":"1a36cd6d270a91931568fc1120bc367e89d666f758bf7db75038745e98b3a488"} Feb 02 10:49:08 crc kubenswrapper[4901]: I0202 10:49:08.194900 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" event={"ID":"756c113d-5d5e-424e-bdf5-494b7774def6","Type":"ContainerStarted","Data":"3ee71c5df77ed5308f20e56cd0da57bad1b0442e13aee27a55466b956169f8c4"} Feb 02 10:49:08 crc kubenswrapper[4901]: I0202 10:49:08.194929 4901 scope.go:117] "RemoveContainer" containerID="cbeddfab711d3ba3246547960da089ade81e7d31bea07298d153502257c2da4a" Feb 02 10:49:08 crc kubenswrapper[4901]: I0202 10:49:08.215441 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:08 crc kubenswrapper[4901]: I0202 10:49:08.221874 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" podStartSLOduration=7.221859638 podStartE2EDuration="7.221859638s" podCreationTimestamp="2026-02-02 10:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:49:08.220200079 +0000 UTC m=+635.238540185" watchObservedRunningTime="2026-02-02 10:49:08.221859638 +0000 UTC m=+635.240199734" Feb 02 10:49:09 crc kubenswrapper[4901]: I0202 10:49:09.207134 4901 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:09 crc kubenswrapper[4901]: I0202 10:49:09.253374 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:13 crc kubenswrapper[4901]: I0202 10:49:13.682176 4901 scope.go:117] "RemoveContainer" containerID="6717e66ee49c9fe7f861650758568fc05bf46f663523f267fbfe55430970f177" Feb 02 10:49:13 crc kubenswrapper[4901]: E0202 10:49:13.683148 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-5q92h_openshift-multus(19eb421a-49aa-4cde-ae5e-3aba70ee67f4)\"" pod="openshift-multus/multus-5q92h" podUID="19eb421a-49aa-4cde-ae5e-3aba70ee67f4" Feb 02 10:49:26 crc kubenswrapper[4901]: I0202 10:49:26.676945 4901 scope.go:117] "RemoveContainer" containerID="6717e66ee49c9fe7f861650758568fc05bf46f663523f267fbfe55430970f177" Feb 02 10:49:27 crc kubenswrapper[4901]: I0202 10:49:27.332434 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5q92h_19eb421a-49aa-4cde-ae5e-3aba70ee67f4/kube-multus/2.log" Feb 02 10:49:27 crc kubenswrapper[4901]: I0202 10:49:27.333039 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5q92h" event={"ID":"19eb421a-49aa-4cde-ae5e-3aba70ee67f4","Type":"ContainerStarted","Data":"4aa86487d4c7f1fac66d591b862c2f2c6257b02b8b5d0c6cd7f1194e0534c058"} Feb 02 10:49:31 crc kubenswrapper[4901]: I0202 10:49:31.788250 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nmpn5" Feb 02 10:49:37 crc kubenswrapper[4901]: I0202 10:49:37.733267 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jtlrf"] Feb 02 10:49:37 crc kubenswrapper[4901]: I0202 10:49:37.734821 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jtlrf" Feb 02 10:49:37 crc kubenswrapper[4901]: I0202 10:49:37.737160 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 02 10:49:37 crc kubenswrapper[4901]: I0202 10:49:37.742213 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jtlrf"] Feb 02 10:49:37 crc kubenswrapper[4901]: I0202 10:49:37.799621 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/635aac29-bf3a-4517-bde5-4dd65f084a22-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jtlrf\" (UID: \"635aac29-bf3a-4517-bde5-4dd65f084a22\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jtlrf" Feb 02 10:49:37 crc kubenswrapper[4901]: I0202 10:49:37.799709 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bw9v\" (UniqueName: \"kubernetes.io/projected/635aac29-bf3a-4517-bde5-4dd65f084a22-kube-api-access-4bw9v\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jtlrf\" (UID: \"635aac29-bf3a-4517-bde5-4dd65f084a22\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jtlrf" Feb 02 10:49:37 crc kubenswrapper[4901]: I0202 10:49:37.799912 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/635aac29-bf3a-4517-bde5-4dd65f084a22-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jtlrf\" (UID: \"635aac29-bf3a-4517-bde5-4dd65f084a22\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jtlrf" Feb 02 10:49:37 crc kubenswrapper[4901]: I0202 10:49:37.900985 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/635aac29-bf3a-4517-bde5-4dd65f084a22-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jtlrf\" (UID: \"635aac29-bf3a-4517-bde5-4dd65f084a22\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jtlrf" Feb 02 10:49:37 crc kubenswrapper[4901]: I0202 10:49:37.901351 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/635aac29-bf3a-4517-bde5-4dd65f084a22-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jtlrf\" (UID: \"635aac29-bf3a-4517-bde5-4dd65f084a22\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jtlrf" Feb 02 10:49:37 crc kubenswrapper[4901]: I0202 10:49:37.901509 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bw9v\" (UniqueName: \"kubernetes.io/projected/635aac29-bf3a-4517-bde5-4dd65f084a22-kube-api-access-4bw9v\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jtlrf\" (UID: \"635aac29-bf3a-4517-bde5-4dd65f084a22\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jtlrf" Feb 02 10:49:37 crc kubenswrapper[4901]: I0202 10:49:37.901739 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/635aac29-bf3a-4517-bde5-4dd65f084a22-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jtlrf\" (UID: \"635aac29-bf3a-4517-bde5-4dd65f084a22\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jtlrf" Feb 02 10:49:37 crc kubenswrapper[4901]: I0202 10:49:37.901968 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/635aac29-bf3a-4517-bde5-4dd65f084a22-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jtlrf\" (UID: \"635aac29-bf3a-4517-bde5-4dd65f084a22\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jtlrf" Feb 02 10:49:37 crc kubenswrapper[4901]: I0202 10:49:37.919456 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bw9v\" (UniqueName: \"kubernetes.io/projected/635aac29-bf3a-4517-bde5-4dd65f084a22-kube-api-access-4bw9v\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jtlrf\" (UID: \"635aac29-bf3a-4517-bde5-4dd65f084a22\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jtlrf" Feb 02 10:49:38 crc kubenswrapper[4901]: I0202 10:49:38.052390 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jtlrf" Feb 02 10:49:38 crc kubenswrapper[4901]: I0202 10:49:38.305246 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jtlrf"] Feb 02 10:49:38 crc kubenswrapper[4901]: W0202 10:49:38.309802 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod635aac29_bf3a_4517_bde5_4dd65f084a22.slice/crio-e2ce74e05de3a7a2a08de99d6e9454faeb6ae976425a7b68199b12b9e8bc601a WatchSource:0}: Error finding container e2ce74e05de3a7a2a08de99d6e9454faeb6ae976425a7b68199b12b9e8bc601a: Status 404 returned error can't find the container with id e2ce74e05de3a7a2a08de99d6e9454faeb6ae976425a7b68199b12b9e8bc601a Feb 02 10:49:38 crc kubenswrapper[4901]: I0202 10:49:38.405052 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jtlrf" event={"ID":"635aac29-bf3a-4517-bde5-4dd65f084a22","Type":"ContainerStarted","Data":"e2ce74e05de3a7a2a08de99d6e9454faeb6ae976425a7b68199b12b9e8bc601a"} Feb 02 10:49:39 crc kubenswrapper[4901]: I0202 10:49:39.415398 4901 generic.go:334] "Generic (PLEG): container finished" podID="635aac29-bf3a-4517-bde5-4dd65f084a22" containerID="421a9c01c869e82b211c29ae6b7420a8524c7b4e71fb890aa9eef8bdbe4d0af2" exitCode=0 Feb 02 10:49:39 crc kubenswrapper[4901]: I0202 10:49:39.415478 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jtlrf" event={"ID":"635aac29-bf3a-4517-bde5-4dd65f084a22","Type":"ContainerDied","Data":"421a9c01c869e82b211c29ae6b7420a8524c7b4e71fb890aa9eef8bdbe4d0af2"} Feb 02 10:49:41 crc kubenswrapper[4901]: I0202 10:49:41.430434 4901 generic.go:334] "Generic (PLEG): container finished" podID="635aac29-bf3a-4517-bde5-4dd65f084a22" containerID="0bae65e4a2bcf8fed941d1fa3057bda0e52d7d8a8f748eae8ea3497d7733a94e" exitCode=0 Feb 02 10:49:41 crc kubenswrapper[4901]: I0202 10:49:41.430525 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jtlrf" event={"ID":"635aac29-bf3a-4517-bde5-4dd65f084a22","Type":"ContainerDied","Data":"0bae65e4a2bcf8fed941d1fa3057bda0e52d7d8a8f748eae8ea3497d7733a94e"} Feb 02 10:49:42 crc kubenswrapper[4901]: I0202 10:49:42.442676 4901 generic.go:334] "Generic (PLEG): container finished" podID="635aac29-bf3a-4517-bde5-4dd65f084a22" containerID="3182eb8ef17b0878cf4609cb13c615da329351a653e352c12f6ee47a2749f01a" exitCode=0 Feb 02 10:49:42 crc kubenswrapper[4901]: I0202 10:49:42.442759 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jtlrf" event={"ID":"635aac29-bf3a-4517-bde5-4dd65f084a22","Type":"ContainerDied","Data":"3182eb8ef17b0878cf4609cb13c615da329351a653e352c12f6ee47a2749f01a"} Feb 02 10:49:43 crc kubenswrapper[4901]: I0202 10:49:43.746214 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jtlrf" Feb 02 10:49:43 crc kubenswrapper[4901]: I0202 10:49:43.883372 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bw9v\" (UniqueName: \"kubernetes.io/projected/635aac29-bf3a-4517-bde5-4dd65f084a22-kube-api-access-4bw9v\") pod \"635aac29-bf3a-4517-bde5-4dd65f084a22\" (UID: \"635aac29-bf3a-4517-bde5-4dd65f084a22\") " Feb 02 10:49:43 crc kubenswrapper[4901]: I0202 10:49:43.883544 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/635aac29-bf3a-4517-bde5-4dd65f084a22-util\") pod \"635aac29-bf3a-4517-bde5-4dd65f084a22\" (UID: \"635aac29-bf3a-4517-bde5-4dd65f084a22\") " Feb 02 10:49:43 crc kubenswrapper[4901]: I0202 10:49:43.883801 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/635aac29-bf3a-4517-bde5-4dd65f084a22-bundle\") pod \"635aac29-bf3a-4517-bde5-4dd65f084a22\" (UID: \"635aac29-bf3a-4517-bde5-4dd65f084a22\") " Feb 02 10:49:43 crc kubenswrapper[4901]: I0202 10:49:43.884639 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/635aac29-bf3a-4517-bde5-4dd65f084a22-bundle" (OuterVolumeSpecName: "bundle") pod "635aac29-bf3a-4517-bde5-4dd65f084a22" (UID: "635aac29-bf3a-4517-bde5-4dd65f084a22"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:49:43 crc kubenswrapper[4901]: I0202 10:49:43.890910 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/635aac29-bf3a-4517-bde5-4dd65f084a22-kube-api-access-4bw9v" (OuterVolumeSpecName: "kube-api-access-4bw9v") pod "635aac29-bf3a-4517-bde5-4dd65f084a22" (UID: "635aac29-bf3a-4517-bde5-4dd65f084a22"). InnerVolumeSpecName "kube-api-access-4bw9v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:49:43 crc kubenswrapper[4901]: I0202 10:49:43.984710 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bw9v\" (UniqueName: \"kubernetes.io/projected/635aac29-bf3a-4517-bde5-4dd65f084a22-kube-api-access-4bw9v\") on node \"crc\" DevicePath \"\"" Feb 02 10:49:43 crc kubenswrapper[4901]: I0202 10:49:43.985020 4901 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/635aac29-bf3a-4517-bde5-4dd65f084a22-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:49:44 crc kubenswrapper[4901]: I0202 10:49:44.222070 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/635aac29-bf3a-4517-bde5-4dd65f084a22-util" (OuterVolumeSpecName: "util") pod "635aac29-bf3a-4517-bde5-4dd65f084a22" (UID: "635aac29-bf3a-4517-bde5-4dd65f084a22"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:49:44 crc kubenswrapper[4901]: I0202 10:49:44.289407 4901 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/635aac29-bf3a-4517-bde5-4dd65f084a22-util\") on node \"crc\" DevicePath \"\"" Feb 02 10:49:44 crc kubenswrapper[4901]: I0202 10:49:44.456855 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jtlrf" event={"ID":"635aac29-bf3a-4517-bde5-4dd65f084a22","Type":"ContainerDied","Data":"e2ce74e05de3a7a2a08de99d6e9454faeb6ae976425a7b68199b12b9e8bc601a"} Feb 02 10:49:44 crc kubenswrapper[4901]: I0202 10:49:44.456941 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2ce74e05de3a7a2a08de99d6e9454faeb6ae976425a7b68199b12b9e8bc601a" Feb 02 10:49:44 crc kubenswrapper[4901]: I0202 10:49:44.456895 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jtlrf" Feb 02 10:49:46 crc kubenswrapper[4901]: I0202 10:49:46.161654 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-hqzcm"] Feb 02 10:49:46 crc kubenswrapper[4901]: E0202 10:49:46.161902 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="635aac29-bf3a-4517-bde5-4dd65f084a22" containerName="util" Feb 02 10:49:46 crc kubenswrapper[4901]: I0202 10:49:46.161917 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="635aac29-bf3a-4517-bde5-4dd65f084a22" containerName="util" Feb 02 10:49:46 crc kubenswrapper[4901]: E0202 10:49:46.161929 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="635aac29-bf3a-4517-bde5-4dd65f084a22" containerName="pull" Feb 02 10:49:46 crc kubenswrapper[4901]: I0202 10:49:46.161937 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="635aac29-bf3a-4517-bde5-4dd65f084a22" containerName="pull" Feb 02 10:49:46 crc kubenswrapper[4901]: E0202 10:49:46.161946 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="635aac29-bf3a-4517-bde5-4dd65f084a22" containerName="extract" Feb 02 10:49:46 crc kubenswrapper[4901]: I0202 10:49:46.161957 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="635aac29-bf3a-4517-bde5-4dd65f084a22" containerName="extract" Feb 02 10:49:46 crc kubenswrapper[4901]: I0202 10:49:46.162105 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="635aac29-bf3a-4517-bde5-4dd65f084a22" containerName="extract" Feb 02 10:49:46 crc kubenswrapper[4901]: I0202 10:49:46.162590 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-hqzcm" Feb 02 10:49:46 crc kubenswrapper[4901]: I0202 10:49:46.164982 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 02 10:49:46 crc kubenswrapper[4901]: I0202 10:49:46.165177 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 02 10:49:46 crc kubenswrapper[4901]: I0202 10:49:46.165299 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-ts7zx" Feb 02 10:49:46 crc kubenswrapper[4901]: I0202 10:49:46.172226 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-hqzcm"] Feb 02 10:49:46 crc kubenswrapper[4901]: I0202 10:49:46.214095 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx28j\" (UniqueName: \"kubernetes.io/projected/206aa20d-3302-4ae4-a457-6e89f29a4bf7-kube-api-access-tx28j\") pod \"nmstate-operator-646758c888-hqzcm\" (UID: \"206aa20d-3302-4ae4-a457-6e89f29a4bf7\") " pod="openshift-nmstate/nmstate-operator-646758c888-hqzcm" Feb 02 10:49:46 crc kubenswrapper[4901]: I0202 10:49:46.315331 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx28j\" (UniqueName: \"kubernetes.io/projected/206aa20d-3302-4ae4-a457-6e89f29a4bf7-kube-api-access-tx28j\") pod \"nmstate-operator-646758c888-hqzcm\" (UID: \"206aa20d-3302-4ae4-a457-6e89f29a4bf7\") " pod="openshift-nmstate/nmstate-operator-646758c888-hqzcm" Feb 02 10:49:46 crc kubenswrapper[4901]: I0202 10:49:46.336611 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx28j\" 
(UniqueName: \"kubernetes.io/projected/206aa20d-3302-4ae4-a457-6e89f29a4bf7-kube-api-access-tx28j\") pod \"nmstate-operator-646758c888-hqzcm\" (UID: \"206aa20d-3302-4ae4-a457-6e89f29a4bf7\") " pod="openshift-nmstate/nmstate-operator-646758c888-hqzcm" Feb 02 10:49:46 crc kubenswrapper[4901]: I0202 10:49:46.480424 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-hqzcm" Feb 02 10:49:46 crc kubenswrapper[4901]: I0202 10:49:46.676927 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-hqzcm"] Feb 02 10:49:47 crc kubenswrapper[4901]: I0202 10:49:47.478529 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-hqzcm" event={"ID":"206aa20d-3302-4ae4-a457-6e89f29a4bf7","Type":"ContainerStarted","Data":"a9220a6650f8cc2cc020fec775fcd8953dca42926e365389db192bfb5d2320ea"} Feb 02 10:49:49 crc kubenswrapper[4901]: I0202 10:49:49.491639 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-hqzcm" event={"ID":"206aa20d-3302-4ae4-a457-6e89f29a4bf7","Type":"ContainerStarted","Data":"9171636da092a0082553e5c40f0fe2afa5b8c857207221c2fd066b7ba794f1ea"} Feb 02 10:49:49 crc kubenswrapper[4901]: I0202 10:49:49.515944 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-646758c888-hqzcm" podStartSLOduration=1.417826776 podStartE2EDuration="3.515920912s" podCreationTimestamp="2026-02-02 10:49:46 +0000 UTC" firstStartedPulling="2026-02-02 10:49:46.688592355 +0000 UTC m=+673.706932451" lastFinishedPulling="2026-02-02 10:49:48.786686491 +0000 UTC m=+675.805026587" observedRunningTime="2026-02-02 10:49:49.511725206 +0000 UTC m=+676.530065302" watchObservedRunningTime="2026-02-02 10:49:49.515920912 +0000 UTC m=+676.534261018" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.470502 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-4lrf6"] Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.471633 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-4lrf6" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.476339 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-27zz9" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.484889 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-4lrf6"] Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.488702 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-7w8p9"] Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.489331 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-7w8p9" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.498838 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-ckwfx"] Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.499520 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-ckwfx" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.500008 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.513204 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-7w8p9"] Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.569198 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8vvr\" (UniqueName: \"kubernetes.io/projected/26f249f1-8c84-49ab-b583-948e30dc04f3-kube-api-access-x8vvr\") pod \"nmstate-metrics-54757c584b-4lrf6\" (UID: \"26f249f1-8c84-49ab-b583-948e30dc04f3\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-4lrf6" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.606295 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-sdv4q"] Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.606984 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-sdv4q" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.612914 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.613051 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-5fhvs" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.613173 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.627387 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-sdv4q"] Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.670557 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/9733ec0d-51fa-4932-b0b9-42c3e54bc39e-dbus-socket\") pod \"nmstate-handler-ckwfx\" (UID: \"9733ec0d-51fa-4932-b0b9-42c3e54bc39e\") " pod="openshift-nmstate/nmstate-handler-ckwfx" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.670640 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5f5j\" (UniqueName: \"kubernetes.io/projected/480309f7-ab75-461f-a7bf-075ad02326ca-kube-api-access-x5f5j\") pod \"nmstate-webhook-8474b5b9d8-7w8p9\" (UID: \"480309f7-ab75-461f-a7bf-075ad02326ca\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-7w8p9" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.670721 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/480309f7-ab75-461f-a7bf-075ad02326ca-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-7w8p9\" (UID: \"480309f7-ab75-461f-a7bf-075ad02326ca\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-7w8p9" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.670992 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/9733ec0d-51fa-4932-b0b9-42c3e54bc39e-nmstate-lock\") pod \"nmstate-handler-ckwfx\" (UID: \"9733ec0d-51fa-4932-b0b9-42c3e54bc39e\") " 
pod="openshift-nmstate/nmstate-handler-ckwfx" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.671072 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc62q\" (UniqueName: \"kubernetes.io/projected/9733ec0d-51fa-4932-b0b9-42c3e54bc39e-kube-api-access-qc62q\") pod \"nmstate-handler-ckwfx\" (UID: \"9733ec0d-51fa-4932-b0b9-42c3e54bc39e\") " pod="openshift-nmstate/nmstate-handler-ckwfx" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.671216 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8vvr\" (UniqueName: \"kubernetes.io/projected/26f249f1-8c84-49ab-b583-948e30dc04f3-kube-api-access-x8vvr\") pod \"nmstate-metrics-54757c584b-4lrf6\" (UID: \"26f249f1-8c84-49ab-b583-948e30dc04f3\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-4lrf6" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.671291 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/9733ec0d-51fa-4932-b0b9-42c3e54bc39e-ovs-socket\") pod \"nmstate-handler-ckwfx\" (UID: \"9733ec0d-51fa-4932-b0b9-42c3e54bc39e\") " pod="openshift-nmstate/nmstate-handler-ckwfx" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.696774 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8vvr\" (UniqueName: \"kubernetes.io/projected/26f249f1-8c84-49ab-b583-948e30dc04f3-kube-api-access-x8vvr\") pod \"nmstate-metrics-54757c584b-4lrf6\" (UID: \"26f249f1-8c84-49ab-b583-948e30dc04f3\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-4lrf6" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.773081 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/9733ec0d-51fa-4932-b0b9-42c3e54bc39e-nmstate-lock\") pod \"nmstate-handler-ckwfx\" (UID: \"9733ec0d-51fa-4932-b0b9-42c3e54bc39e\") " pod="openshift-nmstate/nmstate-handler-ckwfx" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.773455 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qc62q\" (UniqueName: \"kubernetes.io/projected/9733ec0d-51fa-4932-b0b9-42c3e54bc39e-kube-api-access-qc62q\") pod \"nmstate-handler-ckwfx\" (UID: \"9733ec0d-51fa-4932-b0b9-42c3e54bc39e\") " pod="openshift-nmstate/nmstate-handler-ckwfx" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.773492 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/037e715d-ec59-4109-9def-cd55e556e9f4-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-sdv4q\" (UID: \"037e715d-ec59-4109-9def-cd55e556e9f4\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-sdv4q" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.773519 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzh2v\" (UniqueName: \"kubernetes.io/projected/037e715d-ec59-4109-9def-cd55e556e9f4-kube-api-access-xzh2v\") pod \"nmstate-console-plugin-7754f76f8b-sdv4q\" (UID: \"037e715d-ec59-4109-9def-cd55e556e9f4\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-sdv4q" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.773599 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: 
\"kubernetes.io/host-path/9733ec0d-51fa-4932-b0b9-42c3e54bc39e-ovs-socket\") pod \"nmstate-handler-ckwfx\" (UID: \"9733ec0d-51fa-4932-b0b9-42c3e54bc39e\") " pod="openshift-nmstate/nmstate-handler-ckwfx" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.773219 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/9733ec0d-51fa-4932-b0b9-42c3e54bc39e-nmstate-lock\") pod \"nmstate-handler-ckwfx\" (UID: \"9733ec0d-51fa-4932-b0b9-42c3e54bc39e\") " pod="openshift-nmstate/nmstate-handler-ckwfx" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.773661 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/9733ec0d-51fa-4932-b0b9-42c3e54bc39e-ovs-socket\") pod \"nmstate-handler-ckwfx\" (UID: \"9733ec0d-51fa-4932-b0b9-42c3e54bc39e\") " pod="openshift-nmstate/nmstate-handler-ckwfx" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.773781 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/9733ec0d-51fa-4932-b0b9-42c3e54bc39e-dbus-socket\") pod \"nmstate-handler-ckwfx\" (UID: \"9733ec0d-51fa-4932-b0b9-42c3e54bc39e\") " pod="openshift-nmstate/nmstate-handler-ckwfx" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.773811 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5f5j\" (UniqueName: \"kubernetes.io/projected/480309f7-ab75-461f-a7bf-075ad02326ca-kube-api-access-x5f5j\") pod \"nmstate-webhook-8474b5b9d8-7w8p9\" (UID: \"480309f7-ab75-461f-a7bf-075ad02326ca\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-7w8p9" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.773875 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/480309f7-ab75-461f-a7bf-075ad02326ca-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-7w8p9\" (UID: \"480309f7-ab75-461f-a7bf-075ad02326ca\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-7w8p9" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.773907 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/037e715d-ec59-4109-9def-cd55e556e9f4-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-sdv4q\" (UID: \"037e715d-ec59-4109-9def-cd55e556e9f4\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-sdv4q" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.774120 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/9733ec0d-51fa-4932-b0b9-42c3e54bc39e-dbus-socket\") pod \"nmstate-handler-ckwfx\" (UID: \"9733ec0d-51fa-4932-b0b9-42c3e54bc39e\") " pod="openshift-nmstate/nmstate-handler-ckwfx" Feb 02 10:49:50 crc kubenswrapper[4901]: E0202 10:49:50.774180 4901 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Feb 02 10:49:50 crc kubenswrapper[4901]: E0202 10:49:50.774222 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/480309f7-ab75-461f-a7bf-075ad02326ca-tls-key-pair podName:480309f7-ab75-461f-a7bf-075ad02326ca nodeName:}" failed. No retries permitted until 2026-02-02 10:49:51.274207558 +0000 UTC m=+678.292547654 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/480309f7-ab75-461f-a7bf-075ad02326ca-tls-key-pair") pod "nmstate-webhook-8474b5b9d8-7w8p9" (UID: "480309f7-ab75-461f-a7bf-075ad02326ca") : secret "openshift-nmstate-webhook" not found Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.796344 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc62q\" (UniqueName: \"kubernetes.io/projected/9733ec0d-51fa-4932-b0b9-42c3e54bc39e-kube-api-access-qc62q\") pod \"nmstate-handler-ckwfx\" (UID: \"9733ec0d-51fa-4932-b0b9-42c3e54bc39e\") " pod="openshift-nmstate/nmstate-handler-ckwfx" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.799304 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-84df885bf7-76l9r"] Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.800275 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-84df885bf7-76l9r" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.800470 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5f5j\" (UniqueName: \"kubernetes.io/projected/480309f7-ab75-461f-a7bf-075ad02326ca-kube-api-access-x5f5j\") pod \"nmstate-webhook-8474b5b9d8-7w8p9\" (UID: \"480309f7-ab75-461f-a7bf-075ad02326ca\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-7w8p9" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.812995 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84df885bf7-76l9r"] Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.822918 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-4lrf6" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.853037 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-ckwfx" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.874837 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/037e715d-ec59-4109-9def-cd55e556e9f4-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-sdv4q\" (UID: \"037e715d-ec59-4109-9def-cd55e556e9f4\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-sdv4q" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.874966 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzh2v\" (UniqueName: \"kubernetes.io/projected/037e715d-ec59-4109-9def-cd55e556e9f4-kube-api-access-xzh2v\") pod \"nmstate-console-plugin-7754f76f8b-sdv4q\" (UID: \"037e715d-ec59-4109-9def-cd55e556e9f4\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-sdv4q" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.875043 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/037e715d-ec59-4109-9def-cd55e556e9f4-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-sdv4q\" (UID: \"037e715d-ec59-4109-9def-cd55e556e9f4\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-sdv4q" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.876330 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/037e715d-ec59-4109-9def-cd55e556e9f4-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-sdv4q\" (UID: \"037e715d-ec59-4109-9def-cd55e556e9f4\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-sdv4q" Feb 02 10:49:50 crc kubenswrapper[4901]: E0202 10:49:50.876485 4901 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Feb 02 10:49:50 crc kubenswrapper[4901]: E0202 10:49:50.876617 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/037e715d-ec59-4109-9def-cd55e556e9f4-plugin-serving-cert podName:037e715d-ec59-4109-9def-cd55e556e9f4 nodeName:}" failed. No retries permitted until 2026-02-02 10:49:51.37659936 +0000 UTC m=+678.394939456 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/037e715d-ec59-4109-9def-cd55e556e9f4-plugin-serving-cert") pod "nmstate-console-plugin-7754f76f8b-sdv4q" (UID: "037e715d-ec59-4109-9def-cd55e556e9f4") : secret "plugin-serving-cert" not found Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.897321 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzh2v\" (UniqueName: \"kubernetes.io/projected/037e715d-ec59-4109-9def-cd55e556e9f4-kube-api-access-xzh2v\") pod \"nmstate-console-plugin-7754f76f8b-sdv4q\" (UID: \"037e715d-ec59-4109-9def-cd55e556e9f4\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-sdv4q" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.976441 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e0c51fc-b37a-4faf-b8c0-d661761e067b-trusted-ca-bundle\") pod \"console-84df885bf7-76l9r\" (UID: \"5e0c51fc-b37a-4faf-b8c0-d661761e067b\") " pod="openshift-console/console-84df885bf7-76l9r" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.976491 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5e0c51fc-b37a-4faf-b8c0-d661761e067b-console-oauth-config\") pod \"console-84df885bf7-76l9r\" (UID: \"5e0c51fc-b37a-4faf-b8c0-d661761e067b\") " pod="openshift-console/console-84df885bf7-76l9r" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.976702 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5e0c51fc-b37a-4faf-b8c0-d661761e067b-service-ca\") pod \"console-84df885bf7-76l9r\" (UID: \"5e0c51fc-b37a-4faf-b8c0-d661761e067b\") " pod="openshift-console/console-84df885bf7-76l9r" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.976748 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8dch\" (UniqueName: \"kubernetes.io/projected/5e0c51fc-b37a-4faf-b8c0-d661761e067b-kube-api-access-f8dch\") pod \"console-84df885bf7-76l9r\" (UID: \"5e0c51fc-b37a-4faf-b8c0-d661761e067b\") " pod="openshift-console/console-84df885bf7-76l9r" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.976789 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5e0c51fc-b37a-4faf-b8c0-d661761e067b-console-serving-cert\") pod \"console-84df885bf7-76l9r\" (UID: \"5e0c51fc-b37a-4faf-b8c0-d661761e067b\") " pod="openshift-console/console-84df885bf7-76l9r" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.976869 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5e0c51fc-b37a-4faf-b8c0-d661761e067b-console-config\") pod \"console-84df885bf7-76l9r\" (UID: \"5e0c51fc-b37a-4faf-b8c0-d661761e067b\") " pod="openshift-console/console-84df885bf7-76l9r" Feb 02 10:49:50 crc kubenswrapper[4901]: I0202 10:49:50.976891 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5e0c51fc-b37a-4faf-b8c0-d661761e067b-oauth-serving-cert\") pod \"console-84df885bf7-76l9r\" (UID: 
\"5e0c51fc-b37a-4faf-b8c0-d661761e067b\") " pod="openshift-console/console-84df885bf7-76l9r" Feb 02 10:49:51 crc kubenswrapper[4901]: I0202 10:49:51.022891 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-4lrf6"] Feb 02 10:49:51 crc kubenswrapper[4901]: W0202 10:49:51.034814 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26f249f1_8c84_49ab_b583_948e30dc04f3.slice/crio-5fac858b2a57dfa8718381d2fead365e119536c4140daab97e197d8cae17b034 WatchSource:0}: Error finding container 5fac858b2a57dfa8718381d2fead365e119536c4140daab97e197d8cae17b034: Status 404 returned error can't find the container with id 5fac858b2a57dfa8718381d2fead365e119536c4140daab97e197d8cae17b034 Feb 02 10:49:51 crc kubenswrapper[4901]: I0202 10:49:51.079090 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e0c51fc-b37a-4faf-b8c0-d661761e067b-trusted-ca-bundle\") pod \"console-84df885bf7-76l9r\" (UID: \"5e0c51fc-b37a-4faf-b8c0-d661761e067b\") " pod="openshift-console/console-84df885bf7-76l9r" Feb 02 10:49:51 crc kubenswrapper[4901]: I0202 10:49:51.079153 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5e0c51fc-b37a-4faf-b8c0-d661761e067b-console-oauth-config\") pod \"console-84df885bf7-76l9r\" (UID: \"5e0c51fc-b37a-4faf-b8c0-d661761e067b\") " pod="openshift-console/console-84df885bf7-76l9r" Feb 02 10:49:51 crc kubenswrapper[4901]: I0202 10:49:51.079239 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5e0c51fc-b37a-4faf-b8c0-d661761e067b-service-ca\") pod \"console-84df885bf7-76l9r\" (UID: \"5e0c51fc-b37a-4faf-b8c0-d661761e067b\") " pod="openshift-console/console-84df885bf7-76l9r" Feb 02 10:49:51 crc kubenswrapper[4901]: I0202 10:49:51.079265 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8dch\" (UniqueName: \"kubernetes.io/projected/5e0c51fc-b37a-4faf-b8c0-d661761e067b-kube-api-access-f8dch\") pod \"console-84df885bf7-76l9r\" (UID: \"5e0c51fc-b37a-4faf-b8c0-d661761e067b\") " pod="openshift-console/console-84df885bf7-76l9r" Feb 02 10:49:51 crc kubenswrapper[4901]: I0202 10:49:51.079292 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5e0c51fc-b37a-4faf-b8c0-d661761e067b-console-serving-cert\") pod \"console-84df885bf7-76l9r\" (UID: \"5e0c51fc-b37a-4faf-b8c0-d661761e067b\") " pod="openshift-console/console-84df885bf7-76l9r" Feb 02 10:49:51 crc kubenswrapper[4901]: I0202 10:49:51.079333 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5e0c51fc-b37a-4faf-b8c0-d661761e067b-console-config\") pod \"console-84df885bf7-76l9r\" (UID: \"5e0c51fc-b37a-4faf-b8c0-d661761e067b\") " pod="openshift-console/console-84df885bf7-76l9r" Feb 02 10:49:51 crc kubenswrapper[4901]: I0202 10:49:51.079356 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5e0c51fc-b37a-4faf-b8c0-d661761e067b-oauth-serving-cert\") pod \"console-84df885bf7-76l9r\" (UID: \"5e0c51fc-b37a-4faf-b8c0-d661761e067b\") " 
pod="openshift-console/console-84df885bf7-76l9r" Feb 02 10:49:51 crc kubenswrapper[4901]: I0202 10:49:51.080180 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5e0c51fc-b37a-4faf-b8c0-d661761e067b-oauth-serving-cert\") pod \"console-84df885bf7-76l9r\" (UID: \"5e0c51fc-b37a-4faf-b8c0-d661761e067b\") " pod="openshift-console/console-84df885bf7-76l9r" Feb 02 10:49:51 crc kubenswrapper[4901]: I0202 10:49:51.080181 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5e0c51fc-b37a-4faf-b8c0-d661761e067b-service-ca\") pod \"console-84df885bf7-76l9r\" (UID: \"5e0c51fc-b37a-4faf-b8c0-d661761e067b\") " pod="openshift-console/console-84df885bf7-76l9r" Feb 02 10:49:51 crc kubenswrapper[4901]: I0202 10:49:51.080293 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5e0c51fc-b37a-4faf-b8c0-d661761e067b-console-config\") pod \"console-84df885bf7-76l9r\" (UID: \"5e0c51fc-b37a-4faf-b8c0-d661761e067b\") " pod="openshift-console/console-84df885bf7-76l9r" Feb 02 10:49:51 crc kubenswrapper[4901]: I0202 10:49:51.080434 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e0c51fc-b37a-4faf-b8c0-d661761e067b-trusted-ca-bundle\") pod \"console-84df885bf7-76l9r\" (UID: \"5e0c51fc-b37a-4faf-b8c0-d661761e067b\") " pod="openshift-console/console-84df885bf7-76l9r" Feb 02 10:49:51 crc kubenswrapper[4901]: I0202 10:49:51.084471 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5e0c51fc-b37a-4faf-b8c0-d661761e067b-console-serving-cert\") pod \"console-84df885bf7-76l9r\" (UID: \"5e0c51fc-b37a-4faf-b8c0-d661761e067b\") " pod="openshift-console/console-84df885bf7-76l9r" Feb 02 10:49:51 crc kubenswrapper[4901]: I0202 10:49:51.085197 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5e0c51fc-b37a-4faf-b8c0-d661761e067b-console-oauth-config\") pod \"console-84df885bf7-76l9r\" (UID: \"5e0c51fc-b37a-4faf-b8c0-d661761e067b\") " pod="openshift-console/console-84df885bf7-76l9r" Feb 02 10:49:51 crc kubenswrapper[4901]: I0202 10:49:51.095982 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8dch\" (UniqueName: \"kubernetes.io/projected/5e0c51fc-b37a-4faf-b8c0-d661761e067b-kube-api-access-f8dch\") pod \"console-84df885bf7-76l9r\" (UID: \"5e0c51fc-b37a-4faf-b8c0-d661761e067b\") " pod="openshift-console/console-84df885bf7-76l9r" Feb 02 10:49:51 crc kubenswrapper[4901]: I0202 10:49:51.200861 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-84df885bf7-76l9r" Feb 02 10:49:51 crc kubenswrapper[4901]: I0202 10:49:51.282019 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/480309f7-ab75-461f-a7bf-075ad02326ca-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-7w8p9\" (UID: \"480309f7-ab75-461f-a7bf-075ad02326ca\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-7w8p9" Feb 02 10:49:51 crc kubenswrapper[4901]: I0202 10:49:51.285414 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/480309f7-ab75-461f-a7bf-075ad02326ca-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-7w8p9\" (UID: \"480309f7-ab75-461f-a7bf-075ad02326ca\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-7w8p9" Feb 02 10:49:51 crc kubenswrapper[4901]: I0202 10:49:51.383645 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/037e715d-ec59-4109-9def-cd55e556e9f4-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-sdv4q\" (UID: \"037e715d-ec59-4109-9def-cd55e556e9f4\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-sdv4q" Feb 02 10:49:51 crc kubenswrapper[4901]: I0202 10:49:51.387066 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/037e715d-ec59-4109-9def-cd55e556e9f4-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-sdv4q\" (UID: \"037e715d-ec59-4109-9def-cd55e556e9f4\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-sdv4q" Feb 02 10:49:51 crc kubenswrapper[4901]: I0202 10:49:51.433259 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-7w8p9" Feb 02 10:49:51 crc kubenswrapper[4901]: I0202 10:49:51.523430 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-4lrf6" event={"ID":"26f249f1-8c84-49ab-b583-948e30dc04f3","Type":"ContainerStarted","Data":"5fac858b2a57dfa8718381d2fead365e119536c4140daab97e197d8cae17b034"} Feb 02 10:49:51 crc kubenswrapper[4901]: I0202 10:49:51.524810 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-sdv4q" Feb 02 10:49:51 crc kubenswrapper[4901]: I0202 10:49:51.526364 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-ckwfx" event={"ID":"9733ec0d-51fa-4932-b0b9-42c3e54bc39e","Type":"ContainerStarted","Data":"9d4c5447645b5e627f2b71398961e9af8809a25af56e916282d93dda5059a8a4"} Feb 02 10:49:51 crc kubenswrapper[4901]: I0202 10:49:51.624303 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84df885bf7-76l9r"] Feb 02 10:49:51 crc kubenswrapper[4901]: I0202 10:49:51.722391 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-sdv4q"] Feb 02 10:49:51 crc kubenswrapper[4901]: I0202 10:49:51.819810 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-7w8p9"] Feb 02 10:49:52 crc kubenswrapper[4901]: I0202 10:49:52.538248 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-7w8p9" event={"ID":"480309f7-ab75-461f-a7bf-075ad02326ca","Type":"ContainerStarted","Data":"c6b173deec00c83b13fe5bf5bdd9a583b12f336ecce230c5dc2d2e24fe239f36"} Feb 02 10:49:52 crc kubenswrapper[4901]: I0202 10:49:52.542449 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84df885bf7-76l9r" event={"ID":"5e0c51fc-b37a-4faf-b8c0-d661761e067b","Type":"ContainerStarted","Data":"7d6880448525e51dd9adc764b02656347668ea6d79205367ef7c904c169ee8ad"} Feb 02 10:49:52 crc kubenswrapper[4901]: I0202 10:49:52.542532 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84df885bf7-76l9r" event={"ID":"5e0c51fc-b37a-4faf-b8c0-d661761e067b","Type":"ContainerStarted","Data":"282e1101ec67f018bcd5b17c1564f7b6d7f08f40c62de3b57e7422ddbdcf0461"} Feb 02 10:49:52 crc kubenswrapper[4901]: I0202 10:49:52.544328 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-sdv4q" event={"ID":"037e715d-ec59-4109-9def-cd55e556e9f4","Type":"ContainerStarted","Data":"f591386273ab127c8fa47bccb86c38a20286f5f3c62ea3923b5866e72fabc63e"} Feb 02 10:49:52 crc kubenswrapper[4901]: I0202 10:49:52.580590 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-84df885bf7-76l9r" podStartSLOduration=2.580526636 podStartE2EDuration="2.580526636s" podCreationTimestamp="2026-02-02 10:49:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:49:52.571183786 +0000 UTC m=+679.589523892" watchObservedRunningTime="2026-02-02 10:49:52.580526636 +0000 UTC m=+679.598866772" Feb 02 10:49:54 crc kubenswrapper[4901]: I0202 10:49:54.558409 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-7w8p9" event={"ID":"480309f7-ab75-461f-a7bf-075ad02326ca","Type":"ContainerStarted","Data":"c43c68b00b109693a3e1a1ac314c9a3701b36450cc9ce653d8af1bfca0b75905"} Feb 02 10:49:54 crc kubenswrapper[4901]: I0202 10:49:54.559547 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-7w8p9" Feb 02 10:49:54 crc kubenswrapper[4901]: I0202 10:49:54.560694 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-4lrf6" 
event={"ID":"26f249f1-8c84-49ab-b583-948e30dc04f3","Type":"ContainerStarted","Data":"058ba4abb889ab6e21fd5200f31b1dadfcec858b7f7e2a1c902f6d4dc3b03100"} Feb 02 10:49:54 crc kubenswrapper[4901]: I0202 10:49:54.561610 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-ckwfx" event={"ID":"9733ec0d-51fa-4932-b0b9-42c3e54bc39e","Type":"ContainerStarted","Data":"9b1f725c4077f31998e4e65f35d0d15346d66a82048a8f0db1f785382bcba845"} Feb 02 10:49:54 crc kubenswrapper[4901]: I0202 10:49:54.561971 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-ckwfx" Feb 02 10:49:54 crc kubenswrapper[4901]: I0202 10:49:54.575419 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-7w8p9" podStartSLOduration=2.719633587 podStartE2EDuration="4.575402226s" podCreationTimestamp="2026-02-02 10:49:50 +0000 UTC" firstStartedPulling="2026-02-02 10:49:51.823338439 +0000 UTC m=+678.841678545" lastFinishedPulling="2026-02-02 10:49:53.679107088 +0000 UTC m=+680.697447184" observedRunningTime="2026-02-02 10:49:54.573689029 +0000 UTC m=+681.592029125" watchObservedRunningTime="2026-02-02 10:49:54.575402226 +0000 UTC m=+681.593742322" Feb 02 10:49:54 crc kubenswrapper[4901]: I0202 10:49:54.593866 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-ckwfx" podStartSLOduration=1.793068259 podStartE2EDuration="4.593830408s" podCreationTimestamp="2026-02-02 10:49:50 +0000 UTC" firstStartedPulling="2026-02-02 10:49:50.876431636 +0000 UTC m=+677.894771732" lastFinishedPulling="2026-02-02 10:49:53.677193785 +0000 UTC m=+680.695533881" observedRunningTime="2026-02-02 10:49:54.592645855 +0000 UTC m=+681.610985971" watchObservedRunningTime="2026-02-02 10:49:54.593830408 +0000 UTC m=+681.612170514" Feb 02 10:49:55 crc kubenswrapper[4901]: I0202 10:49:55.573922 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-sdv4q" event={"ID":"037e715d-ec59-4109-9def-cd55e556e9f4","Type":"ContainerStarted","Data":"0698347e5f7013aa3fd4ac4bb90265f38330f35d00224948bdb3d66d1ef3a031"} Feb 02 10:49:55 crc kubenswrapper[4901]: I0202 10:49:55.597407 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-sdv4q" podStartSLOduration=2.60879996 podStartE2EDuration="5.597378613s" podCreationTimestamp="2026-02-02 10:49:50 +0000 UTC" firstStartedPulling="2026-02-02 10:49:51.727656253 +0000 UTC m=+678.745996349" lastFinishedPulling="2026-02-02 10:49:54.716234906 +0000 UTC m=+681.734575002" observedRunningTime="2026-02-02 10:49:55.5871692 +0000 UTC m=+682.605509296" watchObservedRunningTime="2026-02-02 10:49:55.597378613 +0000 UTC m=+682.615718729" Feb 02 10:49:56 crc kubenswrapper[4901]: I0202 10:49:56.581014 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-4lrf6" event={"ID":"26f249f1-8c84-49ab-b583-948e30dc04f3","Type":"ContainerStarted","Data":"5519c96711a80431288782027c120bdc5f43c816f73e47051cd924fbb7635b93"} Feb 02 10:50:00 crc kubenswrapper[4901]: I0202 10:50:00.875514 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-ckwfx" Feb 02 10:50:00 crc kubenswrapper[4901]: I0202 10:50:00.892815 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-metrics-54757c584b-4lrf6" podStartSLOduration=5.990859715 podStartE2EDuration="10.892797756s" podCreationTimestamp="2026-02-02 10:49:50 +0000 UTC" firstStartedPulling="2026-02-02 10:49:51.037354633 +0000 UTC m=+678.055694729" lastFinishedPulling="2026-02-02 10:49:55.939292674 +0000 UTC m=+682.957632770" observedRunningTime="2026-02-02 10:49:56.606609426 +0000 UTC m=+683.624949542" watchObservedRunningTime="2026-02-02 10:50:00.892797756 +0000 UTC m=+687.911137852" Feb 02 10:50:01 crc kubenswrapper[4901]: I0202 10:50:01.201223 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-84df885bf7-76l9r" Feb 02 10:50:01 crc kubenswrapper[4901]: I0202 10:50:01.201643 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-84df885bf7-76l9r" Feb 02 10:50:01 crc kubenswrapper[4901]: I0202 10:50:01.205612 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-84df885bf7-76l9r" Feb 02 10:50:01 crc kubenswrapper[4901]: I0202 10:50:01.613905 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-84df885bf7-76l9r" Feb 02 10:50:01 crc kubenswrapper[4901]: I0202 10:50:01.699807 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-wb2m4"] Feb 02 10:50:11 crc kubenswrapper[4901]: I0202 10:50:11.440758 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-7w8p9" Feb 02 10:50:25 crc kubenswrapper[4901]: I0202 10:50:25.431295 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc7w9pj"] Feb 02 10:50:25 crc kubenswrapper[4901]: I0202 10:50:25.433411 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc7w9pj" Feb 02 10:50:25 crc kubenswrapper[4901]: I0202 10:50:25.435631 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 02 10:50:25 crc kubenswrapper[4901]: I0202 10:50:25.456036 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc7w9pj"] Feb 02 10:50:25 crc kubenswrapper[4901]: I0202 10:50:25.523515 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f7wn\" (UniqueName: \"kubernetes.io/projected/2b710f66-8d54-4133-ae8f-9a05af592ada-kube-api-access-8f7wn\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc7w9pj\" (UID: \"2b710f66-8d54-4133-ae8f-9a05af592ada\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc7w9pj" Feb 02 10:50:25 crc kubenswrapper[4901]: I0202 10:50:25.523682 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b710f66-8d54-4133-ae8f-9a05af592ada-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc7w9pj\" (UID: \"2b710f66-8d54-4133-ae8f-9a05af592ada\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc7w9pj" Feb 02 10:50:25 crc kubenswrapper[4901]: I0202 10:50:25.523767 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b710f66-8d54-4133-ae8f-9a05af592ada-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc7w9pj\" (UID: \"2b710f66-8d54-4133-ae8f-9a05af592ada\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc7w9pj" Feb 02 10:50:25 crc kubenswrapper[4901]: I0202 10:50:25.625829 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8f7wn\" (UniqueName: \"kubernetes.io/projected/2b710f66-8d54-4133-ae8f-9a05af592ada-kube-api-access-8f7wn\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc7w9pj\" (UID: \"2b710f66-8d54-4133-ae8f-9a05af592ada\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc7w9pj" Feb 02 10:50:25 crc kubenswrapper[4901]: I0202 10:50:25.625943 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b710f66-8d54-4133-ae8f-9a05af592ada-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc7w9pj\" (UID: \"2b710f66-8d54-4133-ae8f-9a05af592ada\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc7w9pj" Feb 02 10:50:25 crc kubenswrapper[4901]: I0202 10:50:25.626018 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b710f66-8d54-4133-ae8f-9a05af592ada-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc7w9pj\" (UID: \"2b710f66-8d54-4133-ae8f-9a05af592ada\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc7w9pj" Feb 02 10:50:25 crc kubenswrapper[4901]: I0202 10:50:25.626934 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/2b710f66-8d54-4133-ae8f-9a05af592ada-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc7w9pj\" (UID: \"2b710f66-8d54-4133-ae8f-9a05af592ada\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc7w9pj" Feb 02 10:50:25 crc kubenswrapper[4901]: I0202 10:50:25.627077 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b710f66-8d54-4133-ae8f-9a05af592ada-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc7w9pj\" (UID: \"2b710f66-8d54-4133-ae8f-9a05af592ada\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc7w9pj" Feb 02 10:50:25 crc kubenswrapper[4901]: I0202 10:50:25.666775 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f7wn\" (UniqueName: \"kubernetes.io/projected/2b710f66-8d54-4133-ae8f-9a05af592ada-kube-api-access-8f7wn\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc7w9pj\" (UID: \"2b710f66-8d54-4133-ae8f-9a05af592ada\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc7w9pj" Feb 02 10:50:25 crc kubenswrapper[4901]: I0202 10:50:25.759145 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc7w9pj" Feb 02 10:50:26 crc kubenswrapper[4901]: I0202 10:50:26.020527 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc7w9pj"] Feb 02 10:50:26 crc kubenswrapper[4901]: I0202 10:50:26.740708 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-wb2m4" podUID="c4c3efa1-9114-4b9b-be8b-045c2c4d7928" containerName="console" containerID="cri-o://f1975159172faeec6e743cde518adc1444c96cf146208514a3c4ed855977e73f" gracePeriod=15 Feb 02 10:50:26 crc kubenswrapper[4901]: I0202 10:50:26.809884 4901 generic.go:334] "Generic (PLEG): container finished" podID="2b710f66-8d54-4133-ae8f-9a05af592ada" containerID="a598ac72c1d2d1d2622581cf7c41926fe87212e62f14f210c08e1767fda63c2a" exitCode=0 Feb 02 10:50:26 crc kubenswrapper[4901]: I0202 10:50:26.809976 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc7w9pj" event={"ID":"2b710f66-8d54-4133-ae8f-9a05af592ada","Type":"ContainerDied","Data":"a598ac72c1d2d1d2622581cf7c41926fe87212e62f14f210c08e1767fda63c2a"} Feb 02 10:50:26 crc kubenswrapper[4901]: I0202 10:50:26.810038 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc7w9pj" event={"ID":"2b710f66-8d54-4133-ae8f-9a05af592ada","Type":"ContainerStarted","Data":"e5dbc78e329f4a33e3467a9149e72645a9d145d64cb318a36bf31c06b2d9a598"} Feb 02 10:50:27 crc kubenswrapper[4901]: I0202 10:50:27.179195 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-wb2m4_c4c3efa1-9114-4b9b-be8b-045c2c4d7928/console/0.log" Feb 02 10:50:27 crc kubenswrapper[4901]: I0202 10:50:27.179274 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-wb2m4" Feb 02 10:50:27 crc kubenswrapper[4901]: I0202 10:50:27.256557 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c4c3efa1-9114-4b9b-be8b-045c2c4d7928-console-serving-cert\") pod \"c4c3efa1-9114-4b9b-be8b-045c2c4d7928\" (UID: \"c4c3efa1-9114-4b9b-be8b-045c2c4d7928\") " Feb 02 10:50:27 crc kubenswrapper[4901]: I0202 10:50:27.256702 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9b29j\" (UniqueName: \"kubernetes.io/projected/c4c3efa1-9114-4b9b-be8b-045c2c4d7928-kube-api-access-9b29j\") pod \"c4c3efa1-9114-4b9b-be8b-045c2c4d7928\" (UID: \"c4c3efa1-9114-4b9b-be8b-045c2c4d7928\") " Feb 02 10:50:27 crc kubenswrapper[4901]: I0202 10:50:27.256741 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4c3efa1-9114-4b9b-be8b-045c2c4d7928-trusted-ca-bundle\") pod \"c4c3efa1-9114-4b9b-be8b-045c2c4d7928\" (UID: \"c4c3efa1-9114-4b9b-be8b-045c2c4d7928\") " Feb 02 10:50:27 crc kubenswrapper[4901]: I0202 10:50:27.256780 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c4c3efa1-9114-4b9b-be8b-045c2c4d7928-oauth-serving-cert\") pod \"c4c3efa1-9114-4b9b-be8b-045c2c4d7928\" (UID: \"c4c3efa1-9114-4b9b-be8b-045c2c4d7928\") " Feb 02 10:50:27 crc kubenswrapper[4901]: I0202 10:50:27.256811 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c4c3efa1-9114-4b9b-be8b-045c2c4d7928-service-ca\") pod \"c4c3efa1-9114-4b9b-be8b-045c2c4d7928\" (UID: \"c4c3efa1-9114-4b9b-be8b-045c2c4d7928\") " Feb 02 10:50:27 crc kubenswrapper[4901]: I0202 10:50:27.256841 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c4c3efa1-9114-4b9b-be8b-045c2c4d7928-console-config\") pod \"c4c3efa1-9114-4b9b-be8b-045c2c4d7928\" (UID: \"c4c3efa1-9114-4b9b-be8b-045c2c4d7928\") " Feb 02 10:50:27 crc kubenswrapper[4901]: I0202 10:50:27.256892 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c4c3efa1-9114-4b9b-be8b-045c2c4d7928-console-oauth-config\") pod \"c4c3efa1-9114-4b9b-be8b-045c2c4d7928\" (UID: \"c4c3efa1-9114-4b9b-be8b-045c2c4d7928\") " Feb 02 10:50:27 crc kubenswrapper[4901]: I0202 10:50:27.257797 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4c3efa1-9114-4b9b-be8b-045c2c4d7928-console-config" (OuterVolumeSpecName: "console-config") pod "c4c3efa1-9114-4b9b-be8b-045c2c4d7928" (UID: "c4c3efa1-9114-4b9b-be8b-045c2c4d7928"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:27 crc kubenswrapper[4901]: I0202 10:50:27.257909 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4c3efa1-9114-4b9b-be8b-045c2c4d7928-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c4c3efa1-9114-4b9b-be8b-045c2c4d7928" (UID: "c4c3efa1-9114-4b9b-be8b-045c2c4d7928"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:27 crc kubenswrapper[4901]: I0202 10:50:27.257952 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4c3efa1-9114-4b9b-be8b-045c2c4d7928-service-ca" (OuterVolumeSpecName: "service-ca") pod "c4c3efa1-9114-4b9b-be8b-045c2c4d7928" (UID: "c4c3efa1-9114-4b9b-be8b-045c2c4d7928"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:27 crc kubenswrapper[4901]: I0202 10:50:27.258404 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4c3efa1-9114-4b9b-be8b-045c2c4d7928-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c4c3efa1-9114-4b9b-be8b-045c2c4d7928" (UID: "c4c3efa1-9114-4b9b-be8b-045c2c4d7928"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:27 crc kubenswrapper[4901]: I0202 10:50:27.263129 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4c3efa1-9114-4b9b-be8b-045c2c4d7928-kube-api-access-9b29j" (OuterVolumeSpecName: "kube-api-access-9b29j") pod "c4c3efa1-9114-4b9b-be8b-045c2c4d7928" (UID: "c4c3efa1-9114-4b9b-be8b-045c2c4d7928"). InnerVolumeSpecName "kube-api-access-9b29j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:50:27 crc kubenswrapper[4901]: I0202 10:50:27.263473 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4c3efa1-9114-4b9b-be8b-045c2c4d7928-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c4c3efa1-9114-4b9b-be8b-045c2c4d7928" (UID: "c4c3efa1-9114-4b9b-be8b-045c2c4d7928"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:50:27 crc kubenswrapper[4901]: I0202 10:50:27.265038 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4c3efa1-9114-4b9b-be8b-045c2c4d7928-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c4c3efa1-9114-4b9b-be8b-045c2c4d7928" (UID: "c4c3efa1-9114-4b9b-be8b-045c2c4d7928"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:50:27 crc kubenswrapper[4901]: I0202 10:50:27.358498 4901 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c4c3efa1-9114-4b9b-be8b-045c2c4d7928-service-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:27 crc kubenswrapper[4901]: I0202 10:50:27.358541 4901 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c4c3efa1-9114-4b9b-be8b-045c2c4d7928-console-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:27 crc kubenswrapper[4901]: I0202 10:50:27.358556 4901 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c4c3efa1-9114-4b9b-be8b-045c2c4d7928-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:27 crc kubenswrapper[4901]: I0202 10:50:27.358598 4901 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c4c3efa1-9114-4b9b-be8b-045c2c4d7928-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:27 crc kubenswrapper[4901]: I0202 10:50:27.358645 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9b29j\" (UniqueName: \"kubernetes.io/projected/c4c3efa1-9114-4b9b-be8b-045c2c4d7928-kube-api-access-9b29j\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:27 crc kubenswrapper[4901]: I0202 10:50:27.358658 4901 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4c3efa1-9114-4b9b-be8b-045c2c4d7928-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:27 crc kubenswrapper[4901]: I0202 10:50:27.358671 4901 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c4c3efa1-9114-4b9b-be8b-045c2c4d7928-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:27 crc kubenswrapper[4901]: I0202 10:50:27.818835 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-wb2m4_c4c3efa1-9114-4b9b-be8b-045c2c4d7928/console/0.log" Feb 02 10:50:27 crc kubenswrapper[4901]: I0202 10:50:27.819327 4901 generic.go:334] "Generic (PLEG): container finished" podID="c4c3efa1-9114-4b9b-be8b-045c2c4d7928" containerID="f1975159172faeec6e743cde518adc1444c96cf146208514a3c4ed855977e73f" exitCode=2 Feb 02 10:50:27 crc kubenswrapper[4901]: I0202 10:50:27.819362 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wb2m4" event={"ID":"c4c3efa1-9114-4b9b-be8b-045c2c4d7928","Type":"ContainerDied","Data":"f1975159172faeec6e743cde518adc1444c96cf146208514a3c4ed855977e73f"} Feb 02 10:50:27 crc kubenswrapper[4901]: I0202 10:50:27.819395 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wb2m4" event={"ID":"c4c3efa1-9114-4b9b-be8b-045c2c4d7928","Type":"ContainerDied","Data":"de6d26fbbac9720279ab508d5b64034d462c7d80911d741857e8efd3a2caf100"} Feb 02 10:50:27 crc kubenswrapper[4901]: I0202 10:50:27.819415 4901 scope.go:117] "RemoveContainer" containerID="f1975159172faeec6e743cde518adc1444c96cf146208514a3c4ed855977e73f" Feb 02 10:50:27 crc kubenswrapper[4901]: I0202 10:50:27.819537 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-wb2m4" Feb 02 10:50:27 crc kubenswrapper[4901]: I0202 10:50:27.846089 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-wb2m4"] Feb 02 10:50:27 crc kubenswrapper[4901]: I0202 10:50:27.853124 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-wb2m4"] Feb 02 10:50:27 crc kubenswrapper[4901]: I0202 10:50:27.917463 4901 scope.go:117] "RemoveContainer" containerID="f1975159172faeec6e743cde518adc1444c96cf146208514a3c4ed855977e73f" Feb 02 10:50:27 crc kubenswrapper[4901]: E0202 10:50:27.918415 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1975159172faeec6e743cde518adc1444c96cf146208514a3c4ed855977e73f\": container with ID starting with f1975159172faeec6e743cde518adc1444c96cf146208514a3c4ed855977e73f not found: ID does not exist" containerID="f1975159172faeec6e743cde518adc1444c96cf146208514a3c4ed855977e73f" Feb 02 10:50:27 crc kubenswrapper[4901]: I0202 10:50:27.918483 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1975159172faeec6e743cde518adc1444c96cf146208514a3c4ed855977e73f"} err="failed to get container status \"f1975159172faeec6e743cde518adc1444c96cf146208514a3c4ed855977e73f\": rpc error: code = NotFound desc = could not find container \"f1975159172faeec6e743cde518adc1444c96cf146208514a3c4ed855977e73f\": container with ID starting with f1975159172faeec6e743cde518adc1444c96cf146208514a3c4ed855977e73f not found: ID does not exist" Feb 02 10:50:28 crc kubenswrapper[4901]: I0202 10:50:28.831916 4901 generic.go:334] "Generic (PLEG): container finished" podID="2b710f66-8d54-4133-ae8f-9a05af592ada" containerID="86c91c54bc12a3212179c536031f5023a1feb305bd19fbf4ad9877fa53afc9b8" exitCode=0 Feb 02 10:50:28 crc kubenswrapper[4901]: I0202 10:50:28.831988 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc7w9pj" event={"ID":"2b710f66-8d54-4133-ae8f-9a05af592ada","Type":"ContainerDied","Data":"86c91c54bc12a3212179c536031f5023a1feb305bd19fbf4ad9877fa53afc9b8"} Feb 02 10:50:29 crc kubenswrapper[4901]: I0202 10:50:29.685759 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4c3efa1-9114-4b9b-be8b-045c2c4d7928" path="/var/lib/kubelet/pods/c4c3efa1-9114-4b9b-be8b-045c2c4d7928/volumes" Feb 02 10:50:29 crc kubenswrapper[4901]: I0202 10:50:29.842658 4901 generic.go:334] "Generic (PLEG): container finished" podID="2b710f66-8d54-4133-ae8f-9a05af592ada" containerID="da0ab2afb011c7ad7e8b3398902534680e15d227f84c0280a22d633dd0ea0d45" exitCode=0 Feb 02 10:50:29 crc kubenswrapper[4901]: I0202 10:50:29.842690 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc7w9pj" event={"ID":"2b710f66-8d54-4133-ae8f-9a05af592ada","Type":"ContainerDied","Data":"da0ab2afb011c7ad7e8b3398902534680e15d227f84c0280a22d633dd0ea0d45"} Feb 02 10:50:31 crc kubenswrapper[4901]: I0202 10:50:31.171732 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc7w9pj" Feb 02 10:50:31 crc kubenswrapper[4901]: I0202 10:50:31.317467 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8f7wn\" (UniqueName: \"kubernetes.io/projected/2b710f66-8d54-4133-ae8f-9a05af592ada-kube-api-access-8f7wn\") pod \"2b710f66-8d54-4133-ae8f-9a05af592ada\" (UID: \"2b710f66-8d54-4133-ae8f-9a05af592ada\") " Feb 02 10:50:31 crc kubenswrapper[4901]: I0202 10:50:31.317682 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b710f66-8d54-4133-ae8f-9a05af592ada-bundle\") pod \"2b710f66-8d54-4133-ae8f-9a05af592ada\" (UID: \"2b710f66-8d54-4133-ae8f-9a05af592ada\") " Feb 02 10:50:31 crc kubenswrapper[4901]: I0202 10:50:31.317753 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b710f66-8d54-4133-ae8f-9a05af592ada-util\") pod \"2b710f66-8d54-4133-ae8f-9a05af592ada\" (UID: \"2b710f66-8d54-4133-ae8f-9a05af592ada\") " Feb 02 10:50:31 crc kubenswrapper[4901]: I0202 10:50:31.319249 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b710f66-8d54-4133-ae8f-9a05af592ada-bundle" (OuterVolumeSpecName: "bundle") pod "2b710f66-8d54-4133-ae8f-9a05af592ada" (UID: "2b710f66-8d54-4133-ae8f-9a05af592ada"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:50:31 crc kubenswrapper[4901]: I0202 10:50:31.331583 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b710f66-8d54-4133-ae8f-9a05af592ada-kube-api-access-8f7wn" (OuterVolumeSpecName: "kube-api-access-8f7wn") pod "2b710f66-8d54-4133-ae8f-9a05af592ada" (UID: "2b710f66-8d54-4133-ae8f-9a05af592ada"). InnerVolumeSpecName "kube-api-access-8f7wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:50:31 crc kubenswrapper[4901]: I0202 10:50:31.335547 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b710f66-8d54-4133-ae8f-9a05af592ada-util" (OuterVolumeSpecName: "util") pod "2b710f66-8d54-4133-ae8f-9a05af592ada" (UID: "2b710f66-8d54-4133-ae8f-9a05af592ada"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:50:31 crc kubenswrapper[4901]: I0202 10:50:31.420272 4901 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b710f66-8d54-4133-ae8f-9a05af592ada-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:31 crc kubenswrapper[4901]: I0202 10:50:31.420335 4901 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b710f66-8d54-4133-ae8f-9a05af592ada-util\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:31 crc kubenswrapper[4901]: I0202 10:50:31.420360 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8f7wn\" (UniqueName: \"kubernetes.io/projected/2b710f66-8d54-4133-ae8f-9a05af592ada-kube-api-access-8f7wn\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:31 crc kubenswrapper[4901]: I0202 10:50:31.860797 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc7w9pj" event={"ID":"2b710f66-8d54-4133-ae8f-9a05af592ada","Type":"ContainerDied","Data":"e5dbc78e329f4a33e3467a9149e72645a9d145d64cb318a36bf31c06b2d9a598"} Feb 02 10:50:31 crc kubenswrapper[4901]: I0202 10:50:31.860850 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5dbc78e329f4a33e3467a9149e72645a9d145d64cb318a36bf31c06b2d9a598" Feb 02 10:50:31 crc kubenswrapper[4901]: I0202 10:50:31.860913 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc7w9pj" Feb 02 10:50:41 crc kubenswrapper[4901]: I0202 10:50:41.269709 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6fc654b48b-g84wt"] Feb 02 10:50:41 crc kubenswrapper[4901]: E0202 10:50:41.270436 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b710f66-8d54-4133-ae8f-9a05af592ada" containerName="util" Feb 02 10:50:41 crc kubenswrapper[4901]: I0202 10:50:41.270450 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b710f66-8d54-4133-ae8f-9a05af592ada" containerName="util" Feb 02 10:50:41 crc kubenswrapper[4901]: E0202 10:50:41.270459 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4c3efa1-9114-4b9b-be8b-045c2c4d7928" containerName="console" Feb 02 10:50:41 crc kubenswrapper[4901]: I0202 10:50:41.270465 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4c3efa1-9114-4b9b-be8b-045c2c4d7928" containerName="console" Feb 02 10:50:41 crc kubenswrapper[4901]: E0202 10:50:41.270475 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b710f66-8d54-4133-ae8f-9a05af592ada" containerName="pull" Feb 02 10:50:41 crc kubenswrapper[4901]: I0202 10:50:41.270481 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b710f66-8d54-4133-ae8f-9a05af592ada" containerName="pull" Feb 02 10:50:41 crc kubenswrapper[4901]: E0202 10:50:41.270497 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b710f66-8d54-4133-ae8f-9a05af592ada" containerName="extract" Feb 02 10:50:41 crc kubenswrapper[4901]: I0202 10:50:41.270502 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b710f66-8d54-4133-ae8f-9a05af592ada" containerName="extract" Feb 02 10:50:41 crc kubenswrapper[4901]: I0202 10:50:41.270649 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b710f66-8d54-4133-ae8f-9a05af592ada" containerName="extract" Feb 
02 10:50:41 crc kubenswrapper[4901]: I0202 10:50:41.270671 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4c3efa1-9114-4b9b-be8b-045c2c4d7928" containerName="console" Feb 02 10:50:41 crc kubenswrapper[4901]: I0202 10:50:41.271089 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6fc654b48b-g84wt" Feb 02 10:50:41 crc kubenswrapper[4901]: I0202 10:50:41.274208 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 02 10:50:41 crc kubenswrapper[4901]: I0202 10:50:41.274491 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 02 10:50:41 crc kubenswrapper[4901]: I0202 10:50:41.274801 4901 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 02 10:50:41 crc kubenswrapper[4901]: I0202 10:50:41.274994 4901 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-f6x7b" Feb 02 10:50:41 crc kubenswrapper[4901]: I0202 10:50:41.275979 4901 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 02 10:50:41 crc kubenswrapper[4901]: I0202 10:50:41.288291 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6fc654b48b-g84wt"] Feb 02 10:50:41 crc kubenswrapper[4901]: I0202 10:50:41.360734 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9dcb06dd-41b2-459b-9fe0-b55fc062f05c-apiservice-cert\") pod \"metallb-operator-controller-manager-6fc654b48b-g84wt\" (UID: \"9dcb06dd-41b2-459b-9fe0-b55fc062f05c\") " pod="metallb-system/metallb-operator-controller-manager-6fc654b48b-g84wt" Feb 02 10:50:41 crc kubenswrapper[4901]: I0202 10:50:41.360826 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrt9p\" (UniqueName: \"kubernetes.io/projected/9dcb06dd-41b2-459b-9fe0-b55fc062f05c-kube-api-access-nrt9p\") pod \"metallb-operator-controller-manager-6fc654b48b-g84wt\" (UID: \"9dcb06dd-41b2-459b-9fe0-b55fc062f05c\") " pod="metallb-system/metallb-operator-controller-manager-6fc654b48b-g84wt" Feb 02 10:50:41 crc kubenswrapper[4901]: I0202 10:50:41.360941 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9dcb06dd-41b2-459b-9fe0-b55fc062f05c-webhook-cert\") pod \"metallb-operator-controller-manager-6fc654b48b-g84wt\" (UID: \"9dcb06dd-41b2-459b-9fe0-b55fc062f05c\") " pod="metallb-system/metallb-operator-controller-manager-6fc654b48b-g84wt" Feb 02 10:50:41 crc kubenswrapper[4901]: I0202 10:50:41.462643 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9dcb06dd-41b2-459b-9fe0-b55fc062f05c-webhook-cert\") pod \"metallb-operator-controller-manager-6fc654b48b-g84wt\" (UID: \"9dcb06dd-41b2-459b-9fe0-b55fc062f05c\") " pod="metallb-system/metallb-operator-controller-manager-6fc654b48b-g84wt" Feb 02 10:50:41 crc kubenswrapper[4901]: I0202 10:50:41.463009 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/9dcb06dd-41b2-459b-9fe0-b55fc062f05c-apiservice-cert\") pod \"metallb-operator-controller-manager-6fc654b48b-g84wt\" (UID: \"9dcb06dd-41b2-459b-9fe0-b55fc062f05c\") " pod="metallb-system/metallb-operator-controller-manager-6fc654b48b-g84wt" Feb 02 10:50:41 crc kubenswrapper[4901]: I0202 10:50:41.463091 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrt9p\" (UniqueName: \"kubernetes.io/projected/9dcb06dd-41b2-459b-9fe0-b55fc062f05c-kube-api-access-nrt9p\") pod \"metallb-operator-controller-manager-6fc654b48b-g84wt\" (UID: \"9dcb06dd-41b2-459b-9fe0-b55fc062f05c\") " pod="metallb-system/metallb-operator-controller-manager-6fc654b48b-g84wt" Feb 02 10:50:41 crc kubenswrapper[4901]: I0202 10:50:41.471391 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9dcb06dd-41b2-459b-9fe0-b55fc062f05c-webhook-cert\") pod \"metallb-operator-controller-manager-6fc654b48b-g84wt\" (UID: \"9dcb06dd-41b2-459b-9fe0-b55fc062f05c\") " pod="metallb-system/metallb-operator-controller-manager-6fc654b48b-g84wt" Feb 02 10:50:41 crc kubenswrapper[4901]: I0202 10:50:41.478598 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9dcb06dd-41b2-459b-9fe0-b55fc062f05c-apiservice-cert\") pod \"metallb-operator-controller-manager-6fc654b48b-g84wt\" (UID: \"9dcb06dd-41b2-459b-9fe0-b55fc062f05c\") " pod="metallb-system/metallb-operator-controller-manager-6fc654b48b-g84wt" Feb 02 10:50:41 crc kubenswrapper[4901]: I0202 10:50:41.484927 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrt9p\" (UniqueName: \"kubernetes.io/projected/9dcb06dd-41b2-459b-9fe0-b55fc062f05c-kube-api-access-nrt9p\") pod \"metallb-operator-controller-manager-6fc654b48b-g84wt\" (UID: \"9dcb06dd-41b2-459b-9fe0-b55fc062f05c\") " pod="metallb-system/metallb-operator-controller-manager-6fc654b48b-g84wt" Feb 02 10:50:41 crc kubenswrapper[4901]: I0202 10:50:41.593605 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6fc654b48b-g84wt" Feb 02 10:50:41 crc kubenswrapper[4901]: I0202 10:50:41.623230 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-8c4f569b7-2bbgt"] Feb 02 10:50:41 crc kubenswrapper[4901]: I0202 10:50:41.623942 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-8c4f569b7-2bbgt" Feb 02 10:50:41 crc kubenswrapper[4901]: I0202 10:50:41.625728 4901 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 02 10:50:41 crc kubenswrapper[4901]: I0202 10:50:41.627026 4901 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-rv8qg" Feb 02 10:50:41 crc kubenswrapper[4901]: I0202 10:50:41.627691 4901 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 02 10:50:41 crc kubenswrapper[4901]: I0202 10:50:41.643257 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-8c4f569b7-2bbgt"] Feb 02 10:50:41 crc kubenswrapper[4901]: I0202 10:50:41.776980 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w25f6\" (UniqueName: \"kubernetes.io/projected/a5d5e21a-619a-4ae6-b36e-bb61a555a29b-kube-api-access-w25f6\") pod \"metallb-operator-webhook-server-8c4f569b7-2bbgt\" (UID: \"a5d5e21a-619a-4ae6-b36e-bb61a555a29b\") " pod="metallb-system/metallb-operator-webhook-server-8c4f569b7-2bbgt" Feb 02 10:50:41 crc kubenswrapper[4901]: I0202 10:50:41.777056 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a5d5e21a-619a-4ae6-b36e-bb61a555a29b-webhook-cert\") pod \"metallb-operator-webhook-server-8c4f569b7-2bbgt\" (UID: \"a5d5e21a-619a-4ae6-b36e-bb61a555a29b\") " pod="metallb-system/metallb-operator-webhook-server-8c4f569b7-2bbgt" Feb 02 10:50:41 crc kubenswrapper[4901]: I0202 10:50:41.777104 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a5d5e21a-619a-4ae6-b36e-bb61a555a29b-apiservice-cert\") pod \"metallb-operator-webhook-server-8c4f569b7-2bbgt\" (UID: \"a5d5e21a-619a-4ae6-b36e-bb61a555a29b\") " pod="metallb-system/metallb-operator-webhook-server-8c4f569b7-2bbgt" Feb 02 10:50:41 crc kubenswrapper[4901]: I0202 10:50:41.879439 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w25f6\" (UniqueName: \"kubernetes.io/projected/a5d5e21a-619a-4ae6-b36e-bb61a555a29b-kube-api-access-w25f6\") pod \"metallb-operator-webhook-server-8c4f569b7-2bbgt\" (UID: \"a5d5e21a-619a-4ae6-b36e-bb61a555a29b\") " pod="metallb-system/metallb-operator-webhook-server-8c4f569b7-2bbgt" Feb 02 10:50:41 crc kubenswrapper[4901]: I0202 10:50:41.879867 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a5d5e21a-619a-4ae6-b36e-bb61a555a29b-webhook-cert\") pod \"metallb-operator-webhook-server-8c4f569b7-2bbgt\" (UID: \"a5d5e21a-619a-4ae6-b36e-bb61a555a29b\") " pod="metallb-system/metallb-operator-webhook-server-8c4f569b7-2bbgt" Feb 02 10:50:41 crc kubenswrapper[4901]: I0202 10:50:41.879897 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a5d5e21a-619a-4ae6-b36e-bb61a555a29b-apiservice-cert\") pod \"metallb-operator-webhook-server-8c4f569b7-2bbgt\" (UID: \"a5d5e21a-619a-4ae6-b36e-bb61a555a29b\") " pod="metallb-system/metallb-operator-webhook-server-8c4f569b7-2bbgt" Feb 02 10:50:41 crc kubenswrapper[4901]: I0202 10:50:41.888405 4901 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a5d5e21a-619a-4ae6-b36e-bb61a555a29b-apiservice-cert\") pod \"metallb-operator-webhook-server-8c4f569b7-2bbgt\" (UID: \"a5d5e21a-619a-4ae6-b36e-bb61a555a29b\") " pod="metallb-system/metallb-operator-webhook-server-8c4f569b7-2bbgt" Feb 02 10:50:41 crc kubenswrapper[4901]: I0202 10:50:41.902234 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a5d5e21a-619a-4ae6-b36e-bb61a555a29b-webhook-cert\") pod \"metallb-operator-webhook-server-8c4f569b7-2bbgt\" (UID: \"a5d5e21a-619a-4ae6-b36e-bb61a555a29b\") " pod="metallb-system/metallb-operator-webhook-server-8c4f569b7-2bbgt" Feb 02 10:50:41 crc kubenswrapper[4901]: I0202 10:50:41.907830 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6fc654b48b-g84wt"] Feb 02 10:50:41 crc kubenswrapper[4901]: I0202 10:50:41.923501 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w25f6\" (UniqueName: \"kubernetes.io/projected/a5d5e21a-619a-4ae6-b36e-bb61a555a29b-kube-api-access-w25f6\") pod \"metallb-operator-webhook-server-8c4f569b7-2bbgt\" (UID: \"a5d5e21a-619a-4ae6-b36e-bb61a555a29b\") " pod="metallb-system/metallb-operator-webhook-server-8c4f569b7-2bbgt" Feb 02 10:50:41 crc kubenswrapper[4901]: I0202 10:50:41.938790 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6fc654b48b-g84wt" event={"ID":"9dcb06dd-41b2-459b-9fe0-b55fc062f05c","Type":"ContainerStarted","Data":"daaaf98a58ec3d38ad29c1edf745b273b87dac3ed7d703506587673e4ea7d8f6"} Feb 02 10:50:41 crc kubenswrapper[4901]: I0202 10:50:41.990335 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-8c4f569b7-2bbgt" Feb 02 10:50:42 crc kubenswrapper[4901]: I0202 10:50:42.234970 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-8c4f569b7-2bbgt"] Feb 02 10:50:42 crc kubenswrapper[4901]: W0202 10:50:42.240504 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5d5e21a_619a_4ae6_b36e_bb61a555a29b.slice/crio-f23bb024ed89ac67ac98f62a617b61b5c4b9b01cf9dce22b3a4f02ebe9341c0b WatchSource:0}: Error finding container f23bb024ed89ac67ac98f62a617b61b5c4b9b01cf9dce22b3a4f02ebe9341c0b: Status 404 returned error can't find the container with id f23bb024ed89ac67ac98f62a617b61b5c4b9b01cf9dce22b3a4f02ebe9341c0b Feb 02 10:50:42 crc kubenswrapper[4901]: I0202 10:50:42.948850 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-8c4f569b7-2bbgt" event={"ID":"a5d5e21a-619a-4ae6-b36e-bb61a555a29b","Type":"ContainerStarted","Data":"f23bb024ed89ac67ac98f62a617b61b5c4b9b01cf9dce22b3a4f02ebe9341c0b"} Feb 02 10:50:47 crc kubenswrapper[4901]: I0202 10:50:47.028868 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6fc654b48b-g84wt" event={"ID":"9dcb06dd-41b2-459b-9fe0-b55fc062f05c","Type":"ContainerStarted","Data":"0ff5af44ffdec93d9bde85c9bb83981bc540f6ff418a8b272661d55eb8948536"} Feb 02 10:50:47 crc kubenswrapper[4901]: I0202 10:50:47.029375 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6fc654b48b-g84wt" Feb 02 10:50:47 crc kubenswrapper[4901]: I0202 10:50:47.031075 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-8c4f569b7-2bbgt" event={"ID":"a5d5e21a-619a-4ae6-b36e-bb61a555a29b","Type":"ContainerStarted","Data":"6227dbe91d947451da17fa8e386aa086a34a43ff7c7085003b653ee442ebeb28"} Feb 02 10:50:47 crc kubenswrapper[4901]: I0202 10:50:47.031206 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-8c4f569b7-2bbgt" Feb 02 10:50:47 crc kubenswrapper[4901]: I0202 10:50:47.048697 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6fc654b48b-g84wt" podStartSLOduration=1.314496716 podStartE2EDuration="6.048677298s" podCreationTimestamp="2026-02-02 10:50:41 +0000 UTC" firstStartedPulling="2026-02-02 10:50:41.930250155 +0000 UTC m=+728.948590251" lastFinishedPulling="2026-02-02 10:50:46.664430727 +0000 UTC m=+733.682770833" observedRunningTime="2026-02-02 10:50:47.044950342 +0000 UTC m=+734.063290448" watchObservedRunningTime="2026-02-02 10:50:47.048677298 +0000 UTC m=+734.067017394" Feb 02 10:50:47 crc kubenswrapper[4901]: I0202 10:50:47.060586 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-8c4f569b7-2bbgt" podStartSLOduration=1.620639671 podStartE2EDuration="6.060543492s" podCreationTimestamp="2026-02-02 10:50:41 +0000 UTC" firstStartedPulling="2026-02-02 10:50:42.243977895 +0000 UTC m=+729.262317991" lastFinishedPulling="2026-02-02 10:50:46.683881696 +0000 UTC m=+733.702221812" observedRunningTime="2026-02-02 10:50:47.058885019 +0000 UTC m=+734.077225125" watchObservedRunningTime="2026-02-02 10:50:47.060543492 +0000 UTC m=+734.078883588" Feb 
02 10:51:01 crc kubenswrapper[4901]: I0202 10:51:01.998297 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-8c4f569b7-2bbgt" Feb 02 10:51:10 crc kubenswrapper[4901]: I0202 10:51:10.537774 4901 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 02 10:51:21 crc kubenswrapper[4901]: I0202 10:51:21.596354 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6fc654b48b-g84wt" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.381218 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-rr5bw"] Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.381943 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-rr5bw" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.384309 4901 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.384418 4901 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-tc2dh" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.398239 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-rr5bw"] Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.406818 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9h8s\" (UniqueName: \"kubernetes.io/projected/7389b305-4824-4f3b-820e-9466214ea9b1-kube-api-access-w9h8s\") pod \"frr-k8s-webhook-server-7df86c4f6c-rr5bw\" (UID: \"7389b305-4824-4f3b-820e-9466214ea9b1\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-rr5bw" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.406887 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7389b305-4824-4f3b-820e-9466214ea9b1-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-rr5bw\" (UID: \"7389b305-4824-4f3b-820e-9466214ea9b1\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-rr5bw" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.411573 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-5c4zt"] Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.413711 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-5c4zt" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.415532 4901 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.415601 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.488579 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-dppcw"] Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.489558 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-dppcw" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.495025 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6968d8fdc4-5ds9w"] Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.496066 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6968d8fdc4-5ds9w" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.497837 4901 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.497994 4901 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.497995 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.498130 4901 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.498212 4901 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-dk4zj" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.507812 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5c595fcf-9396-4426-bff4-84cd6eda9dc5-memberlist\") pod \"speaker-dppcw\" (UID: \"5c595fcf-9396-4426-bff4-84cd6eda9dc5\") " pod="metallb-system/speaker-dppcw" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.507928 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9h8s\" (UniqueName: \"kubernetes.io/projected/7389b305-4824-4f3b-820e-9466214ea9b1-kube-api-access-w9h8s\") pod \"frr-k8s-webhook-server-7df86c4f6c-rr5bw\" (UID: \"7389b305-4824-4f3b-820e-9466214ea9b1\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-rr5bw" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.507963 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/274b5333-5608-463a-a844-de0f51548386-frr-conf\") pod \"frr-k8s-5c4zt\" (UID: \"274b5333-5608-463a-a844-de0f51548386\") " pod="metallb-system/frr-k8s-5c4zt" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.507993 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnhsf\" (UniqueName: \"kubernetes.io/projected/1c6a7816-6119-496c-9298-fb8121c9e38f-kube-api-access-bnhsf\") pod \"controller-6968d8fdc4-5ds9w\" (UID: \"1c6a7816-6119-496c-9298-fb8121c9e38f\") " pod="metallb-system/controller-6968d8fdc4-5ds9w" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.508033 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/274b5333-5608-463a-a844-de0f51548386-reloader\") pod \"frr-k8s-5c4zt\" (UID: \"274b5333-5608-463a-a844-de0f51548386\") " pod="metallb-system/frr-k8s-5c4zt" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.508056 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/274b5333-5608-463a-a844-de0f51548386-metrics-certs\") pod \"frr-k8s-5c4zt\" (UID: 
\"274b5333-5608-463a-a844-de0f51548386\") " pod="metallb-system/frr-k8s-5c4zt" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.508082 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7389b305-4824-4f3b-820e-9466214ea9b1-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-rr5bw\" (UID: \"7389b305-4824-4f3b-820e-9466214ea9b1\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-rr5bw" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.508107 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/274b5333-5608-463a-a844-de0f51548386-metrics\") pod \"frr-k8s-5c4zt\" (UID: \"274b5333-5608-463a-a844-de0f51548386\") " pod="metallb-system/frr-k8s-5c4zt" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.508125 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c595fcf-9396-4426-bff4-84cd6eda9dc5-metrics-certs\") pod \"speaker-dppcw\" (UID: \"5c595fcf-9396-4426-bff4-84cd6eda9dc5\") " pod="metallb-system/speaker-dppcw" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.508158 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/274b5333-5608-463a-a844-de0f51548386-frr-sockets\") pod \"frr-k8s-5c4zt\" (UID: \"274b5333-5608-463a-a844-de0f51548386\") " pod="metallb-system/frr-k8s-5c4zt" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.508182 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgb7t\" (UniqueName: \"kubernetes.io/projected/5c595fcf-9396-4426-bff4-84cd6eda9dc5-kube-api-access-sgb7t\") pod \"speaker-dppcw\" (UID: \"5c595fcf-9396-4426-bff4-84cd6eda9dc5\") " pod="metallb-system/speaker-dppcw" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.508199 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5c595fcf-9396-4426-bff4-84cd6eda9dc5-metallb-excludel2\") pod \"speaker-dppcw\" (UID: \"5c595fcf-9396-4426-bff4-84cd6eda9dc5\") " pod="metallb-system/speaker-dppcw" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.508285 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c6a7816-6119-496c-9298-fb8121c9e38f-cert\") pod \"controller-6968d8fdc4-5ds9w\" (UID: \"1c6a7816-6119-496c-9298-fb8121c9e38f\") " pod="metallb-system/controller-6968d8fdc4-5ds9w" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.508384 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh6xh\" (UniqueName: \"kubernetes.io/projected/274b5333-5608-463a-a844-de0f51548386-kube-api-access-wh6xh\") pod \"frr-k8s-5c4zt\" (UID: \"274b5333-5608-463a-a844-de0f51548386\") " pod="metallb-system/frr-k8s-5c4zt" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.508423 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/274b5333-5608-463a-a844-de0f51548386-frr-startup\") pod \"frr-k8s-5c4zt\" (UID: \"274b5333-5608-463a-a844-de0f51548386\") " pod="metallb-system/frr-k8s-5c4zt" Feb 02 
10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.508441 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c6a7816-6119-496c-9298-fb8121c9e38f-metrics-certs\") pod \"controller-6968d8fdc4-5ds9w\" (UID: \"1c6a7816-6119-496c-9298-fb8121c9e38f\") " pod="metallb-system/controller-6968d8fdc4-5ds9w" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.515230 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7389b305-4824-4f3b-820e-9466214ea9b1-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-rr5bw\" (UID: \"7389b305-4824-4f3b-820e-9466214ea9b1\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-rr5bw" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.519588 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-5ds9w"] Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.552202 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9h8s\" (UniqueName: \"kubernetes.io/projected/7389b305-4824-4f3b-820e-9466214ea9b1-kube-api-access-w9h8s\") pod \"frr-k8s-webhook-server-7df86c4f6c-rr5bw\" (UID: \"7389b305-4824-4f3b-820e-9466214ea9b1\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-rr5bw" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.609529 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5c595fcf-9396-4426-bff4-84cd6eda9dc5-memberlist\") pod \"speaker-dppcw\" (UID: \"5c595fcf-9396-4426-bff4-84cd6eda9dc5\") " pod="metallb-system/speaker-dppcw" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.609618 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/274b5333-5608-463a-a844-de0f51548386-frr-conf\") pod \"frr-k8s-5c4zt\" (UID: \"274b5333-5608-463a-a844-de0f51548386\") " pod="metallb-system/frr-k8s-5c4zt" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.609643 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnhsf\" (UniqueName: \"kubernetes.io/projected/1c6a7816-6119-496c-9298-fb8121c9e38f-kube-api-access-bnhsf\") pod \"controller-6968d8fdc4-5ds9w\" (UID: \"1c6a7816-6119-496c-9298-fb8121c9e38f\") " pod="metallb-system/controller-6968d8fdc4-5ds9w" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.609678 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/274b5333-5608-463a-a844-de0f51548386-reloader\") pod \"frr-k8s-5c4zt\" (UID: \"274b5333-5608-463a-a844-de0f51548386\") " pod="metallb-system/frr-k8s-5c4zt" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.609700 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/274b5333-5608-463a-a844-de0f51548386-metrics-certs\") pod \"frr-k8s-5c4zt\" (UID: \"274b5333-5608-463a-a844-de0f51548386\") " pod="metallb-system/frr-k8s-5c4zt" Feb 02 10:51:22 crc kubenswrapper[4901]: E0202 10:51:22.609709 4901 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 02 10:51:22 crc kubenswrapper[4901]: E0202 10:51:22.609790 4901 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5c595fcf-9396-4426-bff4-84cd6eda9dc5-memberlist podName:5c595fcf-9396-4426-bff4-84cd6eda9dc5 nodeName:}" failed. No retries permitted until 2026-02-02 10:51:23.109770821 +0000 UTC m=+770.128110917 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/5c595fcf-9396-4426-bff4-84cd6eda9dc5-memberlist") pod "speaker-dppcw" (UID: "5c595fcf-9396-4426-bff4-84cd6eda9dc5") : secret "metallb-memberlist" not found Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.609716 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/274b5333-5608-463a-a844-de0f51548386-metrics\") pod \"frr-k8s-5c4zt\" (UID: \"274b5333-5608-463a-a844-de0f51548386\") " pod="metallb-system/frr-k8s-5c4zt" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.610064 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c595fcf-9396-4426-bff4-84cd6eda9dc5-metrics-certs\") pod \"speaker-dppcw\" (UID: \"5c595fcf-9396-4426-bff4-84cd6eda9dc5\") " pod="metallb-system/speaker-dppcw" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.610085 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/274b5333-5608-463a-a844-de0f51548386-metrics\") pod \"frr-k8s-5c4zt\" (UID: \"274b5333-5608-463a-a844-de0f51548386\") " pod="metallb-system/frr-k8s-5c4zt" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.610161 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/274b5333-5608-463a-a844-de0f51548386-frr-sockets\") pod \"frr-k8s-5c4zt\" (UID: \"274b5333-5608-463a-a844-de0f51548386\") " pod="metallb-system/frr-k8s-5c4zt" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.610206 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5c595fcf-9396-4426-bff4-84cd6eda9dc5-metallb-excludel2\") pod \"speaker-dppcw\" (UID: \"5c595fcf-9396-4426-bff4-84cd6eda9dc5\") " pod="metallb-system/speaker-dppcw" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.610225 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgb7t\" (UniqueName: \"kubernetes.io/projected/5c595fcf-9396-4426-bff4-84cd6eda9dc5-kube-api-access-sgb7t\") pod \"speaker-dppcw\" (UID: \"5c595fcf-9396-4426-bff4-84cd6eda9dc5\") " pod="metallb-system/speaker-dppcw" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.610247 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c6a7816-6119-496c-9298-fb8121c9e38f-cert\") pod \"controller-6968d8fdc4-5ds9w\" (UID: \"1c6a7816-6119-496c-9298-fb8121c9e38f\") " pod="metallb-system/controller-6968d8fdc4-5ds9w" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.610271 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/274b5333-5608-463a-a844-de0f51548386-frr-conf\") pod \"frr-k8s-5c4zt\" (UID: \"274b5333-5608-463a-a844-de0f51548386\") " pod="metallb-system/frr-k8s-5c4zt" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.610292 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh6xh\" 
(UniqueName: \"kubernetes.io/projected/274b5333-5608-463a-a844-de0f51548386-kube-api-access-wh6xh\") pod \"frr-k8s-5c4zt\" (UID: \"274b5333-5608-463a-a844-de0f51548386\") " pod="metallb-system/frr-k8s-5c4zt" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.610314 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/274b5333-5608-463a-a844-de0f51548386-frr-startup\") pod \"frr-k8s-5c4zt\" (UID: \"274b5333-5608-463a-a844-de0f51548386\") " pod="metallb-system/frr-k8s-5c4zt" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.610330 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c6a7816-6119-496c-9298-fb8121c9e38f-metrics-certs\") pod \"controller-6968d8fdc4-5ds9w\" (UID: \"1c6a7816-6119-496c-9298-fb8121c9e38f\") " pod="metallb-system/controller-6968d8fdc4-5ds9w" Feb 02 10:51:22 crc kubenswrapper[4901]: E0202 10:51:22.610432 4901 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Feb 02 10:51:22 crc kubenswrapper[4901]: E0202 10:51:22.610453 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c6a7816-6119-496c-9298-fb8121c9e38f-metrics-certs podName:1c6a7816-6119-496c-9298-fb8121c9e38f nodeName:}" failed. No retries permitted until 2026-02-02 10:51:23.110446228 +0000 UTC m=+770.128786324 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1c6a7816-6119-496c-9298-fb8121c9e38f-metrics-certs") pod "controller-6968d8fdc4-5ds9w" (UID: "1c6a7816-6119-496c-9298-fb8121c9e38f") : secret "controller-certs-secret" not found Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.610788 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/274b5333-5608-463a-a844-de0f51548386-reloader\") pod \"frr-k8s-5c4zt\" (UID: \"274b5333-5608-463a-a844-de0f51548386\") " pod="metallb-system/frr-k8s-5c4zt" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.611502 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/274b5333-5608-463a-a844-de0f51548386-frr-sockets\") pod \"frr-k8s-5c4zt\" (UID: \"274b5333-5608-463a-a844-de0f51548386\") " pod="metallb-system/frr-k8s-5c4zt" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.612367 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/274b5333-5608-463a-a844-de0f51548386-frr-startup\") pod \"frr-k8s-5c4zt\" (UID: \"274b5333-5608-463a-a844-de0f51548386\") " pod="metallb-system/frr-k8s-5c4zt" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.612396 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5c595fcf-9396-4426-bff4-84cd6eda9dc5-metallb-excludel2\") pod \"speaker-dppcw\" (UID: \"5c595fcf-9396-4426-bff4-84cd6eda9dc5\") " pod="metallb-system/speaker-dppcw" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.621070 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c6a7816-6119-496c-9298-fb8121c9e38f-cert\") pod \"controller-6968d8fdc4-5ds9w\" (UID: \"1c6a7816-6119-496c-9298-fb8121c9e38f\") " pod="metallb-system/controller-6968d8fdc4-5ds9w" 
Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.622007 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c595fcf-9396-4426-bff4-84cd6eda9dc5-metrics-certs\") pod \"speaker-dppcw\" (UID: \"5c595fcf-9396-4426-bff4-84cd6eda9dc5\") " pod="metallb-system/speaker-dppcw" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.635121 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/274b5333-5608-463a-a844-de0f51548386-metrics-certs\") pod \"frr-k8s-5c4zt\" (UID: \"274b5333-5608-463a-a844-de0f51548386\") " pod="metallb-system/frr-k8s-5c4zt" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.640042 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh6xh\" (UniqueName: \"kubernetes.io/projected/274b5333-5608-463a-a844-de0f51548386-kube-api-access-wh6xh\") pod \"frr-k8s-5c4zt\" (UID: \"274b5333-5608-463a-a844-de0f51548386\") " pod="metallb-system/frr-k8s-5c4zt" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.650536 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnhsf\" (UniqueName: \"kubernetes.io/projected/1c6a7816-6119-496c-9298-fb8121c9e38f-kube-api-access-bnhsf\") pod \"controller-6968d8fdc4-5ds9w\" (UID: \"1c6a7816-6119-496c-9298-fb8121c9e38f\") " pod="metallb-system/controller-6968d8fdc4-5ds9w" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.663288 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgb7t\" (UniqueName: \"kubernetes.io/projected/5c595fcf-9396-4426-bff4-84cd6eda9dc5-kube-api-access-sgb7t\") pod \"speaker-dppcw\" (UID: \"5c595fcf-9396-4426-bff4-84cd6eda9dc5\") " pod="metallb-system/speaker-dppcw" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.697115 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-rr5bw" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.726113 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-5c4zt" Feb 02 10:51:22 crc kubenswrapper[4901]: I0202 10:51:22.903964 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-rr5bw"] Feb 02 10:51:22 crc kubenswrapper[4901]: W0202 10:51:22.910919 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7389b305_4824_4f3b_820e_9466214ea9b1.slice/crio-b8c6ea503d6c9ccf53712723ea0f11451d51fee82571d68c19e35232a7e52430 WatchSource:0}: Error finding container b8c6ea503d6c9ccf53712723ea0f11451d51fee82571d68c19e35232a7e52430: Status 404 returned error can't find the container with id b8c6ea503d6c9ccf53712723ea0f11451d51fee82571d68c19e35232a7e52430 Feb 02 10:51:23 crc kubenswrapper[4901]: I0202 10:51:23.123441 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c6a7816-6119-496c-9298-fb8121c9e38f-metrics-certs\") pod \"controller-6968d8fdc4-5ds9w\" (UID: \"1c6a7816-6119-496c-9298-fb8121c9e38f\") " pod="metallb-system/controller-6968d8fdc4-5ds9w" Feb 02 10:51:23 crc kubenswrapper[4901]: I0202 10:51:23.123496 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5c595fcf-9396-4426-bff4-84cd6eda9dc5-memberlist\") pod \"speaker-dppcw\" (UID: \"5c595fcf-9396-4426-bff4-84cd6eda9dc5\") " pod="metallb-system/speaker-dppcw" Feb 02 10:51:23 crc kubenswrapper[4901]: E0202 10:51:23.123770 4901 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 02 10:51:23 crc kubenswrapper[4901]: E0202 10:51:23.123847 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c595fcf-9396-4426-bff4-84cd6eda9dc5-memberlist podName:5c595fcf-9396-4426-bff4-84cd6eda9dc5 nodeName:}" failed. No retries permitted until 2026-02-02 10:51:24.123825942 +0000 UTC m=+771.142166058 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/5c595fcf-9396-4426-bff4-84cd6eda9dc5-memberlist") pod "speaker-dppcw" (UID: "5c595fcf-9396-4426-bff4-84cd6eda9dc5") : secret "metallb-memberlist" not found Feb 02 10:51:23 crc kubenswrapper[4901]: I0202 10:51:23.129183 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c6a7816-6119-496c-9298-fb8121c9e38f-metrics-certs\") pod \"controller-6968d8fdc4-5ds9w\" (UID: \"1c6a7816-6119-496c-9298-fb8121c9e38f\") " pod="metallb-system/controller-6968d8fdc4-5ds9w" Feb 02 10:51:23 crc kubenswrapper[4901]: I0202 10:51:23.232691 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5c4zt" event={"ID":"274b5333-5608-463a-a844-de0f51548386","Type":"ContainerStarted","Data":"c7355901532a650adfcd641a19459c4d1df1cf6c2922f19855315effb5313a08"} Feb 02 10:51:23 crc kubenswrapper[4901]: I0202 10:51:23.233740 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-rr5bw" event={"ID":"7389b305-4824-4f3b-820e-9466214ea9b1","Type":"ContainerStarted","Data":"b8c6ea503d6c9ccf53712723ea0f11451d51fee82571d68c19e35232a7e52430"} Feb 02 10:51:23 crc kubenswrapper[4901]: I0202 10:51:23.412362 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-5ds9w" Feb 02 10:51:23 crc kubenswrapper[4901]: I0202 10:51:23.630845 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-5ds9w"] Feb 02 10:51:23 crc kubenswrapper[4901]: W0202 10:51:23.638186 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c6a7816_6119_496c_9298_fb8121c9e38f.slice/crio-599c7c7bfa43594447666eb1456b8fdf581c96c61ea1a034e17dbd0936713136 WatchSource:0}: Error finding container 599c7c7bfa43594447666eb1456b8fdf581c96c61ea1a034e17dbd0936713136: Status 404 returned error can't find the container with id 599c7c7bfa43594447666eb1456b8fdf581c96c61ea1a034e17dbd0936713136 Feb 02 10:51:24 crc kubenswrapper[4901]: I0202 10:51:24.144946 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5c595fcf-9396-4426-bff4-84cd6eda9dc5-memberlist\") pod \"speaker-dppcw\" (UID: \"5c595fcf-9396-4426-bff4-84cd6eda9dc5\") " pod="metallb-system/speaker-dppcw" Feb 02 10:51:24 crc kubenswrapper[4901]: I0202 10:51:24.151639 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5c595fcf-9396-4426-bff4-84cd6eda9dc5-memberlist\") pod \"speaker-dppcw\" (UID: \"5c595fcf-9396-4426-bff4-84cd6eda9dc5\") " pod="metallb-system/speaker-dppcw" Feb 02 10:51:24 crc kubenswrapper[4901]: I0202 10:51:24.241762 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-5ds9w" event={"ID":"1c6a7816-6119-496c-9298-fb8121c9e38f","Type":"ContainerStarted","Data":"cdb1295814df754682429d2ece541e278bee967273c0f4e3caacdff68e862a51"} Feb 02 10:51:24 crc kubenswrapper[4901]: I0202 10:51:24.241816 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-5ds9w" event={"ID":"1c6a7816-6119-496c-9298-fb8121c9e38f","Type":"ContainerStarted","Data":"ec1e79e14830177849b0c5429066c2b57da88c34139ba69ef59438cb74094224"} Feb 02 10:51:24 crc kubenswrapper[4901]: I0202 10:51:24.241828 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-5ds9w" event={"ID":"1c6a7816-6119-496c-9298-fb8121c9e38f","Type":"ContainerStarted","Data":"599c7c7bfa43594447666eb1456b8fdf581c96c61ea1a034e17dbd0936713136"} Feb 02 10:51:24 crc kubenswrapper[4901]: I0202 10:51:24.241932 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6968d8fdc4-5ds9w" Feb 02 10:51:24 crc kubenswrapper[4901]: I0202 10:51:24.263697 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6968d8fdc4-5ds9w" podStartSLOduration=2.263677251 podStartE2EDuration="2.263677251s" podCreationTimestamp="2026-02-02 10:51:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:51:24.261980458 +0000 UTC m=+771.280320564" watchObservedRunningTime="2026-02-02 10:51:24.263677251 +0000 UTC m=+771.282017337" Feb 02 10:51:24 crc kubenswrapper[4901]: I0202 10:51:24.304284 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-dppcw" Feb 02 10:51:24 crc kubenswrapper[4901]: W0202 10:51:24.329153 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c595fcf_9396_4426_bff4_84cd6eda9dc5.slice/crio-f31422c7d393cc757a0c76a9430a0f5b0f5c3f21325619f1602b9c5e2a206d02 WatchSource:0}: Error finding container f31422c7d393cc757a0c76a9430a0f5b0f5c3f21325619f1602b9c5e2a206d02: Status 404 returned error can't find the container with id f31422c7d393cc757a0c76a9430a0f5b0f5c3f21325619f1602b9c5e2a206d02 Feb 02 10:51:25 crc kubenswrapper[4901]: I0202 10:51:25.249201 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-dppcw" event={"ID":"5c595fcf-9396-4426-bff4-84cd6eda9dc5","Type":"ContainerStarted","Data":"5d0a4280043734e439da71c2c8df225ea0e62f0cd9e7c287b8d27db5c49f4513"} Feb 02 10:51:25 crc kubenswrapper[4901]: I0202 10:51:25.249901 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-dppcw" event={"ID":"5c595fcf-9396-4426-bff4-84cd6eda9dc5","Type":"ContainerStarted","Data":"24bd247017655589a81950d3d81504bdded563260cc4846d9df8cd9dee7713fb"} Feb 02 10:51:25 crc kubenswrapper[4901]: I0202 10:51:25.249914 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-dppcw" event={"ID":"5c595fcf-9396-4426-bff4-84cd6eda9dc5","Type":"ContainerStarted","Data":"f31422c7d393cc757a0c76a9430a0f5b0f5c3f21325619f1602b9c5e2a206d02"} Feb 02 10:51:25 crc kubenswrapper[4901]: I0202 10:51:25.250066 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-dppcw" Feb 02 10:51:25 crc kubenswrapper[4901]: I0202 10:51:25.275948 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-dppcw" podStartSLOduration=3.275918296 podStartE2EDuration="3.275918296s" podCreationTimestamp="2026-02-02 10:51:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:51:25.269190063 +0000 UTC m=+772.287530169" watchObservedRunningTime="2026-02-02 10:51:25.275918296 +0000 UTC m=+772.294258392" Feb 02 10:51:31 crc kubenswrapper[4901]: I0202 10:51:31.298329 4901 generic.go:334] "Generic (PLEG): container finished" podID="274b5333-5608-463a-a844-de0f51548386" containerID="d0826666f75fa3d66ecebfdb77c412cad6dc92491bf875d4200244253a24c134" exitCode=0 Feb 02 10:51:31 crc kubenswrapper[4901]: I0202 10:51:31.298498 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5c4zt" event={"ID":"274b5333-5608-463a-a844-de0f51548386","Type":"ContainerDied","Data":"d0826666f75fa3d66ecebfdb77c412cad6dc92491bf875d4200244253a24c134"} Feb 02 10:51:31 crc kubenswrapper[4901]: I0202 10:51:31.301468 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-rr5bw" event={"ID":"7389b305-4824-4f3b-820e-9466214ea9b1","Type":"ContainerStarted","Data":"e5c4f7822f4a35a770a748be89fe1212df9cd520a11571f4a577eb5c527746a1"} Feb 02 10:51:31 crc kubenswrapper[4901]: I0202 10:51:31.301749 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-rr5bw" Feb 02 10:51:31 crc kubenswrapper[4901]: I0202 10:51:31.357072 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-rr5bw" podStartSLOduration=2.05088798 
podStartE2EDuration="9.35702106s" podCreationTimestamp="2026-02-02 10:51:22 +0000 UTC" firstStartedPulling="2026-02-02 10:51:22.912457648 +0000 UTC m=+769.930797744" lastFinishedPulling="2026-02-02 10:51:30.218590728 +0000 UTC m=+777.236930824" observedRunningTime="2026-02-02 10:51:31.350859593 +0000 UTC m=+778.369199689" watchObservedRunningTime="2026-02-02 10:51:31.35702106 +0000 UTC m=+778.375361156" Feb 02 10:51:32 crc kubenswrapper[4901]: I0202 10:51:32.311938 4901 generic.go:334] "Generic (PLEG): container finished" podID="274b5333-5608-463a-a844-de0f51548386" containerID="93d2c5230f39d500d1c1760566616fa31bdd3fb816c3401dce2f8385a2cc53dc" exitCode=0 Feb 02 10:51:32 crc kubenswrapper[4901]: I0202 10:51:32.312102 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5c4zt" event={"ID":"274b5333-5608-463a-a844-de0f51548386","Type":"ContainerDied","Data":"93d2c5230f39d500d1c1760566616fa31bdd3fb816c3401dce2f8385a2cc53dc"} Feb 02 10:51:33 crc kubenswrapper[4901]: I0202 10:51:33.320111 4901 generic.go:334] "Generic (PLEG): container finished" podID="274b5333-5608-463a-a844-de0f51548386" containerID="d75f2e976990187f2c7796f0dcd2895b3f2ec9d76c75f909472c42dd4cdb36ce" exitCode=0 Feb 02 10:51:33 crc kubenswrapper[4901]: I0202 10:51:33.320502 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5c4zt" event={"ID":"274b5333-5608-463a-a844-de0f51548386","Type":"ContainerDied","Data":"d75f2e976990187f2c7796f0dcd2895b3f2ec9d76c75f909472c42dd4cdb36ce"} Feb 02 10:51:33 crc kubenswrapper[4901]: I0202 10:51:33.417206 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6968d8fdc4-5ds9w" Feb 02 10:51:34 crc kubenswrapper[4901]: I0202 10:51:34.310703 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-dppcw" Feb 02 10:51:34 crc kubenswrapper[4901]: I0202 10:51:34.352629 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5c4zt" event={"ID":"274b5333-5608-463a-a844-de0f51548386","Type":"ContainerStarted","Data":"3cec33b9a00ae235c0ca454d1d704145441224a949b412059ab7a6c7850b521f"} Feb 02 10:51:34 crc kubenswrapper[4901]: I0202 10:51:34.352689 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5c4zt" event={"ID":"274b5333-5608-463a-a844-de0f51548386","Type":"ContainerStarted","Data":"08ac980947a8c214c727fffb590f7fc957e514d0ec796efe144a537c1ec2421f"} Feb 02 10:51:34 crc kubenswrapper[4901]: I0202 10:51:34.352706 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5c4zt" event={"ID":"274b5333-5608-463a-a844-de0f51548386","Type":"ContainerStarted","Data":"4a935a24f729f4f9bcda42c650b5c291030c83a823f8405fcd86a73c7152bb10"} Feb 02 10:51:34 crc kubenswrapper[4901]: I0202 10:51:34.352718 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5c4zt" event={"ID":"274b5333-5608-463a-a844-de0f51548386","Type":"ContainerStarted","Data":"12ef874973fa1ce159fdf443a6f07724f0994806eb993f4a03fbd006662f7fb7"} Feb 02 10:51:34 crc kubenswrapper[4901]: I0202 10:51:34.352730 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5c4zt" event={"ID":"274b5333-5608-463a-a844-de0f51548386","Type":"ContainerStarted","Data":"a80494950747c91f556a103c568ecf329ce25aa707ce064f1c76ecec90cdb6cb"} Feb 02 10:51:35 crc kubenswrapper[4901]: I0202 10:51:35.361139 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-5c4zt" event={"ID":"274b5333-5608-463a-a844-de0f51548386","Type":"ContainerStarted","Data":"b96c3ff5b5c2bd6ecfc20adb15c7a9eedb865c1344a7dccb29a7aca1846c273f"} Feb 02 10:51:35 crc kubenswrapper[4901]: I0202 10:51:35.361393 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-5c4zt" Feb 02 10:51:37 crc kubenswrapper[4901]: I0202 10:51:37.454443 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-5c4zt" podStartSLOduration=8.112463733 podStartE2EDuration="15.454422183s" podCreationTimestamp="2026-02-02 10:51:22 +0000 UTC" firstStartedPulling="2026-02-02 10:51:22.85683706 +0000 UTC m=+769.875177156" lastFinishedPulling="2026-02-02 10:51:30.19879551 +0000 UTC m=+777.217135606" observedRunningTime="2026-02-02 10:51:35.383429098 +0000 UTC m=+782.401769194" watchObservedRunningTime="2026-02-02 10:51:37.454422183 +0000 UTC m=+784.472762289" Feb 02 10:51:37 crc kubenswrapper[4901]: I0202 10:51:37.461683 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-n7n97"] Feb 02 10:51:37 crc kubenswrapper[4901]: I0202 10:51:37.462908 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-n7n97" Feb 02 10:51:37 crc kubenswrapper[4901]: I0202 10:51:37.468403 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 02 10:51:37 crc kubenswrapper[4901]: I0202 10:51:37.468681 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-k6rqm" Feb 02 10:51:37 crc kubenswrapper[4901]: I0202 10:51:37.468808 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 02 10:51:37 crc kubenswrapper[4901]: I0202 10:51:37.480642 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-n7n97"] Feb 02 10:51:37 crc kubenswrapper[4901]: I0202 10:51:37.622047 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6njht\" (UniqueName: \"kubernetes.io/projected/4449f0b3-37e4-4286-8d20-aea3a69757b0-kube-api-access-6njht\") pod \"openstack-operator-index-n7n97\" (UID: \"4449f0b3-37e4-4286-8d20-aea3a69757b0\") " pod="openstack-operators/openstack-operator-index-n7n97" Feb 02 10:51:37 crc kubenswrapper[4901]: I0202 10:51:37.723700 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6njht\" (UniqueName: \"kubernetes.io/projected/4449f0b3-37e4-4286-8d20-aea3a69757b0-kube-api-access-6njht\") pod \"openstack-operator-index-n7n97\" (UID: \"4449f0b3-37e4-4286-8d20-aea3a69757b0\") " pod="openstack-operators/openstack-operator-index-n7n97" Feb 02 10:51:37 crc kubenswrapper[4901]: I0202 10:51:37.727612 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-5c4zt" Feb 02 10:51:37 crc kubenswrapper[4901]: I0202 10:51:37.748376 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6njht\" (UniqueName: \"kubernetes.io/projected/4449f0b3-37e4-4286-8d20-aea3a69757b0-kube-api-access-6njht\") pod \"openstack-operator-index-n7n97\" (UID: \"4449f0b3-37e4-4286-8d20-aea3a69757b0\") " pod="openstack-operators/openstack-operator-index-n7n97" Feb 02 10:51:37 crc kubenswrapper[4901]: I0202 
10:51:37.786998 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-5c4zt" Feb 02 10:51:37 crc kubenswrapper[4901]: I0202 10:51:37.794634 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-n7n97" Feb 02 10:51:37 crc kubenswrapper[4901]: I0202 10:51:37.837512 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:51:37 crc kubenswrapper[4901]: I0202 10:51:37.837629 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:51:38 crc kubenswrapper[4901]: I0202 10:51:38.034373 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-n7n97"] Feb 02 10:51:38 crc kubenswrapper[4901]: W0202 10:51:38.039843 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4449f0b3_37e4_4286_8d20_aea3a69757b0.slice/crio-4c3fdd2820778a94942977c84cb6272df0dc9aab3fe61019326451bf07a8ea27 WatchSource:0}: Error finding container 4c3fdd2820778a94942977c84cb6272df0dc9aab3fe61019326451bf07a8ea27: Status 404 returned error can't find the container with id 4c3fdd2820778a94942977c84cb6272df0dc9aab3fe61019326451bf07a8ea27 Feb 02 10:51:38 crc kubenswrapper[4901]: I0202 10:51:38.395998 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-n7n97" event={"ID":"4449f0b3-37e4-4286-8d20-aea3a69757b0","Type":"ContainerStarted","Data":"4c3fdd2820778a94942977c84cb6272df0dc9aab3fe61019326451bf07a8ea27"} Feb 02 10:51:40 crc kubenswrapper[4901]: I0202 10:51:40.631329 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-n7n97"] Feb 02 10:51:41 crc kubenswrapper[4901]: I0202 10:51:41.282975 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-t9bm2"] Feb 02 10:51:41 crc kubenswrapper[4901]: I0202 10:51:41.284501 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-t9bm2" Feb 02 10:51:41 crc kubenswrapper[4901]: I0202 10:51:41.295635 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-t9bm2"] Feb 02 10:51:41 crc kubenswrapper[4901]: I0202 10:51:41.376250 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg74j\" (UniqueName: \"kubernetes.io/projected/5cf937bc-31d1-4321-ad76-c0b26b86d044-kube-api-access-bg74j\") pod \"openstack-operator-index-t9bm2\" (UID: \"5cf937bc-31d1-4321-ad76-c0b26b86d044\") " pod="openstack-operators/openstack-operator-index-t9bm2" Feb 02 10:51:41 crc kubenswrapper[4901]: I0202 10:51:41.429055 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-n7n97" event={"ID":"4449f0b3-37e4-4286-8d20-aea3a69757b0","Type":"ContainerStarted","Data":"13a4562ecfcce6c028fc2c00c904b0deca65cfc6427377decbdf542026e096ab"} Feb 02 10:51:41 crc kubenswrapper[4901]: I0202 10:51:41.429197 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-n7n97" podUID="4449f0b3-37e4-4286-8d20-aea3a69757b0" containerName="registry-server" containerID="cri-o://13a4562ecfcce6c028fc2c00c904b0deca65cfc6427377decbdf542026e096ab" gracePeriod=2 Feb 02 10:51:41 crc kubenswrapper[4901]: I0202 10:51:41.455771 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-n7n97" podStartSLOduration=1.456149485 podStartE2EDuration="4.455741265s" podCreationTimestamp="2026-02-02 10:51:37 +0000 UTC" firstStartedPulling="2026-02-02 10:51:38.043026448 +0000 UTC m=+785.061366554" lastFinishedPulling="2026-02-02 10:51:41.042618238 +0000 UTC m=+788.060958334" observedRunningTime="2026-02-02 10:51:41.448257605 +0000 UTC m=+788.466597701" watchObservedRunningTime="2026-02-02 10:51:41.455741265 +0000 UTC m=+788.474081361" Feb 02 10:51:41 crc kubenswrapper[4901]: I0202 10:51:41.477940 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg74j\" (UniqueName: \"kubernetes.io/projected/5cf937bc-31d1-4321-ad76-c0b26b86d044-kube-api-access-bg74j\") pod \"openstack-operator-index-t9bm2\" (UID: \"5cf937bc-31d1-4321-ad76-c0b26b86d044\") " pod="openstack-operators/openstack-operator-index-t9bm2" Feb 02 10:51:41 crc kubenswrapper[4901]: I0202 10:51:41.496416 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg74j\" (UniqueName: \"kubernetes.io/projected/5cf937bc-31d1-4321-ad76-c0b26b86d044-kube-api-access-bg74j\") pod \"openstack-operator-index-t9bm2\" (UID: \"5cf937bc-31d1-4321-ad76-c0b26b86d044\") " pod="openstack-operators/openstack-operator-index-t9bm2" Feb 02 10:51:41 crc kubenswrapper[4901]: I0202 10:51:41.629030 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-t9bm2" Feb 02 10:51:41 crc kubenswrapper[4901]: I0202 10:51:41.746750 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-n7n97" Feb 02 10:51:41 crc kubenswrapper[4901]: I0202 10:51:41.823971 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-t9bm2"] Feb 02 10:51:41 crc kubenswrapper[4901]: W0202 10:51:41.830894 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cf937bc_31d1_4321_ad76_c0b26b86d044.slice/crio-5e31c96b2c0708c111d376ce91bf44ebb6bd0b285dcbea2328338d80b5a0b7d6 WatchSource:0}: Error finding container 5e31c96b2c0708c111d376ce91bf44ebb6bd0b285dcbea2328338d80b5a0b7d6: Status 404 returned error can't find the container with id 5e31c96b2c0708c111d376ce91bf44ebb6bd0b285dcbea2328338d80b5a0b7d6 Feb 02 10:51:41 crc kubenswrapper[4901]: I0202 10:51:41.883791 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6njht\" (UniqueName: \"kubernetes.io/projected/4449f0b3-37e4-4286-8d20-aea3a69757b0-kube-api-access-6njht\") pod \"4449f0b3-37e4-4286-8d20-aea3a69757b0\" (UID: \"4449f0b3-37e4-4286-8d20-aea3a69757b0\") " Feb 02 10:51:41 crc kubenswrapper[4901]: I0202 10:51:41.892318 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4449f0b3-37e4-4286-8d20-aea3a69757b0-kube-api-access-6njht" (OuterVolumeSpecName: "kube-api-access-6njht") pod "4449f0b3-37e4-4286-8d20-aea3a69757b0" (UID: "4449f0b3-37e4-4286-8d20-aea3a69757b0"). InnerVolumeSpecName "kube-api-access-6njht". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:51:41 crc kubenswrapper[4901]: I0202 10:51:41.985786 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6njht\" (UniqueName: \"kubernetes.io/projected/4449f0b3-37e4-4286-8d20-aea3a69757b0-kube-api-access-6njht\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:42 crc kubenswrapper[4901]: I0202 10:51:42.444269 4901 generic.go:334] "Generic (PLEG): container finished" podID="4449f0b3-37e4-4286-8d20-aea3a69757b0" containerID="13a4562ecfcce6c028fc2c00c904b0deca65cfc6427377decbdf542026e096ab" exitCode=0 Feb 02 10:51:42 crc kubenswrapper[4901]: I0202 10:51:42.444363 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-n7n97" event={"ID":"4449f0b3-37e4-4286-8d20-aea3a69757b0","Type":"ContainerDied","Data":"13a4562ecfcce6c028fc2c00c904b0deca65cfc6427377decbdf542026e096ab"} Feb 02 10:51:42 crc kubenswrapper[4901]: I0202 10:51:42.444417 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-n7n97" Feb 02 10:51:42 crc kubenswrapper[4901]: I0202 10:51:42.444920 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-n7n97" event={"ID":"4449f0b3-37e4-4286-8d20-aea3a69757b0","Type":"ContainerDied","Data":"4c3fdd2820778a94942977c84cb6272df0dc9aab3fe61019326451bf07a8ea27"} Feb 02 10:51:42 crc kubenswrapper[4901]: I0202 10:51:42.444964 4901 scope.go:117] "RemoveContainer" containerID="13a4562ecfcce6c028fc2c00c904b0deca65cfc6427377decbdf542026e096ab" Feb 02 10:51:42 crc kubenswrapper[4901]: I0202 10:51:42.448922 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-t9bm2" event={"ID":"5cf937bc-31d1-4321-ad76-c0b26b86d044","Type":"ContainerStarted","Data":"1b3bf35639c3e369549994136c066b53a2be685b797f874806ae0990818ca83c"} Feb 02 10:51:42 crc kubenswrapper[4901]: I0202 10:51:42.448989 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-t9bm2" event={"ID":"5cf937bc-31d1-4321-ad76-c0b26b86d044","Type":"ContainerStarted","Data":"5e31c96b2c0708c111d376ce91bf44ebb6bd0b285dcbea2328338d80b5a0b7d6"} Feb 02 10:51:42 crc kubenswrapper[4901]: I0202 10:51:42.472771 4901 scope.go:117] "RemoveContainer" containerID="13a4562ecfcce6c028fc2c00c904b0deca65cfc6427377decbdf542026e096ab" Feb 02 10:51:42 crc kubenswrapper[4901]: E0202 10:51:42.473645 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13a4562ecfcce6c028fc2c00c904b0deca65cfc6427377decbdf542026e096ab\": container with ID starting with 13a4562ecfcce6c028fc2c00c904b0deca65cfc6427377decbdf542026e096ab not found: ID does not exist" containerID="13a4562ecfcce6c028fc2c00c904b0deca65cfc6427377decbdf542026e096ab" Feb 02 10:51:42 crc kubenswrapper[4901]: I0202 10:51:42.473698 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13a4562ecfcce6c028fc2c00c904b0deca65cfc6427377decbdf542026e096ab"} err="failed to get container status \"13a4562ecfcce6c028fc2c00c904b0deca65cfc6427377decbdf542026e096ab\": rpc error: code = NotFound desc = could not find container \"13a4562ecfcce6c028fc2c00c904b0deca65cfc6427377decbdf542026e096ab\": container with ID starting with 13a4562ecfcce6c028fc2c00c904b0deca65cfc6427377decbdf542026e096ab not found: ID does not exist" Feb 02 10:51:42 crc kubenswrapper[4901]: I0202 10:51:42.499947 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-t9bm2" podStartSLOduration=1.4562180869999999 podStartE2EDuration="1.499909782s" podCreationTimestamp="2026-02-02 10:51:41 +0000 UTC" firstStartedPulling="2026-02-02 10:51:41.834146795 +0000 UTC m=+788.852486891" lastFinishedPulling="2026-02-02 10:51:41.87783849 +0000 UTC m=+788.896178586" observedRunningTime="2026-02-02 10:51:42.479348222 +0000 UTC m=+789.497688358" watchObservedRunningTime="2026-02-02 10:51:42.499909782 +0000 UTC m=+789.518249918" Feb 02 10:51:42 crc kubenswrapper[4901]: I0202 10:51:42.510203 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-n7n97"] Feb 02 10:51:42 crc kubenswrapper[4901]: I0202 10:51:42.519888 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-n7n97"] Feb 02 10:51:42 crc kubenswrapper[4901]: I0202 10:51:42.707352 4901 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-rr5bw" Feb 02 10:51:43 crc kubenswrapper[4901]: I0202 10:51:43.684643 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4449f0b3-37e4-4286-8d20-aea3a69757b0" path="/var/lib/kubelet/pods/4449f0b3-37e4-4286-8d20-aea3a69757b0/volumes" Feb 02 10:51:51 crc kubenswrapper[4901]: I0202 10:51:51.629336 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-t9bm2" Feb 02 10:51:51 crc kubenswrapper[4901]: I0202 10:51:51.630039 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-t9bm2" Feb 02 10:51:51 crc kubenswrapper[4901]: I0202 10:51:51.673252 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-t9bm2" Feb 02 10:51:52 crc kubenswrapper[4901]: I0202 10:51:52.564626 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-t9bm2" Feb 02 10:51:52 crc kubenswrapper[4901]: I0202 10:51:52.731650 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-5c4zt" Feb 02 10:51:58 crc kubenswrapper[4901]: I0202 10:51:58.562149 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/2138b5f13f1f4a89ba3feab0b777bc0d311f9a649e9d064e0c06ba5ed1rkr5x"] Feb 02 10:51:58 crc kubenswrapper[4901]: E0202 10:51:58.563150 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4449f0b3-37e4-4286-8d20-aea3a69757b0" containerName="registry-server" Feb 02 10:51:58 crc kubenswrapper[4901]: I0202 10:51:58.563187 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="4449f0b3-37e4-4286-8d20-aea3a69757b0" containerName="registry-server" Feb 02 10:51:58 crc kubenswrapper[4901]: I0202 10:51:58.563364 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="4449f0b3-37e4-4286-8d20-aea3a69757b0" containerName="registry-server" Feb 02 10:51:58 crc kubenswrapper[4901]: I0202 10:51:58.564797 4901 util.go:30] "No sandbox for pod can be found. 
Feb 02 10:51:58 crc kubenswrapper[4901]: I0202 10:51:58.564797 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/2138b5f13f1f4a89ba3feab0b777bc0d311f9a649e9d064e0c06ba5ed1rkr5x"
Feb 02 10:51:58 crc kubenswrapper[4901]: I0202 10:51:58.567425 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-4mx7x"
Feb 02 10:51:58 crc kubenswrapper[4901]: I0202 10:51:58.587238 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/2138b5f13f1f4a89ba3feab0b777bc0d311f9a649e9d064e0c06ba5ed1rkr5x"]
Feb 02 10:51:58 crc kubenswrapper[4901]: I0202 10:51:58.662140 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/18261baf-6180-4a18-9250-f282d455e91a-util\") pod \"2138b5f13f1f4a89ba3feab0b777bc0d311f9a649e9d064e0c06ba5ed1rkr5x\" (UID: \"18261baf-6180-4a18-9250-f282d455e91a\") " pod="openstack-operators/2138b5f13f1f4a89ba3feab0b777bc0d311f9a649e9d064e0c06ba5ed1rkr5x"
Feb 02 10:51:58 crc kubenswrapper[4901]: I0202 10:51:58.662206 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/18261baf-6180-4a18-9250-f282d455e91a-bundle\") pod \"2138b5f13f1f4a89ba3feab0b777bc0d311f9a649e9d064e0c06ba5ed1rkr5x\" (UID: \"18261baf-6180-4a18-9250-f282d455e91a\") " pod="openstack-operators/2138b5f13f1f4a89ba3feab0b777bc0d311f9a649e9d064e0c06ba5ed1rkr5x"
Feb 02 10:51:58 crc kubenswrapper[4901]: I0202 10:51:58.662342 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8t6d\" (UniqueName: \"kubernetes.io/projected/18261baf-6180-4a18-9250-f282d455e91a-kube-api-access-s8t6d\") pod \"2138b5f13f1f4a89ba3feab0b777bc0d311f9a649e9d064e0c06ba5ed1rkr5x\" (UID: \"18261baf-6180-4a18-9250-f282d455e91a\") " pod="openstack-operators/2138b5f13f1f4a89ba3feab0b777bc0d311f9a649e9d064e0c06ba5ed1rkr5x"
Feb 02 10:51:58 crc kubenswrapper[4901]: I0202 10:51:58.764789 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/18261baf-6180-4a18-9250-f282d455e91a-util\") pod \"2138b5f13f1f4a89ba3feab0b777bc0d311f9a649e9d064e0c06ba5ed1rkr5x\" (UID: \"18261baf-6180-4a18-9250-f282d455e91a\") " pod="openstack-operators/2138b5f13f1f4a89ba3feab0b777bc0d311f9a649e9d064e0c06ba5ed1rkr5x"
Feb 02 10:51:58 crc kubenswrapper[4901]: I0202 10:51:58.765016 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/18261baf-6180-4a18-9250-f282d455e91a-bundle\") pod \"2138b5f13f1f4a89ba3feab0b777bc0d311f9a649e9d064e0c06ba5ed1rkr5x\" (UID: \"18261baf-6180-4a18-9250-f282d455e91a\") " pod="openstack-operators/2138b5f13f1f4a89ba3feab0b777bc0d311f9a649e9d064e0c06ba5ed1rkr5x"
Feb 02 10:51:58 crc kubenswrapper[4901]: I0202 10:51:58.765521 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/18261baf-6180-4a18-9250-f282d455e91a-util\") pod \"2138b5f13f1f4a89ba3feab0b777bc0d311f9a649e9d064e0c06ba5ed1rkr5x\" (UID: \"18261baf-6180-4a18-9250-f282d455e91a\") " pod="openstack-operators/2138b5f13f1f4a89ba3feab0b777bc0d311f9a649e9d064e0c06ba5ed1rkr5x"
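The volume flow here is the kubelet reconciler's usual three-step sequence: VerifyControllerAttachedVolume (effectively a no-op gate for node-local plugins such as empty-dir and projected, which need no controller attach), then "MountVolume started", then "MountVolume.SetUp succeeded". The hash-named pod with util and bundle empty-dirs looks like an OLM bundle-unpack job, whose containers share those scratch directories. Rebuilt as API objects; only the names and plugin types are taken from the log, everything else is assumed:

```go
// The three volumes from the reconciler entries above. The projected source
// is left empty here — see the kube-api-access sketch further down.
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func main() {
	volumes := []corev1.Volume{
		// Scratch space shared between the job's containers.
		{Name: "util", VolumeSource: corev1.VolumeSource{
			EmptyDir: &corev1.EmptyDirVolumeSource{}}},
		{Name: "bundle", VolumeSource: corev1.VolumeSource{
			EmptyDir: &corev1.EmptyDirVolumeSource{}}},
		// Kubelet-generated service-account token volume.
		{Name: "kube-api-access-s8t6d", VolumeSource: corev1.VolumeSource{
			Projected: &corev1.ProjectedVolumeSource{}}},
	}
	for _, v := range volumes {
		fmt.Println(v.Name)
	}
}
```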
\"2138b5f13f1f4a89ba3feab0b777bc0d311f9a649e9d064e0c06ba5ed1rkr5x\" (UID: \"18261baf-6180-4a18-9250-f282d455e91a\") " pod="openstack-operators/2138b5f13f1f4a89ba3feab0b777bc0d311f9a649e9d064e0c06ba5ed1rkr5x" Feb 02 10:51:58 crc kubenswrapper[4901]: I0202 10:51:58.767998 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8t6d\" (UniqueName: \"kubernetes.io/projected/18261baf-6180-4a18-9250-f282d455e91a-kube-api-access-s8t6d\") pod \"2138b5f13f1f4a89ba3feab0b777bc0d311f9a649e9d064e0c06ba5ed1rkr5x\" (UID: \"18261baf-6180-4a18-9250-f282d455e91a\") " pod="openstack-operators/2138b5f13f1f4a89ba3feab0b777bc0d311f9a649e9d064e0c06ba5ed1rkr5x" Feb 02 10:51:58 crc kubenswrapper[4901]: I0202 10:51:58.803793 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8t6d\" (UniqueName: \"kubernetes.io/projected/18261baf-6180-4a18-9250-f282d455e91a-kube-api-access-s8t6d\") pod \"2138b5f13f1f4a89ba3feab0b777bc0d311f9a649e9d064e0c06ba5ed1rkr5x\" (UID: \"18261baf-6180-4a18-9250-f282d455e91a\") " pod="openstack-operators/2138b5f13f1f4a89ba3feab0b777bc0d311f9a649e9d064e0c06ba5ed1rkr5x" Feb 02 10:51:58 crc kubenswrapper[4901]: I0202 10:51:58.897496 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/2138b5f13f1f4a89ba3feab0b777bc0d311f9a649e9d064e0c06ba5ed1rkr5x" Feb 02 10:51:59 crc kubenswrapper[4901]: I0202 10:51:59.370937 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/2138b5f13f1f4a89ba3feab0b777bc0d311f9a649e9d064e0c06ba5ed1rkr5x"] Feb 02 10:51:59 crc kubenswrapper[4901]: W0202 10:51:59.373871 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18261baf_6180_4a18_9250_f282d455e91a.slice/crio-f0aa1e981a59bceb8fb129ef7eae5b1d98e5e38e08ee1e9a5f784658639bfdf3 WatchSource:0}: Error finding container f0aa1e981a59bceb8fb129ef7eae5b1d98e5e38e08ee1e9a5f784658639bfdf3: Status 404 returned error can't find the container with id f0aa1e981a59bceb8fb129ef7eae5b1d98e5e38e08ee1e9a5f784658639bfdf3 Feb 02 10:51:59 crc kubenswrapper[4901]: I0202 10:51:59.585753 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2138b5f13f1f4a89ba3feab0b777bc0d311f9a649e9d064e0c06ba5ed1rkr5x" event={"ID":"18261baf-6180-4a18-9250-f282d455e91a","Type":"ContainerStarted","Data":"bae5a651f5363a0827411d6d1aac689aa2a06255df4141e9ec812b331780c42d"} Feb 02 10:51:59 crc kubenswrapper[4901]: I0202 10:51:59.586485 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2138b5f13f1f4a89ba3feab0b777bc0d311f9a649e9d064e0c06ba5ed1rkr5x" event={"ID":"18261baf-6180-4a18-9250-f282d455e91a","Type":"ContainerStarted","Data":"f0aa1e981a59bceb8fb129ef7eae5b1d98e5e38e08ee1e9a5f784658639bfdf3"} Feb 02 10:52:00 crc kubenswrapper[4901]: I0202 10:52:00.598960 4901 generic.go:334] "Generic (PLEG): container finished" podID="18261baf-6180-4a18-9250-f282d455e91a" containerID="bae5a651f5363a0827411d6d1aac689aa2a06255df4141e9ec812b331780c42d" exitCode=0 Feb 02 10:52:00 crc kubenswrapper[4901]: I0202 10:52:00.599013 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2138b5f13f1f4a89ba3feab0b777bc0d311f9a649e9d064e0c06ba5ed1rkr5x" event={"ID":"18261baf-6180-4a18-9250-f282d455e91a","Type":"ContainerDied","Data":"bae5a651f5363a0827411d6d1aac689aa2a06255df4141e9ec812b331780c42d"} Feb 02 10:52:01 crc kubenswrapper[4901]: I0202 10:52:01.607223 4901 
generic.go:334] "Generic (PLEG): container finished" podID="18261baf-6180-4a18-9250-f282d455e91a" containerID="32879839abfa21a7431dd85708490ba5e121b8fe5e46a60249c95a0bb781422e" exitCode=0 Feb 02 10:52:01 crc kubenswrapper[4901]: I0202 10:52:01.607278 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2138b5f13f1f4a89ba3feab0b777bc0d311f9a649e9d064e0c06ba5ed1rkr5x" event={"ID":"18261baf-6180-4a18-9250-f282d455e91a","Type":"ContainerDied","Data":"32879839abfa21a7431dd85708490ba5e121b8fe5e46a60249c95a0bb781422e"} Feb 02 10:52:02 crc kubenswrapper[4901]: I0202 10:52:02.619853 4901 generic.go:334] "Generic (PLEG): container finished" podID="18261baf-6180-4a18-9250-f282d455e91a" containerID="73166d9cb8ff2a0b9fe5aea4a4ae231b6cb193d09504db062ba20d74bbf00000" exitCode=0 Feb 02 10:52:02 crc kubenswrapper[4901]: I0202 10:52:02.619951 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2138b5f13f1f4a89ba3feab0b777bc0d311f9a649e9d064e0c06ba5ed1rkr5x" event={"ID":"18261baf-6180-4a18-9250-f282d455e91a","Type":"ContainerDied","Data":"73166d9cb8ff2a0b9fe5aea4a4ae231b6cb193d09504db062ba20d74bbf00000"} Feb 02 10:52:03 crc kubenswrapper[4901]: I0202 10:52:03.922956 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/2138b5f13f1f4a89ba3feab0b777bc0d311f9a649e9d064e0c06ba5ed1rkr5x" Feb 02 10:52:04 crc kubenswrapper[4901]: I0202 10:52:04.056428 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/18261baf-6180-4a18-9250-f282d455e91a-util\") pod \"18261baf-6180-4a18-9250-f282d455e91a\" (UID: \"18261baf-6180-4a18-9250-f282d455e91a\") " Feb 02 10:52:04 crc kubenswrapper[4901]: I0202 10:52:04.056821 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/18261baf-6180-4a18-9250-f282d455e91a-bundle\") pod \"18261baf-6180-4a18-9250-f282d455e91a\" (UID: \"18261baf-6180-4a18-9250-f282d455e91a\") " Feb 02 10:52:04 crc kubenswrapper[4901]: I0202 10:52:04.057019 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8t6d\" (UniqueName: \"kubernetes.io/projected/18261baf-6180-4a18-9250-f282d455e91a-kube-api-access-s8t6d\") pod \"18261baf-6180-4a18-9250-f282d455e91a\" (UID: \"18261baf-6180-4a18-9250-f282d455e91a\") " Feb 02 10:52:04 crc kubenswrapper[4901]: I0202 10:52:04.057415 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18261baf-6180-4a18-9250-f282d455e91a-bundle" (OuterVolumeSpecName: "bundle") pod "18261baf-6180-4a18-9250-f282d455e91a" (UID: "18261baf-6180-4a18-9250-f282d455e91a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:52:04 crc kubenswrapper[4901]: I0202 10:52:04.057679 4901 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/18261baf-6180-4a18-9250-f282d455e91a-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:04 crc kubenswrapper[4901]: I0202 10:52:04.064227 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18261baf-6180-4a18-9250-f282d455e91a-kube-api-access-s8t6d" (OuterVolumeSpecName: "kube-api-access-s8t6d") pod "18261baf-6180-4a18-9250-f282d455e91a" (UID: "18261baf-6180-4a18-9250-f282d455e91a"). InnerVolumeSpecName "kube-api-access-s8t6d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:52:04 crc kubenswrapper[4901]: I0202 10:52:04.078019 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18261baf-6180-4a18-9250-f282d455e91a-util" (OuterVolumeSpecName: "util") pod "18261baf-6180-4a18-9250-f282d455e91a" (UID: "18261baf-6180-4a18-9250-f282d455e91a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:52:04 crc kubenswrapper[4901]: I0202 10:52:04.159247 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8t6d\" (UniqueName: \"kubernetes.io/projected/18261baf-6180-4a18-9250-f282d455e91a-kube-api-access-s8t6d\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:04 crc kubenswrapper[4901]: I0202 10:52:04.159285 4901 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/18261baf-6180-4a18-9250-f282d455e91a-util\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:04 crc kubenswrapper[4901]: I0202 10:52:04.638952 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2138b5f13f1f4a89ba3feab0b777bc0d311f9a649e9d064e0c06ba5ed1rkr5x" event={"ID":"18261baf-6180-4a18-9250-f282d455e91a","Type":"ContainerDied","Data":"f0aa1e981a59bceb8fb129ef7eae5b1d98e5e38e08ee1e9a5f784658639bfdf3"} Feb 02 10:52:04 crc kubenswrapper[4901]: I0202 10:52:04.639003 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/2138b5f13f1f4a89ba3feab0b777bc0d311f9a649e9d064e0c06ba5ed1rkr5x" Feb 02 10:52:04 crc kubenswrapper[4901]: I0202 10:52:04.639005 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0aa1e981a59bceb8fb129ef7eae5b1d98e5e38e08ee1e9a5f784658639bfdf3" Feb 02 10:52:07 crc kubenswrapper[4901]: I0202 10:52:07.837105 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:52:07 crc kubenswrapper[4901]: I0202 10:52:07.837980 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:52:10 crc kubenswrapper[4901]: I0202 10:52:10.637016 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-8684f8699c-l6tlx"] Feb 02 10:52:10 crc kubenswrapper[4901]: E0202 10:52:10.637633 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18261baf-6180-4a18-9250-f282d455e91a" containerName="util" Feb 02 10:52:10 crc kubenswrapper[4901]: I0202 10:52:10.637648 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="18261baf-6180-4a18-9250-f282d455e91a" containerName="util" Feb 02 10:52:10 crc kubenswrapper[4901]: E0202 10:52:10.637663 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18261baf-6180-4a18-9250-f282d455e91a" containerName="pull" Feb 02 10:52:10 crc kubenswrapper[4901]: I0202 10:52:10.637669 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="18261baf-6180-4a18-9250-f282d455e91a" containerName="pull" Feb 02 10:52:10 crc kubenswrapper[4901]: E0202 
10:52:10.637685 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18261baf-6180-4a18-9250-f282d455e91a" containerName="extract" Feb 02 10:52:10 crc kubenswrapper[4901]: I0202 10:52:10.637692 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="18261baf-6180-4a18-9250-f282d455e91a" containerName="extract" Feb 02 10:52:10 crc kubenswrapper[4901]: I0202 10:52:10.637809 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="18261baf-6180-4a18-9250-f282d455e91a" containerName="extract" Feb 02 10:52:10 crc kubenswrapper[4901]: I0202 10:52:10.638234 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-8684f8699c-l6tlx" Feb 02 10:52:10 crc kubenswrapper[4901]: I0202 10:52:10.641699 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-ck7t2" Feb 02 10:52:10 crc kubenswrapper[4901]: I0202 10:52:10.663404 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-8684f8699c-l6tlx"] Feb 02 10:52:10 crc kubenswrapper[4901]: I0202 10:52:10.762075 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vcdx\" (UniqueName: \"kubernetes.io/projected/dd2442fb-4d95-49c5-b67a-0195ed05bc10-kube-api-access-8vcdx\") pod \"openstack-operator-controller-init-8684f8699c-l6tlx\" (UID: \"dd2442fb-4d95-49c5-b67a-0195ed05bc10\") " pod="openstack-operators/openstack-operator-controller-init-8684f8699c-l6tlx" Feb 02 10:52:10 crc kubenswrapper[4901]: I0202 10:52:10.864677 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vcdx\" (UniqueName: \"kubernetes.io/projected/dd2442fb-4d95-49c5-b67a-0195ed05bc10-kube-api-access-8vcdx\") pod \"openstack-operator-controller-init-8684f8699c-l6tlx\" (UID: \"dd2442fb-4d95-49c5-b67a-0195ed05bc10\") " pod="openstack-operators/openstack-operator-controller-init-8684f8699c-l6tlx" Feb 02 10:52:10 crc kubenswrapper[4901]: I0202 10:52:10.886035 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vcdx\" (UniqueName: \"kubernetes.io/projected/dd2442fb-4d95-49c5-b67a-0195ed05bc10-kube-api-access-8vcdx\") pod \"openstack-operator-controller-init-8684f8699c-l6tlx\" (UID: \"dd2442fb-4d95-49c5-b67a-0195ed05bc10\") " pod="openstack-operators/openstack-operator-controller-init-8684f8699c-l6tlx" Feb 02 10:52:10 crc kubenswrapper[4901]: I0202 10:52:10.962063 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-8684f8699c-l6tlx" Feb 02 10:52:11 crc kubenswrapper[4901]: I0202 10:52:11.452331 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-8684f8699c-l6tlx"] Feb 02 10:52:11 crc kubenswrapper[4901]: I0202 10:52:11.698737 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-8684f8699c-l6tlx" event={"ID":"dd2442fb-4d95-49c5-b67a-0195ed05bc10","Type":"ContainerStarted","Data":"d670c497932a02882f7c64f4dfe1172ab9ffd382918b1717a0d01c11f7798c65"} Feb 02 10:52:15 crc kubenswrapper[4901]: I0202 10:52:15.730391 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-8684f8699c-l6tlx" event={"ID":"dd2442fb-4d95-49c5-b67a-0195ed05bc10","Type":"ContainerStarted","Data":"b7ddd61edb7736516e74362fa3d6bcba3ffa3bc6624214214e058ddceac6569a"} Feb 02 10:52:15 crc kubenswrapper[4901]: I0202 10:52:15.731596 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-8684f8699c-l6tlx" Feb 02 10:52:15 crc kubenswrapper[4901]: I0202 10:52:15.775489 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-8684f8699c-l6tlx" podStartSLOduration=1.86261795 podStartE2EDuration="5.775465776s" podCreationTimestamp="2026-02-02 10:52:10 +0000 UTC" firstStartedPulling="2026-02-02 10:52:11.467263482 +0000 UTC m=+818.485603578" lastFinishedPulling="2026-02-02 10:52:15.380111298 +0000 UTC m=+822.398451404" observedRunningTime="2026-02-02 10:52:15.77326317 +0000 UTC m=+822.791603276" watchObservedRunningTime="2026-02-02 10:52:15.775465776 +0000 UTC m=+822.793805872" Feb 02 10:52:20 crc kubenswrapper[4901]: I0202 10:52:20.965936 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-8684f8699c-l6tlx" Feb 02 10:52:37 crc kubenswrapper[4901]: I0202 10:52:37.838115 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:52:37 crc kubenswrapper[4901]: I0202 10:52:37.839284 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:52:37 crc kubenswrapper[4901]: I0202 10:52:37.839378 4901 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" Feb 02 10:52:37 crc kubenswrapper[4901]: I0202 10:52:37.840700 4901 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3ee71c5df77ed5308f20e56cd0da57bad1b0442e13aee27a55466b956169f8c4"} pod="openshift-machine-config-operator/machine-config-daemon-f29d8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 10:52:37 crc kubenswrapper[4901]: I0202 10:52:37.840817 4901 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" containerID="cri-o://3ee71c5df77ed5308f20e56cd0da57bad1b0442e13aee27a55466b956169f8c4" gracePeriod=600
Feb 02 10:52:38 crc kubenswrapper[4901]: I0202 10:52:38.916081 4901 generic.go:334] "Generic (PLEG): container finished" podID="756c113d-5d5e-424e-bdf5-494b7774def6" containerID="3ee71c5df77ed5308f20e56cd0da57bad1b0442e13aee27a55466b956169f8c4" exitCode=0
Feb 02 10:52:38 crc kubenswrapper[4901]: I0202 10:52:38.916214 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" event={"ID":"756c113d-5d5e-424e-bdf5-494b7774def6","Type":"ContainerDied","Data":"3ee71c5df77ed5308f20e56cd0da57bad1b0442e13aee27a55466b956169f8c4"}
Feb 02 10:52:38 crc kubenswrapper[4901]: I0202 10:52:38.917082 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" event={"ID":"756c113d-5d5e-424e-bdf5-494b7774def6","Type":"ContainerStarted","Data":"0796571e8bb156471bd27d2be38cbccd677937199c84fe164a09e32f6b2adf50"}
Feb 02 10:52:38 crc kubenswrapper[4901]: I0202 10:52:38.917108 4901 scope.go:117] "RemoveContainer" containerID="1a36cd6d270a91931568fc1120bc367e89d666f758bf7db75038745e98b3a488"
Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.579649 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-lh4js"]
Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.581115 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-lh4js"
Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.582757 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-fsr6p"
Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.585846 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-lh4js"]
Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.593554 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-kgsvf"]
Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.595454 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-kgsvf"
Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.598142 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-plwrp"
Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.600887 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-b6wnv"]
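Stepping back to the machine-config-daemon restart at the top of this block: the liveness probe got connection refused on http://127.0.0.1:8798/health, so the kubelet logged "failed liveness probe, will be restarted", killed the container with gracePeriod=600 (presumably the pod's terminationGracePeriodSeconds), observed ContainerDied, started a replacement (0796571e…), and garbage-collected the container left over from the restart before that (1a36cd6d…). A minimal stand-in for the endpoint contract the probe checks; the real daemon's handler is more involved than this:

```go
// Minimal /health liveness endpoint on the probed address. "connection
// refused" in the log simply means nothing was listening here at the time.
package main

import (
	"fmt"
	"log"
	"net/http"
)

func main() {
	http.HandleFunc("/health", func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprintln(w, "ok") // any 2xx response satisfies an httpGet probe
	})
	log.Fatal(http.ListenAndServe("127.0.0.1:8798", nil))
}
```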
Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.601778 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-b6wnv"
Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.604185 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-6scbv"
Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.619907 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-kgsvf"]
Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.637021 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-b6wnv"]
Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.649954 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-rjvw7"]
Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.650935 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-rjvw7"
Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.653029 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-5phr4"
Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.655595 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-rjvw7"]
Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.676684 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-7ssmx"]
Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.677899 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-7ssmx"
Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.682678 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-ppcqt"
Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.688644 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-7ssmx"]
Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.700637 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-sm2xf"]
Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.701369 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-sm2xf" Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.701745 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qslpf\" (UniqueName: \"kubernetes.io/projected/a3d4d0f2-f8ac-4b9d-b78c-7a6c63750fdf-kube-api-access-qslpf\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-lh4js\" (UID: \"a3d4d0f2-f8ac-4b9d-b78c-7a6c63750fdf\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-lh4js" Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.701845 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vczl7\" (UniqueName: \"kubernetes.io/projected/256717cd-84fb-490a-9945-bed0d1f5ec7f-kube-api-access-vczl7\") pod \"designate-operator-controller-manager-6d9697b7f4-b6wnv\" (UID: \"256717cd-84fb-490a-9945-bed0d1f5ec7f\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-b6wnv" Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.701889 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7xl7\" (UniqueName: \"kubernetes.io/projected/50ac57c2-233a-40b9-9377-c8066412240c-kube-api-access-z7xl7\") pod \"cinder-operator-controller-manager-8d874c8fc-kgsvf\" (UID: \"50ac57c2-233a-40b9-9377-c8066412240c\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-kgsvf" Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.710825 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-z2274" Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.710982 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-sm2xf"] Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.723863 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-wccg8"] Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.724821 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-wccg8" Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.726646 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-sh9qd" Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.757630 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-vd4rg"] Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.758384 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79955696d6-vd4rg" Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.761233 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-f9jmd" Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.761438 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.770048 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-wccg8"] Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.780020 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-vd4rg"] Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.789795 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-f5dcq"] Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.790600 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-f5dcq" Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.793999 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-nwn52" Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.802921 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-qnjn5"] Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.803954 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-qnjn5" Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.805895 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-bcb2n" Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.806705 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vczl7\" (UniqueName: \"kubernetes.io/projected/256717cd-84fb-490a-9945-bed0d1f5ec7f-kube-api-access-vczl7\") pod \"designate-operator-controller-manager-6d9697b7f4-b6wnv\" (UID: \"256717cd-84fb-490a-9945-bed0d1f5ec7f\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-b6wnv" Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.806855 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7xl7\" (UniqueName: \"kubernetes.io/projected/50ac57c2-233a-40b9-9377-c8066412240c-kube-api-access-z7xl7\") pod \"cinder-operator-controller-manager-8d874c8fc-kgsvf\" (UID: \"50ac57c2-233a-40b9-9377-c8066412240c\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-kgsvf" Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.806958 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdrg2\" (UniqueName: \"kubernetes.io/projected/79d346c5-abe4-401c-9aaf-b4814a623c99-kube-api-access-sdrg2\") pod \"heat-operator-controller-manager-69d6db494d-7ssmx\" (UID: \"79d346c5-abe4-401c-9aaf-b4814a623c99\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-7ssmx" Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.807063 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wv5v\" (UniqueName: \"kubernetes.io/projected/52cbccb0-76da-4a69-b33f-6efb03721afe-kube-api-access-2wv5v\") pod \"glance-operator-controller-manager-8886f4c47-rjvw7\" (UID: \"52cbccb0-76da-4a69-b33f-6efb03721afe\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-rjvw7" Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.807137 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdxwh\" (UniqueName: \"kubernetes.io/projected/3e09ebb7-1669-4027-a2f9-f65176a6a099-kube-api-access-zdxwh\") pod \"horizon-operator-controller-manager-5fb775575f-sm2xf\" (UID: \"3e09ebb7-1669-4027-a2f9-f65176a6a099\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-sm2xf" Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.807286 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qslpf\" (UniqueName: \"kubernetes.io/projected/a3d4d0f2-f8ac-4b9d-b78c-7a6c63750fdf-kube-api-access-qslpf\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-lh4js\" (UID: \"a3d4d0f2-f8ac-4b9d-b78c-7a6c63750fdf\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-lh4js" Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.809346 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-qnjn5"] Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.820260 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-f5dcq"] Feb 02 10:53:00 crc 
kubenswrapper[4901]: I0202 10:53:00.832699 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-rmwtz"] Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.833432 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-rmwtz" Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.843367 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-rmwtz"] Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.844487 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-bx8xx" Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.847343 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-vf462"] Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.848079 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qslpf\" (UniqueName: \"kubernetes.io/projected/a3d4d0f2-f8ac-4b9d-b78c-7a6c63750fdf-kube-api-access-qslpf\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-lh4js\" (UID: \"a3d4d0f2-f8ac-4b9d-b78c-7a6c63750fdf\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-lh4js" Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.848129 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-vf462" Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.849759 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-c9lvd" Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.859273 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-7ktm2"] Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.860627 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-7ktm2" Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.864405 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-jl9lk" Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.866059 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-vf462"] Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.871932 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-827zp"] Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.873040 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-827zp"
Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.876816 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-rrdbs"
Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.880709 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-7ktm2"]
Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.881341 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7xl7\" (UniqueName: \"kubernetes.io/projected/50ac57c2-233a-40b9-9377-c8066412240c-kube-api-access-z7xl7\") pod \"cinder-operator-controller-manager-8d874c8fc-kgsvf\" (UID: \"50ac57c2-233a-40b9-9377-c8066412240c\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-kgsvf"
Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.889823 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vczl7\" (UniqueName: \"kubernetes.io/projected/256717cd-84fb-490a-9945-bed0d1f5ec7f-kube-api-access-vczl7\") pod \"designate-operator-controller-manager-6d9697b7f4-b6wnv\" (UID: \"256717cd-84fb-490a-9945-bed0d1f5ec7f\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-b6wnv"
Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.907816 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-lh4js"
Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.908098 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9zhm\" (UniqueName: \"kubernetes.io/projected/42ecefa7-b29d-4178-82c0-5520874c1d1a-kube-api-access-t9zhm\") pod \"infra-operator-controller-manager-79955696d6-vd4rg\" (UID: \"42ecefa7-b29d-4178-82c0-5520874c1d1a\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-vd4rg"
Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.908158 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdrg2\" (UniqueName: \"kubernetes.io/projected/79d346c5-abe4-401c-9aaf-b4814a623c99-kube-api-access-sdrg2\") pod \"heat-operator-controller-manager-69d6db494d-7ssmx\" (UID: \"79d346c5-abe4-401c-9aaf-b4814a623c99\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-7ssmx"
Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.908186 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wv5v\" (UniqueName: \"kubernetes.io/projected/52cbccb0-76da-4a69-b33f-6efb03721afe-kube-api-access-2wv5v\") pod \"glance-operator-controller-manager-8886f4c47-rjvw7\" (UID: \"52cbccb0-76da-4a69-b33f-6efb03721afe\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-rjvw7"
Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.908208 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdxwh\" (UniqueName: \"kubernetes.io/projected/3e09ebb7-1669-4027-a2f9-f65176a6a099-kube-api-access-zdxwh\") pod \"horizon-operator-controller-manager-5fb775575f-sm2xf\" (UID: \"3e09ebb7-1669-4027-a2f9-f65176a6a099\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-sm2xf"
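Every kube-api-access-* volume being verified and mounted in this burst is the same construct: a kubelet-generated projected volume carrying the pod's service-account token, the cluster CA bundle, and the namespace. A sketch from the public API types; the projection contents and the 3607-second token lifetime are the usual defaults, not something this log records:

```go
// Shape of a kubelet-generated kube-api-access-* projected volume.
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func main() {
	expiry := int64(3607) // typical default lifetime for these tokens
	vol := corev1.Volume{
		Name: "kube-api-access-t9zhm", // random per-pod suffix, as above
		VolumeSource: corev1.VolumeSource{
			Projected: &corev1.ProjectedVolumeSource{
				Sources: []corev1.VolumeProjection{
					{ServiceAccountToken: &corev1.ServiceAccountTokenProjection{
						Path: "token", ExpirationSeconds: &expiry}},
					{ConfigMap: &corev1.ConfigMapProjection{
						LocalObjectReference: corev1.LocalObjectReference{Name: "kube-root-ca.crt"},
						Items:                []corev1.KeyToPath{{Key: "ca.crt", Path: "ca.crt"}}}},
					{DownwardAPI: &corev1.DownwardAPIProjection{
						Items: []corev1.DownwardAPIVolumeFile{{
							Path:     "namespace",
							FieldRef: &corev1.ObjectFieldSelector{FieldPath: "metadata.namespace"}}}}},
				},
			},
		},
	}
	fmt.Println(vol.Name)
}
```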
Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.908248 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84xtr\" (UniqueName: \"kubernetes.io/projected/f5f1df90-8dfb-4eae-b0bb-6128aab24030-kube-api-access-84xtr\") pod \"ironic-operator-controller-manager-5f4b8bd54d-wccg8\" (UID: \"f5f1df90-8dfb-4eae-b0bb-6128aab24030\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-wccg8"
Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.908274 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2blt7\" (UniqueName: \"kubernetes.io/projected/c7d3103d-1aa3-4337-8e01-f60aed47ca9b-kube-api-access-2blt7\") pod \"manila-operator-controller-manager-7dd968899f-qnjn5\" (UID: \"c7d3103d-1aa3-4337-8e01-f60aed47ca9b\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-qnjn5"
Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.908296 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42ecefa7-b29d-4178-82c0-5520874c1d1a-cert\") pod \"infra-operator-controller-manager-79955696d6-vd4rg\" (UID: \"42ecefa7-b29d-4178-82c0-5520874c1d1a\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-vd4rg"
Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.908351 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86wtw\" (UniqueName: \"kubernetes.io/projected/3be47ad9-6e38-4b16-9e57-2311ef26ed5b-kube-api-access-86wtw\") pod \"keystone-operator-controller-manager-84f48565d4-f5dcq\" (UID: \"3be47ad9-6e38-4b16-9e57-2311ef26ed5b\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-f5dcq"
Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.913818 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-827zp"]
Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.929181 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-kgsvf"
Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.956813 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-b6wnv"
Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.958196 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdxwh\" (UniqueName: \"kubernetes.io/projected/3e09ebb7-1669-4027-a2f9-f65176a6a099-kube-api-access-zdxwh\") pod \"horizon-operator-controller-manager-5fb775575f-sm2xf\" (UID: \"3e09ebb7-1669-4027-a2f9-f65176a6a099\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-sm2xf"
Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.959322 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dhlwss"]
Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.960274 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dhlwss" Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.965272 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.965804 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-gcwxr" Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.972787 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wv5v\" (UniqueName: \"kubernetes.io/projected/52cbccb0-76da-4a69-b33f-6efb03721afe-kube-api-access-2wv5v\") pod \"glance-operator-controller-manager-8886f4c47-rjvw7\" (UID: \"52cbccb0-76da-4a69-b33f-6efb03721afe\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-rjvw7" Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.981746 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdrg2\" (UniqueName: \"kubernetes.io/projected/79d346c5-abe4-401c-9aaf-b4814a623c99-kube-api-access-sdrg2\") pod \"heat-operator-controller-manager-69d6db494d-7ssmx\" (UID: \"79d346c5-abe4-401c-9aaf-b4814a623c99\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-7ssmx" Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.981819 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-vccwc"] Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.984244 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-rjvw7" Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.991356 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-vccwc" Feb 02 10:53:00 crc kubenswrapper[4901]: I0202 10:53:00.995071 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-qpknr" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.000110 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-vccwc"] Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.010007 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh8tb\" (UniqueName: \"kubernetes.io/projected/c92912da-94c2-41b5-b43c-a136f96dbd1e-kube-api-access-vh8tb\") pod \"octavia-operator-controller-manager-6687f8d877-827zp\" (UID: \"c92912da-94c2-41b5-b43c-a136f96dbd1e\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-827zp" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.010053 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n85r9\" (UniqueName: \"kubernetes.io/projected/4d052f6a-df39-4bd2-aee5-8abd7a1a2882-kube-api-access-n85r9\") pod \"mariadb-operator-controller-manager-67bf948998-rmwtz\" (UID: \"4d052f6a-df39-4bd2-aee5-8abd7a1a2882\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-rmwtz" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.010084 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86wtw\" (UniqueName: \"kubernetes.io/projected/3be47ad9-6e38-4b16-9e57-2311ef26ed5b-kube-api-access-86wtw\") pod \"keystone-operator-controller-manager-84f48565d4-f5dcq\" (UID: \"3be47ad9-6e38-4b16-9e57-2311ef26ed5b\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-f5dcq" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.010108 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrn2p\" (UniqueName: \"kubernetes.io/projected/57f12111-0feb-4c93-8e3a-c0d36dee5184-kube-api-access-lrn2p\") pod \"nova-operator-controller-manager-55bff696bd-7ktm2\" (UID: \"57f12111-0feb-4c93-8e3a-c0d36dee5184\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-7ktm2" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.010134 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9zhm\" (UniqueName: \"kubernetes.io/projected/42ecefa7-b29d-4178-82c0-5520874c1d1a-kube-api-access-t9zhm\") pod \"infra-operator-controller-manager-79955696d6-vd4rg\" (UID: \"42ecefa7-b29d-4178-82c0-5520874c1d1a\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-vd4rg" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.010173 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrtfn\" (UniqueName: \"kubernetes.io/projected/2077e455-81ea-4c9a-b4cf-1304d990ee88-kube-api-access-lrtfn\") pod \"neutron-operator-controller-manager-585dbc889-vf462\" (UID: \"2077e455-81ea-4c9a-b4cf-1304d990ee88\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-vf462" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.010198 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84xtr\" (UniqueName: 
\"kubernetes.io/projected/f5f1df90-8dfb-4eae-b0bb-6128aab24030-kube-api-access-84xtr\") pod \"ironic-operator-controller-manager-5f4b8bd54d-wccg8\" (UID: \"f5f1df90-8dfb-4eae-b0bb-6128aab24030\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-wccg8" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.010219 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2blt7\" (UniqueName: \"kubernetes.io/projected/c7d3103d-1aa3-4337-8e01-f60aed47ca9b-kube-api-access-2blt7\") pod \"manila-operator-controller-manager-7dd968899f-qnjn5\" (UID: \"c7d3103d-1aa3-4337-8e01-f60aed47ca9b\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-qnjn5" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.010239 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42ecefa7-b29d-4178-82c0-5520874c1d1a-cert\") pod \"infra-operator-controller-manager-79955696d6-vd4rg\" (UID: \"42ecefa7-b29d-4178-82c0-5520874c1d1a\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-vd4rg" Feb 02 10:53:01 crc kubenswrapper[4901]: E0202 10:53:01.010357 4901 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 02 10:53:01 crc kubenswrapper[4901]: E0202 10:53:01.010403 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42ecefa7-b29d-4178-82c0-5520874c1d1a-cert podName:42ecefa7-b29d-4178-82c0-5520874c1d1a nodeName:}" failed. No retries permitted until 2026-02-02 10:53:01.510388827 +0000 UTC m=+868.528728923 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/42ecefa7-b29d-4178-82c0-5520874c1d1a-cert") pod "infra-operator-controller-manager-79955696d6-vd4rg" (UID: "42ecefa7-b29d-4178-82c0-5520874c1d1a") : secret "infra-operator-webhook-server-cert" not found Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.012302 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-dxjpt"] Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.012869 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-7ssmx" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.015646 4901 util.go:30] "No sandbox for pod can be found. 
Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.015646 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-dxjpt"
Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.021186 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-6htl8"
Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.031736 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84xtr\" (UniqueName: \"kubernetes.io/projected/f5f1df90-8dfb-4eae-b0bb-6128aab24030-kube-api-access-84xtr\") pod \"ironic-operator-controller-manager-5f4b8bd54d-wccg8\" (UID: \"f5f1df90-8dfb-4eae-b0bb-6128aab24030\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-wccg8"
Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.033095 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dhlwss"]
Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.034918 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2blt7\" (UniqueName: \"kubernetes.io/projected/c7d3103d-1aa3-4337-8e01-f60aed47ca9b-kube-api-access-2blt7\") pod \"manila-operator-controller-manager-7dd968899f-qnjn5\" (UID: \"c7d3103d-1aa3-4337-8e01-f60aed47ca9b\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-qnjn5"
Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.036414 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86wtw\" (UniqueName: \"kubernetes.io/projected/3be47ad9-6e38-4b16-9e57-2311ef26ed5b-kube-api-access-86wtw\") pod \"keystone-operator-controller-manager-84f48565d4-f5dcq\" (UID: \"3be47ad9-6e38-4b16-9e57-2311ef26ed5b\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-f5dcq"
Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.039342 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-sm2xf"
Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.041336 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9zhm\" (UniqueName: \"kubernetes.io/projected/42ecefa7-b29d-4178-82c0-5520874c1d1a-kube-api-access-t9zhm\") pod \"infra-operator-controller-manager-79955696d6-vd4rg\" (UID: \"42ecefa7-b29d-4178-82c0-5520874c1d1a\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-vd4rg"
Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.049460 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-xkdvn"]
Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.050404 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-xkdvn"
Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.053987 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-n84wj"
Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.055292 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-wccg8" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.057367 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-dxjpt"] Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.072125 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-84dbcd4d6-strlk"] Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.073093 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-84dbcd4d6-strlk" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.075738 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-znkdt" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.093310 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-xkdvn"] Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.103819 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-84dbcd4d6-strlk"] Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.112647 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnkwk\" (UniqueName: \"kubernetes.io/projected/27222341-69e8-4b3c-b6c2-e3d5c644e8c3-kube-api-access-cnkwk\") pod \"placement-operator-controller-manager-5b964cf4cd-dxjpt\" (UID: \"27222341-69e8-4b3c-b6c2-e3d5c644e8c3\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-dxjpt" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.112716 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8blb\" (UniqueName: \"kubernetes.io/projected/4c4d76b0-aadf-4949-a131-a43c226e38a2-kube-api-access-j8blb\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dhlwss\" (UID: \"4c4d76b0-aadf-4949-a131-a43c226e38a2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dhlwss" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.112735 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c4d76b0-aadf-4949-a131-a43c226e38a2-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dhlwss\" (UID: \"4c4d76b0-aadf-4949-a131-a43c226e38a2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dhlwss" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.112762 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh8tb\" (UniqueName: \"kubernetes.io/projected/c92912da-94c2-41b5-b43c-a136f96dbd1e-kube-api-access-vh8tb\") pod \"octavia-operator-controller-manager-6687f8d877-827zp\" (UID: \"c92912da-94c2-41b5-b43c-a136f96dbd1e\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-827zp" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.112785 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n85r9\" (UniqueName: \"kubernetes.io/projected/4d052f6a-df39-4bd2-aee5-8abd7a1a2882-kube-api-access-n85r9\") pod 
\"mariadb-operator-controller-manager-67bf948998-rmwtz\" (UID: \"4d052f6a-df39-4bd2-aee5-8abd7a1a2882\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-rmwtz" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.112811 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrn2p\" (UniqueName: \"kubernetes.io/projected/57f12111-0feb-4c93-8e3a-c0d36dee5184-kube-api-access-lrn2p\") pod \"nova-operator-controller-manager-55bff696bd-7ktm2\" (UID: \"57f12111-0feb-4c93-8e3a-c0d36dee5184\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-7ktm2" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.112854 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vgth\" (UniqueName: \"kubernetes.io/projected/3952ac22-26a4-4b08-a45c-9d8db8597333-kube-api-access-9vgth\") pod \"ovn-operator-controller-manager-788c46999f-vccwc\" (UID: \"3952ac22-26a4-4b08-a45c-9d8db8597333\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-vccwc" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.112879 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrtfn\" (UniqueName: \"kubernetes.io/projected/2077e455-81ea-4c9a-b4cf-1304d990ee88-kube-api-access-lrtfn\") pod \"neutron-operator-controller-manager-585dbc889-vf462\" (UID: \"2077e455-81ea-4c9a-b4cf-1304d990ee88\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-vf462" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.123033 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-f5dcq" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.129783 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-qnjn5" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.131342 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrn2p\" (UniqueName: \"kubernetes.io/projected/57f12111-0feb-4c93-8e3a-c0d36dee5184-kube-api-access-lrn2p\") pod \"nova-operator-controller-manager-55bff696bd-7ktm2\" (UID: \"57f12111-0feb-4c93-8e3a-c0d36dee5184\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-7ktm2" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.133457 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-vtjmm"] Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.134959 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n85r9\" (UniqueName: \"kubernetes.io/projected/4d052f6a-df39-4bd2-aee5-8abd7a1a2882-kube-api-access-n85r9\") pod \"mariadb-operator-controller-manager-67bf948998-rmwtz\" (UID: \"4d052f6a-df39-4bd2-aee5-8abd7a1a2882\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-rmwtz" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.146074 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh8tb\" (UniqueName: \"kubernetes.io/projected/c92912da-94c2-41b5-b43c-a136f96dbd1e-kube-api-access-vh8tb\") pod \"octavia-operator-controller-manager-6687f8d877-827zp\" (UID: \"c92912da-94c2-41b5-b43c-a136f96dbd1e\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-827zp" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.149120 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-vtjmm" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.150529 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-vtjmm"] Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.151120 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrtfn\" (UniqueName: \"kubernetes.io/projected/2077e455-81ea-4c9a-b4cf-1304d990ee88-kube-api-access-lrtfn\") pod \"neutron-operator-controller-manager-585dbc889-vf462\" (UID: \"2077e455-81ea-4c9a-b4cf-1304d990ee88\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-vf462" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.159838 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-kr5m6" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.218178 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zll5f\" (UniqueName: \"kubernetes.io/projected/3676f3d9-d77a-4809-bf9e-0e5ba2bea27c-kube-api-access-zll5f\") pod \"swift-operator-controller-manager-68fc8c869-xkdvn\" (UID: \"3676f3d9-d77a-4809-bf9e-0e5ba2bea27c\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-xkdvn" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.218277 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vgth\" (UniqueName: \"kubernetes.io/projected/3952ac22-26a4-4b08-a45c-9d8db8597333-kube-api-access-9vgth\") pod \"ovn-operator-controller-manager-788c46999f-vccwc\" (UID: 
\"3952ac22-26a4-4b08-a45c-9d8db8597333\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-vccwc" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.218327 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z74d\" (UniqueName: \"kubernetes.io/projected/9c2b3b8f-1088-4696-899e-d0d5c3f2bf2d-kube-api-access-2z74d\") pod \"telemetry-operator-controller-manager-84dbcd4d6-strlk\" (UID: \"9c2b3b8f-1088-4696-899e-d0d5c3f2bf2d\") " pod="openstack-operators/telemetry-operator-controller-manager-84dbcd4d6-strlk" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.218365 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnkwk\" (UniqueName: \"kubernetes.io/projected/27222341-69e8-4b3c-b6c2-e3d5c644e8c3-kube-api-access-cnkwk\") pod \"placement-operator-controller-manager-5b964cf4cd-dxjpt\" (UID: \"27222341-69e8-4b3c-b6c2-e3d5c644e8c3\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-dxjpt" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.218434 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8blb\" (UniqueName: \"kubernetes.io/projected/4c4d76b0-aadf-4949-a131-a43c226e38a2-kube-api-access-j8blb\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dhlwss\" (UID: \"4c4d76b0-aadf-4949-a131-a43c226e38a2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dhlwss" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.218456 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c4d76b0-aadf-4949-a131-a43c226e38a2-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dhlwss\" (UID: \"4c4d76b0-aadf-4949-a131-a43c226e38a2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dhlwss" Feb 02 10:53:01 crc kubenswrapper[4901]: E0202 10:53:01.220481 4901 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 02 10:53:01 crc kubenswrapper[4901]: E0202 10:53:01.220536 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c4d76b0-aadf-4949-a131-a43c226e38a2-cert podName:4c4d76b0-aadf-4949-a131-a43c226e38a2 nodeName:}" failed. No retries permitted until 2026-02-02 10:53:01.720518041 +0000 UTC m=+868.738858137 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4c4d76b0-aadf-4949-a131-a43c226e38a2-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4dhlwss" (UID: "4c4d76b0-aadf-4949-a131-a43c226e38a2") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.233128 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-njrpb"] Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.237188 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-564965969-njrpb" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.239493 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-njrpb"] Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.240740 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-kzgv4" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.244376 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vgth\" (UniqueName: \"kubernetes.io/projected/3952ac22-26a4-4b08-a45c-9d8db8597333-kube-api-access-9vgth\") pod \"ovn-operator-controller-manager-788c46999f-vccwc\" (UID: \"3952ac22-26a4-4b08-a45c-9d8db8597333\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-vccwc" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.244473 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8blb\" (UniqueName: \"kubernetes.io/projected/4c4d76b0-aadf-4949-a131-a43c226e38a2-kube-api-access-j8blb\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dhlwss\" (UID: \"4c4d76b0-aadf-4949-a131-a43c226e38a2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dhlwss" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.271433 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-rmwtz" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.285240 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnkwk\" (UniqueName: \"kubernetes.io/projected/27222341-69e8-4b3c-b6c2-e3d5c644e8c3-kube-api-access-cnkwk\") pod \"placement-operator-controller-manager-5b964cf4cd-dxjpt\" (UID: \"27222341-69e8-4b3c-b6c2-e3d5c644e8c3\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-dxjpt" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.328046 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z74d\" (UniqueName: \"kubernetes.io/projected/9c2b3b8f-1088-4696-899e-d0d5c3f2bf2d-kube-api-access-2z74d\") pod \"telemetry-operator-controller-manager-84dbcd4d6-strlk\" (UID: \"9c2b3b8f-1088-4696-899e-d0d5c3f2bf2d\") " pod="openstack-operators/telemetry-operator-controller-manager-84dbcd4d6-strlk" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.328343 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x2kj\" (UniqueName: \"kubernetes.io/projected/93760c22-c570-47d0-a0a8-a0e089ee1461-kube-api-access-2x2kj\") pod \"test-operator-controller-manager-56f8bfcd9f-vtjmm\" (UID: \"93760c22-c570-47d0-a0a8-a0e089ee1461\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-vtjmm" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.328734 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zll5f\" (UniqueName: \"kubernetes.io/projected/3676f3d9-d77a-4809-bf9e-0e5ba2bea27c-kube-api-access-zll5f\") pod \"swift-operator-controller-manager-68fc8c869-xkdvn\" (UID: \"3676f3d9-d77a-4809-bf9e-0e5ba2bea27c\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-xkdvn" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 
10:53:01.332151 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-vf462" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.341693 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-7ktm2" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.365415 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-827zp" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.371690 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z74d\" (UniqueName: \"kubernetes.io/projected/9c2b3b8f-1088-4696-899e-d0d5c3f2bf2d-kube-api-access-2z74d\") pod \"telemetry-operator-controller-manager-84dbcd4d6-strlk\" (UID: \"9c2b3b8f-1088-4696-899e-d0d5c3f2bf2d\") " pod="openstack-operators/telemetry-operator-controller-manager-84dbcd4d6-strlk" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.388908 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zll5f\" (UniqueName: \"kubernetes.io/projected/3676f3d9-d77a-4809-bf9e-0e5ba2bea27c-kube-api-access-zll5f\") pod \"swift-operator-controller-manager-68fc8c869-xkdvn\" (UID: \"3676f3d9-d77a-4809-bf9e-0e5ba2bea27c\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-xkdvn" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.407424 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-vccwc" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.413609 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6765c97497-ngsw7"] Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.415151 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6765c97497-ngsw7" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.421504 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-q5fmr" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.421677 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.421726 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.433270 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbhfh\" (UniqueName: \"kubernetes.io/projected/9f38ac2f-605c-413e-8bdc-ae236d52bd55-kube-api-access-wbhfh\") pod \"watcher-operator-controller-manager-564965969-njrpb\" (UID: \"9f38ac2f-605c-413e-8bdc-ae236d52bd55\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-njrpb" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.433340 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x2kj\" (UniqueName: \"kubernetes.io/projected/93760c22-c570-47d0-a0a8-a0e089ee1461-kube-api-access-2x2kj\") pod \"test-operator-controller-manager-56f8bfcd9f-vtjmm\" (UID: \"93760c22-c570-47d0-a0a8-a0e089ee1461\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-vtjmm" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.436174 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-dxjpt" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.441810 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6765c97497-ngsw7"] Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.454325 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x2kj\" (UniqueName: \"kubernetes.io/projected/93760c22-c570-47d0-a0a8-a0e089ee1461-kube-api-access-2x2kj\") pod \"test-operator-controller-manager-56f8bfcd9f-vtjmm\" (UID: \"93760c22-c570-47d0-a0a8-a0e089ee1461\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-vtjmm" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.457153 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-xkdvn" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.505620 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6td2q"] Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.523259 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6td2q" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.524159 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-84dbcd4d6-strlk" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.539379 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42ecefa7-b29d-4178-82c0-5520874c1d1a-cert\") pod \"infra-operator-controller-manager-79955696d6-vd4rg\" (UID: \"42ecefa7-b29d-4178-82c0-5520874c1d1a\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-vd4rg" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.539512 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/626937bd-8794-43dd-ab0a-77a94440bb05-webhook-certs\") pod \"openstack-operator-controller-manager-6765c97497-ngsw7\" (UID: \"626937bd-8794-43dd-ab0a-77a94440bb05\") " pod="openstack-operators/openstack-operator-controller-manager-6765c97497-ngsw7" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.539613 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/626937bd-8794-43dd-ab0a-77a94440bb05-metrics-certs\") pod \"openstack-operator-controller-manager-6765c97497-ngsw7\" (UID: \"626937bd-8794-43dd-ab0a-77a94440bb05\") " pod="openstack-operators/openstack-operator-controller-manager-6765c97497-ngsw7" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.539738 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbhfh\" (UniqueName: \"kubernetes.io/projected/9f38ac2f-605c-413e-8bdc-ae236d52bd55-kube-api-access-wbhfh\") pod \"watcher-operator-controller-manager-564965969-njrpb\" (UID: \"9f38ac2f-605c-413e-8bdc-ae236d52bd55\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-njrpb" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.539855 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6zn2\" (UniqueName: \"kubernetes.io/projected/626937bd-8794-43dd-ab0a-77a94440bb05-kube-api-access-f6zn2\") pod \"openstack-operator-controller-manager-6765c97497-ngsw7\" (UID: \"626937bd-8794-43dd-ab0a-77a94440bb05\") " pod="openstack-operators/openstack-operator-controller-manager-6765c97497-ngsw7" Feb 02 10:53:01 crc kubenswrapper[4901]: E0202 10:53:01.539869 4901 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 02 10:53:01 crc kubenswrapper[4901]: E0202 10:53:01.539960 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42ecefa7-b29d-4178-82c0-5520874c1d1a-cert podName:42ecefa7-b29d-4178-82c0-5520874c1d1a nodeName:}" failed. No retries permitted until 2026-02-02 10:53:02.539933815 +0000 UTC m=+869.558273911 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/42ecefa7-b29d-4178-82c0-5520874c1d1a-cert") pod "infra-operator-controller-manager-79955696d6-vd4rg" (UID: "42ecefa7-b29d-4178-82c0-5520874c1d1a") : secret "infra-operator-webhook-server-cert" not found Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.543403 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-s2rbj" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.543670 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6td2q"] Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.577697 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-vtjmm" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.592822 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbhfh\" (UniqueName: \"kubernetes.io/projected/9f38ac2f-605c-413e-8bdc-ae236d52bd55-kube-api-access-wbhfh\") pod \"watcher-operator-controller-manager-564965969-njrpb\" (UID: \"9f38ac2f-605c-413e-8bdc-ae236d52bd55\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-njrpb" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.593299 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-564965969-njrpb" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.643142 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/626937bd-8794-43dd-ab0a-77a94440bb05-webhook-certs\") pod \"openstack-operator-controller-manager-6765c97497-ngsw7\" (UID: \"626937bd-8794-43dd-ab0a-77a94440bb05\") " pod="openstack-operators/openstack-operator-controller-manager-6765c97497-ngsw7" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.643205 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/626937bd-8794-43dd-ab0a-77a94440bb05-metrics-certs\") pod \"openstack-operator-controller-manager-6765c97497-ngsw7\" (UID: \"626937bd-8794-43dd-ab0a-77a94440bb05\") " pod="openstack-operators/openstack-operator-controller-manager-6765c97497-ngsw7" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.643273 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hdz6\" (UniqueName: \"kubernetes.io/projected/2d39bba8-c1e7-4247-b938-616c9774c9a7-kube-api-access-8hdz6\") pod \"rabbitmq-cluster-operator-manager-668c99d594-6td2q\" (UID: \"2d39bba8-c1e7-4247-b938-616c9774c9a7\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6td2q" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.643294 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6zn2\" (UniqueName: \"kubernetes.io/projected/626937bd-8794-43dd-ab0a-77a94440bb05-kube-api-access-f6zn2\") pod \"openstack-operator-controller-manager-6765c97497-ngsw7\" (UID: \"626937bd-8794-43dd-ab0a-77a94440bb05\") " pod="openstack-operators/openstack-operator-controller-manager-6765c97497-ngsw7" Feb 02 10:53:01 crc kubenswrapper[4901]: E0202 10:53:01.643793 4901 secret.go:188] Couldn't get secret 
openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 02 10:53:01 crc kubenswrapper[4901]: E0202 10:53:01.643839 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/626937bd-8794-43dd-ab0a-77a94440bb05-webhook-certs podName:626937bd-8794-43dd-ab0a-77a94440bb05 nodeName:}" failed. No retries permitted until 2026-02-02 10:53:02.143825877 +0000 UTC m=+869.162165973 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/626937bd-8794-43dd-ab0a-77a94440bb05-webhook-certs") pod "openstack-operator-controller-manager-6765c97497-ngsw7" (UID: "626937bd-8794-43dd-ab0a-77a94440bb05") : secret "webhook-server-cert" not found Feb 02 10:53:01 crc kubenswrapper[4901]: E0202 10:53:01.643932 4901 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 02 10:53:01 crc kubenswrapper[4901]: E0202 10:53:01.644032 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/626937bd-8794-43dd-ab0a-77a94440bb05-metrics-certs podName:626937bd-8794-43dd-ab0a-77a94440bb05 nodeName:}" failed. No retries permitted until 2026-02-02 10:53:02.144005042 +0000 UTC m=+869.162345138 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/626937bd-8794-43dd-ab0a-77a94440bb05-metrics-certs") pod "openstack-operator-controller-manager-6765c97497-ngsw7" (UID: "626937bd-8794-43dd-ab0a-77a94440bb05") : secret "metrics-server-cert" not found Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.674217 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6zn2\" (UniqueName: \"kubernetes.io/projected/626937bd-8794-43dd-ab0a-77a94440bb05-kube-api-access-f6zn2\") pod \"openstack-operator-controller-manager-6765c97497-ngsw7\" (UID: \"626937bd-8794-43dd-ab0a-77a94440bb05\") " pod="openstack-operators/openstack-operator-controller-manager-6765c97497-ngsw7" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.717994 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-b6wnv"] Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.718041 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-lh4js"] Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.748756 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c4d76b0-aadf-4949-a131-a43c226e38a2-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dhlwss\" (UID: \"4c4d76b0-aadf-4949-a131-a43c226e38a2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dhlwss" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.749022 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hdz6\" (UniqueName: \"kubernetes.io/projected/2d39bba8-c1e7-4247-b938-616c9774c9a7-kube-api-access-8hdz6\") pod \"rabbitmq-cluster-operator-manager-668c99d594-6td2q\" (UID: \"2d39bba8-c1e7-4247-b938-616c9774c9a7\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6td2q" Feb 02 10:53:01 crc kubenswrapper[4901]: E0202 10:53:01.749392 4901 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret 
"openstack-baremetal-operator-webhook-server-cert" not found Feb 02 10:53:01 crc kubenswrapper[4901]: E0202 10:53:01.749465 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c4d76b0-aadf-4949-a131-a43c226e38a2-cert podName:4c4d76b0-aadf-4949-a131-a43c226e38a2 nodeName:}" failed. No retries permitted until 2026-02-02 10:53:02.749445094 +0000 UTC m=+869.767785230 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4c4d76b0-aadf-4949-a131-a43c226e38a2-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4dhlwss" (UID: "4c4d76b0-aadf-4949-a131-a43c226e38a2") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.783471 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hdz6\" (UniqueName: \"kubernetes.io/projected/2d39bba8-c1e7-4247-b938-616c9774c9a7-kube-api-access-8hdz6\") pod \"rabbitmq-cluster-operator-manager-668c99d594-6td2q\" (UID: \"2d39bba8-c1e7-4247-b938-616c9774c9a7\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6td2q" Feb 02 10:53:01 crc kubenswrapper[4901]: I0202 10:53:01.888354 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6td2q" Feb 02 10:53:02 crc kubenswrapper[4901]: I0202 10:53:02.107189 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-b6wnv" event={"ID":"256717cd-84fb-490a-9945-bed0d1f5ec7f","Type":"ContainerStarted","Data":"b8edb65ad25b2f3d5243ada6c3ebab5ff37d603999dc540da97b9e706343ed4f"} Feb 02 10:53:02 crc kubenswrapper[4901]: I0202 10:53:02.108084 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-lh4js" event={"ID":"a3d4d0f2-f8ac-4b9d-b78c-7a6c63750fdf","Type":"ContainerStarted","Data":"808a9c27ac7ddfc29b6d0c8f36c9bd9447cae85323ecd9a6a9b1c20745886990"} Feb 02 10:53:02 crc kubenswrapper[4901]: I0202 10:53:02.133863 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-rjvw7"] Feb 02 10:53:02 crc kubenswrapper[4901]: I0202 10:53:02.142959 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-7ssmx"] Feb 02 10:53:02 crc kubenswrapper[4901]: I0202 10:53:02.152520 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-sm2xf"] Feb 02 10:53:02 crc kubenswrapper[4901]: I0202 10:53:02.157263 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/626937bd-8794-43dd-ab0a-77a94440bb05-webhook-certs\") pod \"openstack-operator-controller-manager-6765c97497-ngsw7\" (UID: \"626937bd-8794-43dd-ab0a-77a94440bb05\") " pod="openstack-operators/openstack-operator-controller-manager-6765c97497-ngsw7" Feb 02 10:53:02 crc kubenswrapper[4901]: I0202 10:53:02.157322 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/626937bd-8794-43dd-ab0a-77a94440bb05-metrics-certs\") pod \"openstack-operator-controller-manager-6765c97497-ngsw7\" (UID: \"626937bd-8794-43dd-ab0a-77a94440bb05\") " 
pod="openstack-operators/openstack-operator-controller-manager-6765c97497-ngsw7" Feb 02 10:53:02 crc kubenswrapper[4901]: E0202 10:53:02.157499 4901 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 02 10:53:02 crc kubenswrapper[4901]: E0202 10:53:02.157565 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/626937bd-8794-43dd-ab0a-77a94440bb05-metrics-certs podName:626937bd-8794-43dd-ab0a-77a94440bb05 nodeName:}" failed. No retries permitted until 2026-02-02 10:53:03.157548848 +0000 UTC m=+870.175888944 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/626937bd-8794-43dd-ab0a-77a94440bb05-metrics-certs") pod "openstack-operator-controller-manager-6765c97497-ngsw7" (UID: "626937bd-8794-43dd-ab0a-77a94440bb05") : secret "metrics-server-cert" not found Feb 02 10:53:02 crc kubenswrapper[4901]: I0202 10:53:02.157910 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-kgsvf"] Feb 02 10:53:02 crc kubenswrapper[4901]: E0202 10:53:02.157980 4901 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 02 10:53:02 crc kubenswrapper[4901]: E0202 10:53:02.158003 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/626937bd-8794-43dd-ab0a-77a94440bb05-webhook-certs podName:626937bd-8794-43dd-ab0a-77a94440bb05 nodeName:}" failed. No retries permitted until 2026-02-02 10:53:03.157996839 +0000 UTC m=+870.176336935 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/626937bd-8794-43dd-ab0a-77a94440bb05-webhook-certs") pod "openstack-operator-controller-manager-6765c97497-ngsw7" (UID: "626937bd-8794-43dd-ab0a-77a94440bb05") : secret "webhook-server-cert" not found Feb 02 10:53:02 crc kubenswrapper[4901]: W0202 10:53:02.159006 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e09ebb7_1669_4027_a2f9_f65176a6a099.slice/crio-ad5d10030b670c235535e2e45fcbd5d37f5acea693ddee1339cd9f3b56db4678 WatchSource:0}: Error finding container ad5d10030b670c235535e2e45fcbd5d37f5acea693ddee1339cd9f3b56db4678: Status 404 returned error can't find the container with id ad5d10030b670c235535e2e45fcbd5d37f5acea693ddee1339cd9f3b56db4678 Feb 02 10:53:02 crc kubenswrapper[4901]: W0202 10:53:02.263809 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5f1df90_8dfb_4eae_b0bb_6128aab24030.slice/crio-1ec2dfeaea7f78b2ff200158ca8fd2f1901b646025959ca4786be8d35ad7878a WatchSource:0}: Error finding container 1ec2dfeaea7f78b2ff200158ca8fd2f1901b646025959ca4786be8d35ad7878a: Status 404 returned error can't find the container with id 1ec2dfeaea7f78b2ff200158ca8fd2f1901b646025959ca4786be8d35ad7878a Feb 02 10:53:02 crc kubenswrapper[4901]: I0202 10:53:02.269650 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-wccg8"] Feb 02 10:53:02 crc kubenswrapper[4901]: W0202 10:53:02.277888 4901 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7d3103d_1aa3_4337_8e01_f60aed47ca9b.slice/crio-cbc3d29afc68c08415210a952da8d87b43dfdd0bc22ddc8b7eecc3d8962b09f0 WatchSource:0}: Error finding container cbc3d29afc68c08415210a952da8d87b43dfdd0bc22ddc8b7eecc3d8962b09f0: Status 404 returned error can't find the container with id cbc3d29afc68c08415210a952da8d87b43dfdd0bc22ddc8b7eecc3d8962b09f0 Feb 02 10:53:02 crc kubenswrapper[4901]: I0202 10:53:02.284166 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-rmwtz"] Feb 02 10:53:02 crc kubenswrapper[4901]: I0202 10:53:02.300150 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-qnjn5"] Feb 02 10:53:02 crc kubenswrapper[4901]: I0202 10:53:02.364184 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-f5dcq"] Feb 02 10:53:02 crc kubenswrapper[4901]: I0202 10:53:02.376255 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-xkdvn"] Feb 02 10:53:02 crc kubenswrapper[4901]: W0202 10:53:02.381458 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3676f3d9_d77a_4809_bf9e_0e5ba2bea27c.slice/crio-056785d8e6966b1a1941203b6637c8590e6049683e0225df7e6ecd55e500468b WatchSource:0}: Error finding container 056785d8e6966b1a1941203b6637c8590e6049683e0225df7e6ecd55e500468b: Status 404 returned error can't find the container with id 056785d8e6966b1a1941203b6637c8590e6049683e0225df7e6ecd55e500468b Feb 02 10:53:02 crc kubenswrapper[4901]: I0202 10:53:02.383654 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-7ktm2"] Feb 02 10:53:02 crc kubenswrapper[4901]: W0202 10:53:02.389452 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2077e455_81ea_4c9a_b4cf_1304d990ee88.slice/crio-f87373376c7270c6fbe23d0d8d10e10c04090a970616bac92978c8bf3930d29b WatchSource:0}: Error finding container f87373376c7270c6fbe23d0d8d10e10c04090a970616bac92978c8bf3930d29b: Status 404 returned error can't find the container with id f87373376c7270c6fbe23d0d8d10e10c04090a970616bac92978c8bf3930d29b Feb 02 10:53:02 crc kubenswrapper[4901]: I0202 10:53:02.393656 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-vf462"] Feb 02 10:53:02 crc kubenswrapper[4901]: E0202 10:53:02.401020 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:ea7b72b648a5bde2eebd804c2a5c1608d448a4892176c1b8d000c1eef4bb92b4,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m 
DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9vgth,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-788c46999f-vccwc_openstack-operators(3952ac22-26a4-4b08-a45c-9d8db8597333): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 02 10:53:02 crc kubenswrapper[4901]: E0202 10:53:02.402161 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-vccwc" podUID="3952ac22-26a4-4b08-a45c-9d8db8597333" Feb 02 10:53:02 crc kubenswrapper[4901]: I0202 10:53:02.402993 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-vccwc"] Feb 02 10:53:02 crc kubenswrapper[4901]: E0202 10:53:02.404528 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cnkwk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5b964cf4cd-dxjpt_openstack-operators(27222341-69e8-4b3c-b6c2-e3d5c644e8c3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 02 10:53:02 crc kubenswrapper[4901]: E0202 10:53:02.414686 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-dxjpt" podUID="27222341-69e8-4b3c-b6c2-e3d5c644e8c3" Feb 02 10:53:02 crc kubenswrapper[4901]: I0202 10:53:02.420393 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-dxjpt"] Feb 02 10:53:02 crc kubenswrapper[4901]: W0202 10:53:02.516807 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f38ac2f_605c_413e_8bdc_ae236d52bd55.slice/crio-397e99f639f0bb3bd09b67c4b54c7515cb6945963c2d5996f474048789d1e17a WatchSource:0}: Error finding container 397e99f639f0bb3bd09b67c4b54c7515cb6945963c2d5996f474048789d1e17a: Status 404 returned error can't find the container with id 397e99f639f0bb3bd09b67c4b54c7515cb6945963c2d5996f474048789d1e17a Feb 02 10:53:02 crc kubenswrapper[4901]: I0202 10:53:02.519551 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-njrpb"] Feb 02 10:53:02 crc kubenswrapper[4901]: E0202 10:53:02.520644 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:7869203f6f97de780368d507636031090fed3b658d2f7771acbd4481bdfc870b,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wbhfh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-564965969-njrpb_openstack-operators(9f38ac2f-605c-413e-8bdc-ae236d52bd55): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 02 10:53:02 crc kubenswrapper[4901]: E0202 10:53:02.521881 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-564965969-njrpb" podUID="9f38ac2f-605c-413e-8bdc-ae236d52bd55" Feb 02 10:53:02 crc kubenswrapper[4901]: W0202 10:53:02.523375 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93760c22_c570_47d0_a0a8_a0e089ee1461.slice/crio-0a78ffc2dfeaa68502caac78af189932ed28e4344a3660fe0787951fdacf6ea6 WatchSource:0}: Error finding container 0a78ffc2dfeaa68502caac78af189932ed28e4344a3660fe0787951fdacf6ea6: Status 404 returned error can't find the container with id 0a78ffc2dfeaa68502caac78af189932ed28e4344a3660fe0787951fdacf6ea6 Feb 02 10:53:02 crc kubenswrapper[4901]: I0202 10:53:02.524692 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-vtjmm"] Feb 02 10:53:02 crc kubenswrapper[4901]: E0202 10:53:02.526036 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:3e01e99d3ca1b6c20b1bb015b00cfcbffc584f22a93dc6fe4019d63b813c0241,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2x2kj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-56f8bfcd9f-vtjmm_openstack-operators(93760c22-c570-47d0-a0a8-a0e089ee1461): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 02 10:53:02 crc kubenswrapper[4901]: E0202 10:53:02.528223 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-vtjmm" podUID="93760c22-c570-47d0-a0a8-a0e089ee1461" Feb 02 10:53:02 crc kubenswrapper[4901]: W0202 10:53:02.528311 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d39bba8_c1e7_4247_b938_616c9774c9a7.slice/crio-0aac58d8e4a91776ac52add36c52c9c4e39215c139cf814d8c24d9c0f579b956 WatchSource:0}: Error finding container 0aac58d8e4a91776ac52add36c52c9c4e39215c139cf814d8c24d9c0f579b956: Status 404 returned error can't find the container with id 0aac58d8e4a91776ac52add36c52c9c4e39215c139cf814d8c24d9c0f579b956 Feb 02 10:53:02 crc kubenswrapper[4901]: E0202 10:53:02.531941 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8hdz6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-6td2q_openstack-operators(2d39bba8-c1e7-4247-b938-616c9774c9a7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 02 10:53:02 crc kubenswrapper[4901]: I0202 10:53:02.532017 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6td2q"] Feb 02 10:53:02 crc kubenswrapper[4901]: E0202 10:53:02.533348 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6td2q" podUID="2d39bba8-c1e7-4247-b938-616c9774c9a7" Feb 02 10:53:02 crc kubenswrapper[4901]: I0202 10:53:02.536373 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-84dbcd4d6-strlk"] Feb 02 10:53:02 crc kubenswrapper[4901]: W0202 10:53:02.538539 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c2b3b8f_1088_4696_899e_d0d5c3f2bf2d.slice/crio-fbb8ab49e1be85a4ad2a3bc237f64aeb9751e2e50cf75ff236a5621140080b09 WatchSource:0}: Error finding container fbb8ab49e1be85a4ad2a3bc237f64aeb9751e2e50cf75ff236a5621140080b09: Status 404 returned error can't find the container with id fbb8ab49e1be85a4ad2a3bc237f64aeb9751e2e50cf75ff236a5621140080b09 Feb 02 10:53:02 crc kubenswrapper[4901]: I0202 10:53:02.540786 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-827zp"] Feb 02 10:53:02 crc kubenswrapper[4901]: 
W0202 10:53:02.541698 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc92912da_94c2_41b5_b43c_a136f96dbd1e.slice/crio-1390fbb21b1cb8fd6940aa60d718e7e94a422348d49af5efe1257664c4384907 WatchSource:0}: Error finding container 1390fbb21b1cb8fd6940aa60d718e7e94a422348d49af5efe1257664c4384907: Status 404 returned error can't find the container with id 1390fbb21b1cb8fd6940aa60d718e7e94a422348d49af5efe1257664c4384907 Feb 02 10:53:02 crc kubenswrapper[4901]: E0202 10:53:02.542365 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.145:5001/openstack-k8s-operators/telemetry-operator:b2c5dab9eea05a087b6cf44f7f86794324fc86bd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2z74d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-84dbcd4d6-strlk_openstack-operators(9c2b3b8f-1088-4696-899e-d0d5c3f2bf2d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 02 10:53:02 crc kubenswrapper[4901]: E0202 10:53:02.543599 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-84dbcd4d6-strlk" podUID="9c2b3b8f-1088-4696-899e-d0d5c3f2bf2d" Feb 02 10:53:02 crc kubenswrapper[4901]: E0202 10:53:02.544689 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:e6f2f361f1dcbb321407a5884951e16ff96e7b88942b10b548f27ad4de14a0be,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vh8tb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-6687f8d877-827zp_openstack-operators(c92912da-94c2-41b5-b43c-a136f96dbd1e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 02 10:53:02 crc kubenswrapper[4901]: E0202 10:53:02.546004 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-827zp" podUID="c92912da-94c2-41b5-b43c-a136f96dbd1e" Feb 02 10:53:02 crc kubenswrapper[4901]: I0202 10:53:02.565662 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42ecefa7-b29d-4178-82c0-5520874c1d1a-cert\") pod \"infra-operator-controller-manager-79955696d6-vd4rg\" (UID: \"42ecefa7-b29d-4178-82c0-5520874c1d1a\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-vd4rg" Feb 02 10:53:02 crc kubenswrapper[4901]: E0202 10:53:02.566000 4901 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 02 10:53:02 crc kubenswrapper[4901]: E0202 10:53:02.566128 4901 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/42ecefa7-b29d-4178-82c0-5520874c1d1a-cert podName:42ecefa7-b29d-4178-82c0-5520874c1d1a nodeName:}" failed. No retries permitted until 2026-02-02 10:53:04.566085333 +0000 UTC m=+871.584425429 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/42ecefa7-b29d-4178-82c0-5520874c1d1a-cert") pod "infra-operator-controller-manager-79955696d6-vd4rg" (UID: "42ecefa7-b29d-4178-82c0-5520874c1d1a") : secret "infra-operator-webhook-server-cert" not found Feb 02 10:53:02 crc kubenswrapper[4901]: I0202 10:53:02.769388 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c4d76b0-aadf-4949-a131-a43c226e38a2-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dhlwss\" (UID: \"4c4d76b0-aadf-4949-a131-a43c226e38a2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dhlwss" Feb 02 10:53:02 crc kubenswrapper[4901]: E0202 10:53:02.769684 4901 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 02 10:53:02 crc kubenswrapper[4901]: E0202 10:53:02.769753 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c4d76b0-aadf-4949-a131-a43c226e38a2-cert podName:4c4d76b0-aadf-4949-a131-a43c226e38a2 nodeName:}" failed. No retries permitted until 2026-02-02 10:53:04.769734534 +0000 UTC m=+871.788074630 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4c4d76b0-aadf-4949-a131-a43c226e38a2-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4dhlwss" (UID: "4c4d76b0-aadf-4949-a131-a43c226e38a2") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 02 10:53:03 crc kubenswrapper[4901]: I0202 10:53:03.124472 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6td2q" event={"ID":"2d39bba8-c1e7-4247-b938-616c9774c9a7","Type":"ContainerStarted","Data":"0aac58d8e4a91776ac52add36c52c9c4e39215c139cf814d8c24d9c0f579b956"} Feb 02 10:53:03 crc kubenswrapper[4901]: E0202 10:53:03.127705 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6td2q" podUID="2d39bba8-c1e7-4247-b938-616c9774c9a7" Feb 02 10:53:03 crc kubenswrapper[4901]: I0202 10:53:03.128607 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-qnjn5" event={"ID":"c7d3103d-1aa3-4337-8e01-f60aed47ca9b","Type":"ContainerStarted","Data":"cbc3d29afc68c08415210a952da8d87b43dfdd0bc22ddc8b7eecc3d8962b09f0"} Feb 02 10:53:03 crc kubenswrapper[4901]: I0202 10:53:03.134329 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-7ktm2" event={"ID":"57f12111-0feb-4c93-8e3a-c0d36dee5184","Type":"ContainerStarted","Data":"7b88b7a8c94b0e4f5ebda3ff220c7b1b1a8118e9a72d494716eb470ba4796eb3"} Feb 02 10:53:03 crc kubenswrapper[4901]: I0202 10:53:03.142025 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ovn-operator-controller-manager-788c46999f-vccwc" event={"ID":"3952ac22-26a4-4b08-a45c-9d8db8597333","Type":"ContainerStarted","Data":"edb11e2484c5e96e383a5fd7d746abc1ef4b32f05cf0345849f3fde2ab6f9862"} Feb 02 10:53:03 crc kubenswrapper[4901]: I0202 10:53:03.144203 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-7ssmx" event={"ID":"79d346c5-abe4-401c-9aaf-b4814a623c99","Type":"ContainerStarted","Data":"2a91409aaf8fd4b2de3907f83b516f0f9b937515123c8696577b0ade24c56bdd"} Feb 02 10:53:03 crc kubenswrapper[4901]: E0202 10:53:03.144300 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:ea7b72b648a5bde2eebd804c2a5c1608d448a4892176c1b8d000c1eef4bb92b4\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-vccwc" podUID="3952ac22-26a4-4b08-a45c-9d8db8597333" Feb 02 10:53:03 crc kubenswrapper[4901]: I0202 10:53:03.163037 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-rmwtz" event={"ID":"4d052f6a-df39-4bd2-aee5-8abd7a1a2882","Type":"ContainerStarted","Data":"110aa6df4da650885aeb55077a49129004c78f48b428512da7c7225e8120ed20"} Feb 02 10:53:03 crc kubenswrapper[4901]: I0202 10:53:03.166037 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-564965969-njrpb" event={"ID":"9f38ac2f-605c-413e-8bdc-ae236d52bd55","Type":"ContainerStarted","Data":"397e99f639f0bb3bd09b67c4b54c7515cb6945963c2d5996f474048789d1e17a"} Feb 02 10:53:03 crc kubenswrapper[4901]: E0202 10:53:03.168552 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:7869203f6f97de780368d507636031090fed3b658d2f7771acbd4481bdfc870b\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-564965969-njrpb" podUID="9f38ac2f-605c-413e-8bdc-ae236d52bd55" Feb 02 10:53:03 crc kubenswrapper[4901]: I0202 10:53:03.168811 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-sm2xf" event={"ID":"3e09ebb7-1669-4027-a2f9-f65176a6a099","Type":"ContainerStarted","Data":"ad5d10030b670c235535e2e45fcbd5d37f5acea693ddee1339cd9f3b56db4678"} Feb 02 10:53:03 crc kubenswrapper[4901]: I0202 10:53:03.173035 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-827zp" event={"ID":"c92912da-94c2-41b5-b43c-a136f96dbd1e","Type":"ContainerStarted","Data":"1390fbb21b1cb8fd6940aa60d718e7e94a422348d49af5efe1257664c4384907"} Feb 02 10:53:03 crc kubenswrapper[4901]: I0202 10:53:03.175050 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/626937bd-8794-43dd-ab0a-77a94440bb05-metrics-certs\") pod \"openstack-operator-controller-manager-6765c97497-ngsw7\" (UID: \"626937bd-8794-43dd-ab0a-77a94440bb05\") " pod="openstack-operators/openstack-operator-controller-manager-6765c97497-ngsw7" Feb 02 10:53:03 crc kubenswrapper[4901]: I0202 10:53:03.175187 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/626937bd-8794-43dd-ab0a-77a94440bb05-webhook-certs\") pod \"openstack-operator-controller-manager-6765c97497-ngsw7\" (UID: \"626937bd-8794-43dd-ab0a-77a94440bb05\") " pod="openstack-operators/openstack-operator-controller-manager-6765c97497-ngsw7" Feb 02 10:53:03 crc kubenswrapper[4901]: E0202 10:53:03.175320 4901 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 02 10:53:03 crc kubenswrapper[4901]: I0202 10:53:03.175406 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-wccg8" event={"ID":"f5f1df90-8dfb-4eae-b0bb-6128aab24030","Type":"ContainerStarted","Data":"1ec2dfeaea7f78b2ff200158ca8fd2f1901b646025959ca4786be8d35ad7878a"} Feb 02 10:53:03 crc kubenswrapper[4901]: E0202 10:53:03.175498 4901 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 02 10:53:03 crc kubenswrapper[4901]: E0202 10:53:03.175579 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/626937bd-8794-43dd-ab0a-77a94440bb05-webhook-certs podName:626937bd-8794-43dd-ab0a-77a94440bb05 nodeName:}" failed. No retries permitted until 2026-02-02 10:53:05.175366367 +0000 UTC m=+872.193706463 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/626937bd-8794-43dd-ab0a-77a94440bb05-webhook-certs") pod "openstack-operator-controller-manager-6765c97497-ngsw7" (UID: "626937bd-8794-43dd-ab0a-77a94440bb05") : secret "webhook-server-cert" not found Feb 02 10:53:03 crc kubenswrapper[4901]: E0202 10:53:03.175609 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/626937bd-8794-43dd-ab0a-77a94440bb05-metrics-certs podName:626937bd-8794-43dd-ab0a-77a94440bb05 nodeName:}" failed. No retries permitted until 2026-02-02 10:53:05.175600082 +0000 UTC m=+872.193940178 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/626937bd-8794-43dd-ab0a-77a94440bb05-metrics-certs") pod "openstack-operator-controller-manager-6765c97497-ngsw7" (UID: "626937bd-8794-43dd-ab0a-77a94440bb05") : secret "metrics-server-cert" not found Feb 02 10:53:03 crc kubenswrapper[4901]: E0202 10:53:03.176980 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:e6f2f361f1dcbb321407a5884951e16ff96e7b88942b10b548f27ad4de14a0be\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-827zp" podUID="c92912da-94c2-41b5-b43c-a136f96dbd1e" Feb 02 10:53:03 crc kubenswrapper[4901]: I0202 10:53:03.178135 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-kgsvf" event={"ID":"50ac57c2-233a-40b9-9377-c8066412240c","Type":"ContainerStarted","Data":"60c836a4ee6b7216927215cdedee9e9a088d9446813d422b78d751c980acc7cd"} Feb 02 10:53:03 crc kubenswrapper[4901]: I0202 10:53:03.179392 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-vf462" event={"ID":"2077e455-81ea-4c9a-b4cf-1304d990ee88","Type":"ContainerStarted","Data":"f87373376c7270c6fbe23d0d8d10e10c04090a970616bac92978c8bf3930d29b"} Feb 02 10:53:03 crc kubenswrapper[4901]: I0202 10:53:03.188768 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-rjvw7" event={"ID":"52cbccb0-76da-4a69-b33f-6efb03721afe","Type":"ContainerStarted","Data":"b692484a66f5181f2e926dda19e9c6e0e218f51d275de87b8895a7924c4d3db3"} Feb 02 10:53:03 crc kubenswrapper[4901]: I0202 10:53:03.216968 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-84dbcd4d6-strlk" event={"ID":"9c2b3b8f-1088-4696-899e-d0d5c3f2bf2d","Type":"ContainerStarted","Data":"fbb8ab49e1be85a4ad2a3bc237f64aeb9751e2e50cf75ff236a5621140080b09"} Feb 02 10:53:03 crc kubenswrapper[4901]: E0202 10:53:03.218970 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.145:5001/openstack-k8s-operators/telemetry-operator:b2c5dab9eea05a087b6cf44f7f86794324fc86bd\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-84dbcd4d6-strlk" podUID="9c2b3b8f-1088-4696-899e-d0d5c3f2bf2d" Feb 02 10:53:03 crc kubenswrapper[4901]: I0202 10:53:03.226885 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-vtjmm" event={"ID":"93760c22-c570-47d0-a0a8-a0e089ee1461","Type":"ContainerStarted","Data":"0a78ffc2dfeaa68502caac78af189932ed28e4344a3660fe0787951fdacf6ea6"} Feb 02 10:53:03 crc kubenswrapper[4901]: E0202 10:53:03.229238 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:3e01e99d3ca1b6c20b1bb015b00cfcbffc584f22a93dc6fe4019d63b813c0241\\\"\"" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-vtjmm" podUID="93760c22-c570-47d0-a0a8-a0e089ee1461" Feb 02 10:53:03 crc kubenswrapper[4901]: I0202 10:53:03.231328 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-dxjpt" event={"ID":"27222341-69e8-4b3c-b6c2-e3d5c644e8c3","Type":"ContainerStarted","Data":"8c253923817477026c4e014468f3215d8460786faf419697a17efbf9104861ee"} Feb 02 10:53:03 crc kubenswrapper[4901]: E0202 10:53:03.232244 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-dxjpt" podUID="27222341-69e8-4b3c-b6c2-e3d5c644e8c3" Feb 02 10:53:03 crc kubenswrapper[4901]: I0202 10:53:03.238542 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-f5dcq" event={"ID":"3be47ad9-6e38-4b16-9e57-2311ef26ed5b","Type":"ContainerStarted","Data":"42c83e1403d36f6007c04ce5404aee63300e9d85004b68421990a2eb3cd29970"} Feb 02 10:53:03 crc kubenswrapper[4901]: I0202 10:53:03.246810 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-xkdvn" event={"ID":"3676f3d9-d77a-4809-bf9e-0e5ba2bea27c","Type":"ContainerStarted","Data":"056785d8e6966b1a1941203b6637c8590e6049683e0225df7e6ecd55e500468b"} Feb 02 10:53:04 crc kubenswrapper[4901]: E0202 10:53:04.259794 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-dxjpt" podUID="27222341-69e8-4b3c-b6c2-e3d5c644e8c3" Feb 02 10:53:04 crc kubenswrapper[4901]: E0202 10:53:04.259886 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:e6f2f361f1dcbb321407a5884951e16ff96e7b88942b10b548f27ad4de14a0be\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-827zp" podUID="c92912da-94c2-41b5-b43c-a136f96dbd1e" Feb 02 10:53:04 crc kubenswrapper[4901]: E0202 10:53:04.260220 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:ea7b72b648a5bde2eebd804c2a5c1608d448a4892176c1b8d000c1eef4bb92b4\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-vccwc" podUID="3952ac22-26a4-4b08-a45c-9d8db8597333" Feb 02 10:53:04 crc kubenswrapper[4901]: E0202 10:53:04.260165 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:7869203f6f97de780368d507636031090fed3b658d2f7771acbd4481bdfc870b\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-564965969-njrpb" podUID="9f38ac2f-605c-413e-8bdc-ae236d52bd55" Feb 02 10:53:04 crc kubenswrapper[4901]: E0202 10:53:04.261683 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"38.102.83.145:5001/openstack-k8s-operators/telemetry-operator:b2c5dab9eea05a087b6cf44f7f86794324fc86bd\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-84dbcd4d6-strlk" podUID="9c2b3b8f-1088-4696-899e-d0d5c3f2bf2d" Feb 02 10:53:04 crc kubenswrapper[4901]: E0202 10:53:04.261684 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6td2q" podUID="2d39bba8-c1e7-4247-b938-616c9774c9a7" Feb 02 10:53:04 crc kubenswrapper[4901]: E0202 10:53:04.261865 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:3e01e99d3ca1b6c20b1bb015b00cfcbffc584f22a93dc6fe4019d63b813c0241\\\"\"" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-vtjmm" podUID="93760c22-c570-47d0-a0a8-a0e089ee1461" Feb 02 10:53:04 crc kubenswrapper[4901]: I0202 10:53:04.608669 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42ecefa7-b29d-4178-82c0-5520874c1d1a-cert\") pod \"infra-operator-controller-manager-79955696d6-vd4rg\" (UID: \"42ecefa7-b29d-4178-82c0-5520874c1d1a\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-vd4rg" Feb 02 10:53:04 crc kubenswrapper[4901]: E0202 10:53:04.608949 4901 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 02 10:53:04 crc kubenswrapper[4901]: E0202 10:53:04.609071 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42ecefa7-b29d-4178-82c0-5520874c1d1a-cert podName:42ecefa7-b29d-4178-82c0-5520874c1d1a nodeName:}" failed. No retries permitted until 2026-02-02 10:53:08.609042224 +0000 UTC m=+875.627382320 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/42ecefa7-b29d-4178-82c0-5520874c1d1a-cert") pod "infra-operator-controller-manager-79955696d6-vd4rg" (UID: "42ecefa7-b29d-4178-82c0-5520874c1d1a") : secret "infra-operator-webhook-server-cert" not found Feb 02 10:53:04 crc kubenswrapper[4901]: I0202 10:53:04.812016 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c4d76b0-aadf-4949-a131-a43c226e38a2-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dhlwss\" (UID: \"4c4d76b0-aadf-4949-a131-a43c226e38a2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dhlwss" Feb 02 10:53:04 crc kubenswrapper[4901]: E0202 10:53:04.812237 4901 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 02 10:53:04 crc kubenswrapper[4901]: E0202 10:53:04.812323 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c4d76b0-aadf-4949-a131-a43c226e38a2-cert podName:4c4d76b0-aadf-4949-a131-a43c226e38a2 nodeName:}" failed. No retries permitted until 2026-02-02 10:53:08.812296746 +0000 UTC m=+875.830636882 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4c4d76b0-aadf-4949-a131-a43c226e38a2-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4dhlwss" (UID: "4c4d76b0-aadf-4949-a131-a43c226e38a2") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 02 10:53:05 crc kubenswrapper[4901]: I0202 10:53:05.217066 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/626937bd-8794-43dd-ab0a-77a94440bb05-webhook-certs\") pod \"openstack-operator-controller-manager-6765c97497-ngsw7\" (UID: \"626937bd-8794-43dd-ab0a-77a94440bb05\") " pod="openstack-operators/openstack-operator-controller-manager-6765c97497-ngsw7" Feb 02 10:53:05 crc kubenswrapper[4901]: I0202 10:53:05.217123 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/626937bd-8794-43dd-ab0a-77a94440bb05-metrics-certs\") pod \"openstack-operator-controller-manager-6765c97497-ngsw7\" (UID: \"626937bd-8794-43dd-ab0a-77a94440bb05\") " pod="openstack-operators/openstack-operator-controller-manager-6765c97497-ngsw7" Feb 02 10:53:05 crc kubenswrapper[4901]: E0202 10:53:05.217229 4901 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 02 10:53:05 crc kubenswrapper[4901]: E0202 10:53:05.217262 4901 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 02 10:53:05 crc kubenswrapper[4901]: E0202 10:53:05.217292 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/626937bd-8794-43dd-ab0a-77a94440bb05-webhook-certs podName:626937bd-8794-43dd-ab0a-77a94440bb05 nodeName:}" failed. No retries permitted until 2026-02-02 10:53:09.217276811 +0000 UTC m=+876.235616907 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/626937bd-8794-43dd-ab0a-77a94440bb05-webhook-certs") pod "openstack-operator-controller-manager-6765c97497-ngsw7" (UID: "626937bd-8794-43dd-ab0a-77a94440bb05") : secret "webhook-server-cert" not found Feb 02 10:53:05 crc kubenswrapper[4901]: E0202 10:53:05.217306 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/626937bd-8794-43dd-ab0a-77a94440bb05-metrics-certs podName:626937bd-8794-43dd-ab0a-77a94440bb05 nodeName:}" failed. No retries permitted until 2026-02-02 10:53:09.217300852 +0000 UTC m=+876.235640958 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/626937bd-8794-43dd-ab0a-77a94440bb05-metrics-certs") pod "openstack-operator-controller-manager-6765c97497-ngsw7" (UID: "626937bd-8794-43dd-ab0a-77a94440bb05") : secret "metrics-server-cert" not found Feb 02 10:53:08 crc kubenswrapper[4901]: I0202 10:53:08.674542 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42ecefa7-b29d-4178-82c0-5520874c1d1a-cert\") pod \"infra-operator-controller-manager-79955696d6-vd4rg\" (UID: \"42ecefa7-b29d-4178-82c0-5520874c1d1a\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-vd4rg" Feb 02 10:53:08 crc kubenswrapper[4901]: E0202 10:53:08.674747 4901 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 02 10:53:08 crc kubenswrapper[4901]: E0202 10:53:08.675136 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42ecefa7-b29d-4178-82c0-5520874c1d1a-cert podName:42ecefa7-b29d-4178-82c0-5520874c1d1a nodeName:}" failed. No retries permitted until 2026-02-02 10:53:16.675117028 +0000 UTC m=+883.693457124 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/42ecefa7-b29d-4178-82c0-5520874c1d1a-cert") pod "infra-operator-controller-manager-79955696d6-vd4rg" (UID: "42ecefa7-b29d-4178-82c0-5520874c1d1a") : secret "infra-operator-webhook-server-cert" not found Feb 02 10:53:08 crc kubenswrapper[4901]: I0202 10:53:08.878714 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c4d76b0-aadf-4949-a131-a43c226e38a2-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dhlwss\" (UID: \"4c4d76b0-aadf-4949-a131-a43c226e38a2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dhlwss" Feb 02 10:53:08 crc kubenswrapper[4901]: E0202 10:53:08.879242 4901 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 02 10:53:08 crc kubenswrapper[4901]: E0202 10:53:08.879302 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c4d76b0-aadf-4949-a131-a43c226e38a2-cert podName:4c4d76b0-aadf-4949-a131-a43c226e38a2 nodeName:}" failed. No retries permitted until 2026-02-02 10:53:16.879284303 +0000 UTC m=+883.897624399 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4c4d76b0-aadf-4949-a131-a43c226e38a2-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4dhlwss" (UID: "4c4d76b0-aadf-4949-a131-a43c226e38a2") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 02 10:53:09 crc kubenswrapper[4901]: I0202 10:53:09.282869 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/626937bd-8794-43dd-ab0a-77a94440bb05-webhook-certs\") pod \"openstack-operator-controller-manager-6765c97497-ngsw7\" (UID: \"626937bd-8794-43dd-ab0a-77a94440bb05\") " pod="openstack-operators/openstack-operator-controller-manager-6765c97497-ngsw7" Feb 02 10:53:09 crc kubenswrapper[4901]: I0202 10:53:09.282943 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/626937bd-8794-43dd-ab0a-77a94440bb05-metrics-certs\") pod \"openstack-operator-controller-manager-6765c97497-ngsw7\" (UID: \"626937bd-8794-43dd-ab0a-77a94440bb05\") " pod="openstack-operators/openstack-operator-controller-manager-6765c97497-ngsw7" Feb 02 10:53:09 crc kubenswrapper[4901]: E0202 10:53:09.283001 4901 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 02 10:53:09 crc kubenswrapper[4901]: E0202 10:53:09.283065 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/626937bd-8794-43dd-ab0a-77a94440bb05-webhook-certs podName:626937bd-8794-43dd-ab0a-77a94440bb05 nodeName:}" failed. No retries permitted until 2026-02-02 10:53:17.283048777 +0000 UTC m=+884.301388873 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/626937bd-8794-43dd-ab0a-77a94440bb05-webhook-certs") pod "openstack-operator-controller-manager-6765c97497-ngsw7" (UID: "626937bd-8794-43dd-ab0a-77a94440bb05") : secret "webhook-server-cert" not found Feb 02 10:53:09 crc kubenswrapper[4901]: E0202 10:53:09.283117 4901 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 02 10:53:09 crc kubenswrapper[4901]: E0202 10:53:09.283149 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/626937bd-8794-43dd-ab0a-77a94440bb05-metrics-certs podName:626937bd-8794-43dd-ab0a-77a94440bb05 nodeName:}" failed. No retries permitted until 2026-02-02 10:53:17.283139969 +0000 UTC m=+884.301480065 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/626937bd-8794-43dd-ab0a-77a94440bb05-metrics-certs") pod "openstack-operator-controller-manager-6765c97497-ngsw7" (UID: "626937bd-8794-43dd-ab0a-77a94440bb05") : secret "metrics-server-cert" not found Feb 02 10:53:14 crc kubenswrapper[4901]: I0202 10:53:14.334617 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-b6wnv" event={"ID":"256717cd-84fb-490a-9945-bed0d1f5ec7f","Type":"ContainerStarted","Data":"5621944e8d784eaa553145e1dd1d2805789a5667559e155780f922a4d5219ac3"} Feb 02 10:53:14 crc kubenswrapper[4901]: I0202 10:53:14.336472 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-b6wnv" Feb 02 10:53:14 crc kubenswrapper[4901]: I0202 10:53:14.338865 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-qnjn5" event={"ID":"c7d3103d-1aa3-4337-8e01-f60aed47ca9b","Type":"ContainerStarted","Data":"c54a473cdfe45863d508c78039aedc534f2b0da2609f6fe0886dd43ef4cfd770"} Feb 02 10:53:14 crc kubenswrapper[4901]: I0202 10:53:14.339281 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-qnjn5" Feb 02 10:53:14 crc kubenswrapper[4901]: I0202 10:53:14.345211 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-7ktm2" event={"ID":"57f12111-0feb-4c93-8e3a-c0d36dee5184","Type":"ContainerStarted","Data":"1b29e3ffd7aaaa8bcf2aba8acf8df36b2ce488fe6e06c6b0ce0a2be41b1d2e1a"} Feb 02 10:53:14 crc kubenswrapper[4901]: I0202 10:53:14.345339 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-7ktm2" Feb 02 10:53:14 crc kubenswrapper[4901]: I0202 10:53:14.348300 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-7ssmx" event={"ID":"79d346c5-abe4-401c-9aaf-b4814a623c99","Type":"ContainerStarted","Data":"fbfb691134f40fe86a5a4f8ae8ebf5a9dae011e80d596f3a8f59de2677c0f20d"} Feb 02 10:53:14 crc kubenswrapper[4901]: I0202 10:53:14.348439 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-7ssmx" Feb 02 10:53:14 crc kubenswrapper[4901]: I0202 10:53:14.355233 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-vf462" event={"ID":"2077e455-81ea-4c9a-b4cf-1304d990ee88","Type":"ContainerStarted","Data":"aa29cd711759f103bcd07614e97e3c73c22df2c7ac6ad1ab82629fab1a609222"} Feb 02 10:53:14 crc kubenswrapper[4901]: I0202 10:53:14.355888 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-vf462" Feb 02 10:53:14 crc kubenswrapper[4901]: I0202 10:53:14.357431 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-sm2xf" event={"ID":"3e09ebb7-1669-4027-a2f9-f65176a6a099","Type":"ContainerStarted","Data":"337bc5d7043b7a07981b11bdb325b285c480058be32a3fd34128bc7400ace6a6"} Feb 02 10:53:14 crc kubenswrapper[4901]: I0202 10:53:14.357633 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-sm2xf" Feb 02 10:53:14 crc kubenswrapper[4901]: I0202 10:53:14.364662 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-f5dcq" event={"ID":"3be47ad9-6e38-4b16-9e57-2311ef26ed5b","Type":"ContainerStarted","Data":"521535f713ae0c49e28f4d2124c632b20cb9a404908282bef84c93b157d70eaf"} Feb 02 10:53:14 crc kubenswrapper[4901]: I0202 10:53:14.365332 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-f5dcq" Feb 02 10:53:14 crc kubenswrapper[4901]: I0202 10:53:14.377320 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-lh4js" event={"ID":"a3d4d0f2-f8ac-4b9d-b78c-7a6c63750fdf","Type":"ContainerStarted","Data":"3b13fb23933d5a933f0b93ff2569a3bcd893eb9e6fb3c840bfe5f3bf7ec0c3c2"} Feb 02 10:53:14 crc kubenswrapper[4901]: I0202 10:53:14.378151 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-lh4js" Feb 02 10:53:14 crc kubenswrapper[4901]: I0202 10:53:14.388111 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-wccg8" Feb 02 10:53:14 crc kubenswrapper[4901]: I0202 10:53:14.393405 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-b6wnv" podStartSLOduration=2.312063953 podStartE2EDuration="14.393383824s" podCreationTimestamp="2026-02-02 10:53:00 +0000 UTC" firstStartedPulling="2026-02-02 10:53:01.750173092 +0000 UTC m=+868.768513188" lastFinishedPulling="2026-02-02 10:53:13.831492963 +0000 UTC m=+880.849833059" observedRunningTime="2026-02-02 10:53:14.354755682 +0000 UTC m=+881.373095778" watchObservedRunningTime="2026-02-02 10:53:14.393383824 +0000 UTC m=+881.411723920" Feb 02 10:53:14 crc kubenswrapper[4901]: I0202 10:53:14.394075 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-rjvw7" event={"ID":"52cbccb0-76da-4a69-b33f-6efb03721afe","Type":"ContainerStarted","Data":"fca6b319d69208103f94b4ee277b17e79839355c5ce86c8df9df5aeaade68da5"} Feb 02 10:53:14 crc kubenswrapper[4901]: I0202 10:53:14.394942 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-rjvw7" Feb 02 10:53:14 crc kubenswrapper[4901]: I0202 10:53:14.400159 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-qnjn5" podStartSLOduration=2.849760157 podStartE2EDuration="14.400140684s" podCreationTimestamp="2026-02-02 10:53:00 +0000 UTC" firstStartedPulling="2026-02-02 10:53:02.282823419 +0000 UTC m=+869.301163535" lastFinishedPulling="2026-02-02 10:53:13.833203966 +0000 UTC m=+880.851544062" observedRunningTime="2026-02-02 10:53:14.383960167 +0000 UTC m=+881.402300263" watchObservedRunningTime="2026-02-02 10:53:14.400140684 +0000 UTC m=+881.418480790" Feb 02 10:53:14 crc kubenswrapper[4901]: I0202 10:53:14.412371 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-7ssmx" podStartSLOduration=2.721393701 
podStartE2EDuration="14.412354292s" podCreationTimestamp="2026-02-02 10:53:00 +0000 UTC" firstStartedPulling="2026-02-02 10:53:02.137363281 +0000 UTC m=+869.155703377" lastFinishedPulling="2026-02-02 10:53:13.828323872 +0000 UTC m=+880.846663968" observedRunningTime="2026-02-02 10:53:14.407188562 +0000 UTC m=+881.425528658" watchObservedRunningTime="2026-02-02 10:53:14.412354292 +0000 UTC m=+881.430694388" Feb 02 10:53:14 crc kubenswrapper[4901]: I0202 10:53:14.414668 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-rmwtz" event={"ID":"4d052f6a-df39-4bd2-aee5-8abd7a1a2882","Type":"ContainerStarted","Data":"e086c612f2a15e57944452b09008b25ec980e3331f7a0dc7e2e14e5c154b5864"} Feb 02 10:53:14 crc kubenswrapper[4901]: I0202 10:53:14.414862 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-rmwtz" Feb 02 10:53:14 crc kubenswrapper[4901]: I0202 10:53:14.420348 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-kgsvf" event={"ID":"50ac57c2-233a-40b9-9377-c8066412240c","Type":"ContainerStarted","Data":"065fe126cf41fe9db084373fb1701b8d223607c83b3966b02b0e0c0ff1f800d9"} Feb 02 10:53:14 crc kubenswrapper[4901]: I0202 10:53:14.421167 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-kgsvf" Feb 02 10:53:14 crc kubenswrapper[4901]: I0202 10:53:14.426029 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-xkdvn" event={"ID":"3676f3d9-d77a-4809-bf9e-0e5ba2bea27c","Type":"ContainerStarted","Data":"14eed61f56f0f70687e7446160929e87a028c8ed1d7e7c1fb1b4898515f518de"} Feb 02 10:53:14 crc kubenswrapper[4901]: I0202 10:53:14.426945 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-xkdvn" Feb 02 10:53:14 crc kubenswrapper[4901]: I0202 10:53:14.439021 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-7ktm2" podStartSLOduration=2.936642972 podStartE2EDuration="14.439010062s" podCreationTimestamp="2026-02-02 10:53:00 +0000 UTC" firstStartedPulling="2026-02-02 10:53:02.389146842 +0000 UTC m=+869.407486938" lastFinishedPulling="2026-02-02 10:53:13.891513932 +0000 UTC m=+880.909854028" observedRunningTime="2026-02-02 10:53:14.437707179 +0000 UTC m=+881.456047285" watchObservedRunningTime="2026-02-02 10:53:14.439010062 +0000 UTC m=+881.457350158" Feb 02 10:53:14 crc kubenswrapper[4901]: I0202 10:53:14.479884 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-rjvw7" podStartSLOduration=2.78549878 podStartE2EDuration="14.479865319s" podCreationTimestamp="2026-02-02 10:53:00 +0000 UTC" firstStartedPulling="2026-02-02 10:53:02.137098183 +0000 UTC m=+869.155438279" lastFinishedPulling="2026-02-02 10:53:13.831464722 +0000 UTC m=+880.849804818" observedRunningTime="2026-02-02 10:53:14.47432711 +0000 UTC m=+881.492667216" watchObservedRunningTime="2026-02-02 10:53:14.479865319 +0000 UTC m=+881.498205415" Feb 02 10:53:14 crc kubenswrapper[4901]: I0202 10:53:14.481771 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-wccg8" podStartSLOduration=2.922414564 podStartE2EDuration="14.481763657s" podCreationTimestamp="2026-02-02 10:53:00 +0000 UTC" firstStartedPulling="2026-02-02 10:53:02.27215015 +0000 UTC m=+869.290490246" lastFinishedPulling="2026-02-02 10:53:13.831499243 +0000 UTC m=+880.849839339" observedRunningTime="2026-02-02 10:53:14.462872032 +0000 UTC m=+881.481212128" watchObservedRunningTime="2026-02-02 10:53:14.481763657 +0000 UTC m=+881.500103753" Feb 02 10:53:14 crc kubenswrapper[4901]: I0202 10:53:14.490983 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-vf462" podStartSLOduration=3.052781664 podStartE2EDuration="14.490963629s" podCreationTimestamp="2026-02-02 10:53:00 +0000 UTC" firstStartedPulling="2026-02-02 10:53:02.393244986 +0000 UTC m=+869.411585082" lastFinishedPulling="2026-02-02 10:53:13.831426951 +0000 UTC m=+880.849767047" observedRunningTime="2026-02-02 10:53:14.489294607 +0000 UTC m=+881.507634703" watchObservedRunningTime="2026-02-02 10:53:14.490963629 +0000 UTC m=+881.509303725" Feb 02 10:53:14 crc kubenswrapper[4901]: I0202 10:53:14.511109 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-rmwtz" podStartSLOduration=2.954538572 podStartE2EDuration="14.511079144s" podCreationTimestamp="2026-02-02 10:53:00 +0000 UTC" firstStartedPulling="2026-02-02 10:53:02.27573666 +0000 UTC m=+869.294076756" lastFinishedPulling="2026-02-02 10:53:13.832277222 +0000 UTC m=+880.850617328" observedRunningTime="2026-02-02 10:53:14.510572712 +0000 UTC m=+881.528912808" watchObservedRunningTime="2026-02-02 10:53:14.511079144 +0000 UTC m=+881.529419240" Feb 02 10:53:14 crc kubenswrapper[4901]: I0202 10:53:14.537905 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-f5dcq" podStartSLOduration=2.9449273 podStartE2EDuration="14.537887728s" podCreationTimestamp="2026-02-02 10:53:00 +0000 UTC" firstStartedPulling="2026-02-02 10:53:02.38146715 +0000 UTC m=+869.399807246" lastFinishedPulling="2026-02-02 10:53:13.974427578 +0000 UTC m=+880.992767674" observedRunningTime="2026-02-02 10:53:14.536426952 +0000 UTC m=+881.554767048" watchObservedRunningTime="2026-02-02 10:53:14.537887728 +0000 UTC m=+881.556227824" Feb 02 10:53:14 crc kubenswrapper[4901]: I0202 10:53:14.552800 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-sm2xf" podStartSLOduration=2.894397621 podStartE2EDuration="14.552768203s" podCreationTimestamp="2026-02-02 10:53:00 +0000 UTC" firstStartedPulling="2026-02-02 10:53:02.162889253 +0000 UTC m=+869.181229349" lastFinishedPulling="2026-02-02 10:53:13.821259835 +0000 UTC m=+880.839599931" observedRunningTime="2026-02-02 10:53:14.552055725 +0000 UTC m=+881.570395821" watchObservedRunningTime="2026-02-02 10:53:14.552768203 +0000 UTC m=+881.571108289" Feb 02 10:53:14 crc kubenswrapper[4901]: I0202 10:53:14.608703 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-lh4js" podStartSLOduration=3.07134585 podStartE2EDuration="14.608683439s" podCreationTimestamp="2026-02-02 10:53:00 +0000 UTC" firstStartedPulling="2026-02-02 10:53:01.752206044 +0000 UTC 
m=+868.770546140" lastFinishedPulling="2026-02-02 10:53:13.289543633 +0000 UTC m=+880.307883729" observedRunningTime="2026-02-02 10:53:14.606952156 +0000 UTC m=+881.625292252" watchObservedRunningTime="2026-02-02 10:53:14.608683439 +0000 UTC m=+881.627023555" Feb 02 10:53:14 crc kubenswrapper[4901]: I0202 10:53:14.658134 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-kgsvf" podStartSLOduration=3.004231342 podStartE2EDuration="14.658117712s" podCreationTimestamp="2026-02-02 10:53:00 +0000 UTC" firstStartedPulling="2026-02-02 10:53:02.162970904 +0000 UTC m=+869.181311000" lastFinishedPulling="2026-02-02 10:53:13.816857274 +0000 UTC m=+880.835197370" observedRunningTime="2026-02-02 10:53:14.653106057 +0000 UTC m=+881.671446153" watchObservedRunningTime="2026-02-02 10:53:14.658117712 +0000 UTC m=+881.676457808" Feb 02 10:53:14 crc kubenswrapper[4901]: I0202 10:53:14.679901 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-xkdvn" podStartSLOduration=3.2366872190000002 podStartE2EDuration="14.67988351s" podCreationTimestamp="2026-02-02 10:53:00 +0000 UTC" firstStartedPulling="2026-02-02 10:53:02.388824345 +0000 UTC m=+869.407164441" lastFinishedPulling="2026-02-02 10:53:13.832020636 +0000 UTC m=+880.850360732" observedRunningTime="2026-02-02 10:53:14.676699639 +0000 UTC m=+881.695039725" watchObservedRunningTime="2026-02-02 10:53:14.67988351 +0000 UTC m=+881.698223596" Feb 02 10:53:15 crc kubenswrapper[4901]: I0202 10:53:15.433705 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-wccg8" event={"ID":"f5f1df90-8dfb-4eae-b0bb-6128aab24030","Type":"ContainerStarted","Data":"38b92105111b7fe3bb17c9f13e7de939d2c9c701670b43eeff434c977128cbb5"} Feb 02 10:53:16 crc kubenswrapper[4901]: I0202 10:53:16.710333 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42ecefa7-b29d-4178-82c0-5520874c1d1a-cert\") pod \"infra-operator-controller-manager-79955696d6-vd4rg\" (UID: \"42ecefa7-b29d-4178-82c0-5520874c1d1a\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-vd4rg" Feb 02 10:53:16 crc kubenswrapper[4901]: E0202 10:53:16.710683 4901 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 02 10:53:16 crc kubenswrapper[4901]: E0202 10:53:16.710769 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42ecefa7-b29d-4178-82c0-5520874c1d1a-cert podName:42ecefa7-b29d-4178-82c0-5520874c1d1a nodeName:}" failed. No retries permitted until 2026-02-02 10:53:32.710740857 +0000 UTC m=+899.729080993 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/42ecefa7-b29d-4178-82c0-5520874c1d1a-cert") pod "infra-operator-controller-manager-79955696d6-vd4rg" (UID: "42ecefa7-b29d-4178-82c0-5520874c1d1a") : secret "infra-operator-webhook-server-cert" not found Feb 02 10:53:16 crc kubenswrapper[4901]: I0202 10:53:16.913223 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c4d76b0-aadf-4949-a131-a43c226e38a2-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dhlwss\" (UID: \"4c4d76b0-aadf-4949-a131-a43c226e38a2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dhlwss" Feb 02 10:53:16 crc kubenswrapper[4901]: I0202 10:53:16.921211 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c4d76b0-aadf-4949-a131-a43c226e38a2-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dhlwss\" (UID: \"4c4d76b0-aadf-4949-a131-a43c226e38a2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dhlwss" Feb 02 10:53:16 crc kubenswrapper[4901]: I0202 10:53:16.982612 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dhlwss" Feb 02 10:53:17 crc kubenswrapper[4901]: I0202 10:53:17.319634 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/626937bd-8794-43dd-ab0a-77a94440bb05-webhook-certs\") pod \"openstack-operator-controller-manager-6765c97497-ngsw7\" (UID: \"626937bd-8794-43dd-ab0a-77a94440bb05\") " pod="openstack-operators/openstack-operator-controller-manager-6765c97497-ngsw7" Feb 02 10:53:17 crc kubenswrapper[4901]: I0202 10:53:17.320036 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/626937bd-8794-43dd-ab0a-77a94440bb05-metrics-certs\") pod \"openstack-operator-controller-manager-6765c97497-ngsw7\" (UID: \"626937bd-8794-43dd-ab0a-77a94440bb05\") " pod="openstack-operators/openstack-operator-controller-manager-6765c97497-ngsw7" Feb 02 10:53:17 crc kubenswrapper[4901]: I0202 10:53:17.325539 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/626937bd-8794-43dd-ab0a-77a94440bb05-metrics-certs\") pod \"openstack-operator-controller-manager-6765c97497-ngsw7\" (UID: \"626937bd-8794-43dd-ab0a-77a94440bb05\") " pod="openstack-operators/openstack-operator-controller-manager-6765c97497-ngsw7" Feb 02 10:53:17 crc kubenswrapper[4901]: I0202 10:53:17.325639 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/626937bd-8794-43dd-ab0a-77a94440bb05-webhook-certs\") pod \"openstack-operator-controller-manager-6765c97497-ngsw7\" (UID: \"626937bd-8794-43dd-ab0a-77a94440bb05\") " pod="openstack-operators/openstack-operator-controller-manager-6765c97497-ngsw7" Feb 02 10:53:17 crc kubenswrapper[4901]: I0202 10:53:17.351680 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6765c97497-ngsw7" Feb 02 10:53:17 crc kubenswrapper[4901]: I0202 10:53:17.407842 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dhlwss"] Feb 02 10:53:17 crc kubenswrapper[4901]: W0202 10:53:17.408229 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c4d76b0_aadf_4949_a131_a43c226e38a2.slice/crio-8c3ebcfba11992771f87a340ee7ea81e30d77891dacbec0899520404ce39b24f WatchSource:0}: Error finding container 8c3ebcfba11992771f87a340ee7ea81e30d77891dacbec0899520404ce39b24f: Status 404 returned error can't find the container with id 8c3ebcfba11992771f87a340ee7ea81e30d77891dacbec0899520404ce39b24f Feb 02 10:53:17 crc kubenswrapper[4901]: I0202 10:53:17.451877 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dhlwss" event={"ID":"4c4d76b0-aadf-4949-a131-a43c226e38a2","Type":"ContainerStarted","Data":"8c3ebcfba11992771f87a340ee7ea81e30d77891dacbec0899520404ce39b24f"} Feb 02 10:53:17 crc kubenswrapper[4901]: I0202 10:53:17.613744 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6765c97497-ngsw7"] Feb 02 10:53:18 crc kubenswrapper[4901]: I0202 10:53:18.459037 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6765c97497-ngsw7" event={"ID":"626937bd-8794-43dd-ab0a-77a94440bb05","Type":"ContainerStarted","Data":"fc61e2a8744c49a213562c43baf92fd559ed982589dc9a7a77c5f88ae946bbe5"} Feb 02 10:53:20 crc kubenswrapper[4901]: I0202 10:53:20.913194 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-lh4js" Feb 02 10:53:20 crc kubenswrapper[4901]: I0202 10:53:20.933331 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-kgsvf" Feb 02 10:53:20 crc kubenswrapper[4901]: I0202 10:53:20.960447 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-b6wnv" Feb 02 10:53:20 crc kubenswrapper[4901]: I0202 10:53:20.997090 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-rjvw7" Feb 02 10:53:21 crc kubenswrapper[4901]: I0202 10:53:21.017587 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-7ssmx" Feb 02 10:53:21 crc kubenswrapper[4901]: I0202 10:53:21.044088 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-sm2xf" Feb 02 10:53:21 crc kubenswrapper[4901]: I0202 10:53:21.061624 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-wccg8" Feb 02 10:53:21 crc kubenswrapper[4901]: I0202 10:53:21.128717 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-f5dcq" Feb 02 10:53:21 crc kubenswrapper[4901]: I0202 10:53:21.137459 
4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-qnjn5" Feb 02 10:53:21 crc kubenswrapper[4901]: I0202 10:53:21.274629 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-rmwtz" Feb 02 10:53:21 crc kubenswrapper[4901]: I0202 10:53:21.337843 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-vf462" Feb 02 10:53:21 crc kubenswrapper[4901]: I0202 10:53:21.354831 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-7ktm2" Feb 02 10:53:21 crc kubenswrapper[4901]: I0202 10:53:21.459837 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-xkdvn" Feb 02 10:53:21 crc kubenswrapper[4901]: I0202 10:53:21.491575 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6765c97497-ngsw7" event={"ID":"626937bd-8794-43dd-ab0a-77a94440bb05","Type":"ContainerStarted","Data":"931c776ca9b507ce9d5099d056c865ad564a51542f372148ab57f9b636f82969"} Feb 02 10:53:21 crc kubenswrapper[4901]: I0202 10:53:21.492452 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6765c97497-ngsw7" Feb 02 10:53:21 crc kubenswrapper[4901]: I0202 10:53:21.522580 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6765c97497-ngsw7" podStartSLOduration=20.522548276 podStartE2EDuration="20.522548276s" podCreationTimestamp="2026-02-02 10:53:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:53:21.520226657 +0000 UTC m=+888.538566753" watchObservedRunningTime="2026-02-02 10:53:21.522548276 +0000 UTC m=+888.540888362" Feb 02 10:53:27 crc kubenswrapper[4901]: I0202 10:53:27.357015 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6765c97497-ngsw7" Feb 02 10:53:28 crc kubenswrapper[4901]: I0202 10:53:28.545028 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-564965969-njrpb" event={"ID":"9f38ac2f-605c-413e-8bdc-ae236d52bd55","Type":"ContainerStarted","Data":"0811733214f0efa1ce7b325d2f278c40498cec44983aeac0e29045373db71155"} Feb 02 10:53:28 crc kubenswrapper[4901]: I0202 10:53:28.545541 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-564965969-njrpb" Feb 02 10:53:28 crc kubenswrapper[4901]: I0202 10:53:28.548625 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6td2q" event={"ID":"2d39bba8-c1e7-4247-b938-616c9774c9a7","Type":"ContainerStarted","Data":"4287e894e5dcb6b01e46671667e481372a2d13adc06483c437240b1cf236ee35"} Feb 02 10:53:28 crc kubenswrapper[4901]: I0202 10:53:28.551015 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-vtjmm" 
event={"ID":"93760c22-c570-47d0-a0a8-a0e089ee1461","Type":"ContainerStarted","Data":"68e4608136f7d4242e5217909c0cb8a277adba9a55a021a7adf16c705b6917fb"} Feb 02 10:53:28 crc kubenswrapper[4901]: I0202 10:53:28.551212 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-vtjmm" Feb 02 10:53:28 crc kubenswrapper[4901]: I0202 10:53:28.553004 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-vccwc" event={"ID":"3952ac22-26a4-4b08-a45c-9d8db8597333","Type":"ContainerStarted","Data":"ca463358d6deb07d941e82fd1b7e04ec037d93a966ac98c2dfc2083840d1f94a"} Feb 02 10:53:28 crc kubenswrapper[4901]: I0202 10:53:28.553222 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-vccwc" Feb 02 10:53:28 crc kubenswrapper[4901]: I0202 10:53:28.554270 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-dxjpt" event={"ID":"27222341-69e8-4b3c-b6c2-e3d5c644e8c3","Type":"ContainerStarted","Data":"3e62eceb9bec6afc6fd1c0c5c37cc3459f295832bbfd49213409288852a8024b"} Feb 02 10:53:28 crc kubenswrapper[4901]: I0202 10:53:28.554590 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-dxjpt" Feb 02 10:53:28 crc kubenswrapper[4901]: I0202 10:53:28.555520 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-827zp" event={"ID":"c92912da-94c2-41b5-b43c-a136f96dbd1e","Type":"ContainerStarted","Data":"7e3ad8bd4e9e44a9e446ada39a5f3c468c4734f60db1b796e49505a80e4ec6f0"} Feb 02 10:53:28 crc kubenswrapper[4901]: I0202 10:53:28.555663 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-827zp" Feb 02 10:53:28 crc kubenswrapper[4901]: I0202 10:53:28.557579 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-84dbcd4d6-strlk" event={"ID":"9c2b3b8f-1088-4696-899e-d0d5c3f2bf2d","Type":"ContainerStarted","Data":"020d4cc8f98ca8bf3309c9e0b211c52c843876d6bf40ba784ee7cf1005b48a8c"} Feb 02 10:53:28 crc kubenswrapper[4901]: I0202 10:53:28.557793 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-84dbcd4d6-strlk" Feb 02 10:53:28 crc kubenswrapper[4901]: I0202 10:53:28.559225 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dhlwss" event={"ID":"4c4d76b0-aadf-4949-a131-a43c226e38a2","Type":"ContainerStarted","Data":"68d3fc8fe443c076f64aabd3176404b61d9af95ed66ec4212235643b716dae01"} Feb 02 10:53:28 crc kubenswrapper[4901]: I0202 10:53:28.559533 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dhlwss" Feb 02 10:53:28 crc kubenswrapper[4901]: I0202 10:53:28.562499 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-564965969-njrpb" podStartSLOduration=2.544347707 podStartE2EDuration="27.562481483s" podCreationTimestamp="2026-02-02 10:53:01 +0000 UTC" firstStartedPulling="2026-02-02 
10:53:02.520477236 +0000 UTC m=+869.538817332" lastFinishedPulling="2026-02-02 10:53:27.538611012 +0000 UTC m=+894.556951108" observedRunningTime="2026-02-02 10:53:28.559743774 +0000 UTC m=+895.578083870" watchObservedRunningTime="2026-02-02 10:53:28.562481483 +0000 UTC m=+895.580821579" Feb 02 10:53:28 crc kubenswrapper[4901]: I0202 10:53:28.576750 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-dxjpt" podStartSLOduration=3.455228405 podStartE2EDuration="28.576729221s" podCreationTimestamp="2026-02-02 10:53:00 +0000 UTC" firstStartedPulling="2026-02-02 10:53:02.404362485 +0000 UTC m=+869.422702571" lastFinishedPulling="2026-02-02 10:53:27.525863291 +0000 UTC m=+894.544203387" observedRunningTime="2026-02-02 10:53:28.575624073 +0000 UTC m=+895.593964169" watchObservedRunningTime="2026-02-02 10:53:28.576729221 +0000 UTC m=+895.595069317" Feb 02 10:53:28 crc kubenswrapper[4901]: I0202 10:53:28.638653 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dhlwss" podStartSLOduration=18.524905754 podStartE2EDuration="28.638635358s" podCreationTimestamp="2026-02-02 10:53:00 +0000 UTC" firstStartedPulling="2026-02-02 10:53:17.412144717 +0000 UTC m=+884.430484813" lastFinishedPulling="2026-02-02 10:53:27.525874311 +0000 UTC m=+894.544214417" observedRunningTime="2026-02-02 10:53:28.636243347 +0000 UTC m=+895.654583443" watchObservedRunningTime="2026-02-02 10:53:28.638635358 +0000 UTC m=+895.656975454" Feb 02 10:53:28 crc kubenswrapper[4901]: I0202 10:53:28.640311 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-vtjmm" podStartSLOduration=2.594608071 podStartE2EDuration="27.64030452s" podCreationTimestamp="2026-02-02 10:53:01 +0000 UTC" firstStartedPulling="2026-02-02 10:53:02.525764319 +0000 UTC m=+869.544104415" lastFinishedPulling="2026-02-02 10:53:27.571460768 +0000 UTC m=+894.589800864" observedRunningTime="2026-02-02 10:53:28.604395767 +0000 UTC m=+895.622735863" watchObservedRunningTime="2026-02-02 10:53:28.64030452 +0000 UTC m=+895.658644616" Feb 02 10:53:28 crc kubenswrapper[4901]: I0202 10:53:28.652310 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-vccwc" podStartSLOduration=3.513772088 podStartE2EDuration="28.652295772s" podCreationTimestamp="2026-02-02 10:53:00 +0000 UTC" firstStartedPulling="2026-02-02 10:53:02.400784566 +0000 UTC m=+869.419124662" lastFinishedPulling="2026-02-02 10:53:27.53930825 +0000 UTC m=+894.557648346" observedRunningTime="2026-02-02 10:53:28.651103322 +0000 UTC m=+895.669443438" watchObservedRunningTime="2026-02-02 10:53:28.652295772 +0000 UTC m=+895.670635868" Feb 02 10:53:28 crc kubenswrapper[4901]: I0202 10:53:28.664518 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-84dbcd4d6-strlk" podStartSLOduration=3.713129263 podStartE2EDuration="28.664498409s" podCreationTimestamp="2026-02-02 10:53:00 +0000 UTC" firstStartedPulling="2026-02-02 10:53:02.542044039 +0000 UTC m=+869.560384135" lastFinishedPulling="2026-02-02 10:53:27.493413185 +0000 UTC m=+894.511753281" observedRunningTime="2026-02-02 10:53:28.663510194 +0000 UTC m=+895.681850310" watchObservedRunningTime="2026-02-02 10:53:28.664498409 
+0000 UTC m=+895.682838505" Feb 02 10:53:28 crc kubenswrapper[4901]: I0202 10:53:28.687765 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-827zp" podStartSLOduration=7.702183957 podStartE2EDuration="28.687738133s" podCreationTimestamp="2026-02-02 10:53:00 +0000 UTC" firstStartedPulling="2026-02-02 10:53:02.54451837 +0000 UTC m=+869.562858466" lastFinishedPulling="2026-02-02 10:53:23.530072556 +0000 UTC m=+890.548412642" observedRunningTime="2026-02-02 10:53:28.675409513 +0000 UTC m=+895.693749609" watchObservedRunningTime="2026-02-02 10:53:28.687738133 +0000 UTC m=+895.706078229" Feb 02 10:53:28 crc kubenswrapper[4901]: I0202 10:53:28.698236 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6td2q" podStartSLOduration=2.6590674930000002 podStartE2EDuration="27.698210867s" podCreationTimestamp="2026-02-02 10:53:01 +0000 UTC" firstStartedPulling="2026-02-02 10:53:02.531619706 +0000 UTC m=+869.549959802" lastFinishedPulling="2026-02-02 10:53:27.57076308 +0000 UTC m=+894.589103176" observedRunningTime="2026-02-02 10:53:28.695652412 +0000 UTC m=+895.713992528" watchObservedRunningTime="2026-02-02 10:53:28.698210867 +0000 UTC m=+895.716550973" Feb 02 10:53:32 crc kubenswrapper[4901]: I0202 10:53:32.799837 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42ecefa7-b29d-4178-82c0-5520874c1d1a-cert\") pod \"infra-operator-controller-manager-79955696d6-vd4rg\" (UID: \"42ecefa7-b29d-4178-82c0-5520874c1d1a\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-vd4rg" Feb 02 10:53:32 crc kubenswrapper[4901]: I0202 10:53:32.813473 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42ecefa7-b29d-4178-82c0-5520874c1d1a-cert\") pod \"infra-operator-controller-manager-79955696d6-vd4rg\" (UID: \"42ecefa7-b29d-4178-82c0-5520874c1d1a\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-vd4rg" Feb 02 10:53:32 crc kubenswrapper[4901]: I0202 10:53:32.878050 4901 util.go:30] "No sandbox for pod can be found. 
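Note on the "Observed pod startup duration" entries: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration appears to be that figure minus the image-pull window, with the pull window taken from the monotonic m=+ readings (seconds since kubelet start) rather than the wall-clock stamps. A back-of-the-envelope check against the octavia-operator entry above (not kubelet code; the layout string is Go's default time.Time format):

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Timestamps copied from the octavia-operator entry above.
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	created, _ := time.Parse(layout, "2026-02-02 10:53:00 +0000 UTC")
	running, _ := time.Parse(layout, "2026-02-02 10:53:28.687738133 +0000 UTC")

	// Monotonic m=+ offsets from the firstStartedPulling / lastFinishedPulling fields.
	firstStartedPulling := 869.562858466
	lastFinishedPulling := 890.548412642

	e2e := running.Sub(created)
	pull := time.Duration((lastFinishedPulling - firstStartedPulling) * float64(time.Second))
	fmt.Println(e2e)        // 28.687738133s, the logged podStartE2EDuration
	fmt.Println(e2e - pull) // ~7.702183957s, the logged podStartSLOduration
}
```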
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79955696d6-vd4rg" Feb 02 10:53:33 crc kubenswrapper[4901]: I0202 10:53:33.341098 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-vd4rg"] Feb 02 10:53:33 crc kubenswrapper[4901]: W0202 10:53:33.347786 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42ecefa7_b29d_4178_82c0_5520874c1d1a.slice/crio-be2dfb17b13cddbbae32e769ef56b2db1e82ac77489bd6daacab3a4603066696 WatchSource:0}: Error finding container be2dfb17b13cddbbae32e769ef56b2db1e82ac77489bd6daacab3a4603066696: Status 404 returned error can't find the container with id be2dfb17b13cddbbae32e769ef56b2db1e82ac77489bd6daacab3a4603066696 Feb 02 10:53:33 crc kubenswrapper[4901]: I0202 10:53:33.604629 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79955696d6-vd4rg" event={"ID":"42ecefa7-b29d-4178-82c0-5520874c1d1a","Type":"ContainerStarted","Data":"be2dfb17b13cddbbae32e769ef56b2db1e82ac77489bd6daacab3a4603066696"} Feb 02 10:53:34 crc kubenswrapper[4901]: I0202 10:53:34.555765 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lrlcj"] Feb 02 10:53:34 crc kubenswrapper[4901]: I0202 10:53:34.559141 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lrlcj" Feb 02 10:53:34 crc kubenswrapper[4901]: I0202 10:53:34.562965 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lrlcj"] Feb 02 10:53:34 crc kubenswrapper[4901]: I0202 10:53:34.726964 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab47f1de-d715-4510-b398-2c7b18a2f23b-utilities\") pod \"community-operators-lrlcj\" (UID: \"ab47f1de-d715-4510-b398-2c7b18a2f23b\") " pod="openshift-marketplace/community-operators-lrlcj" Feb 02 10:53:34 crc kubenswrapper[4901]: I0202 10:53:34.727147 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsv64\" (UniqueName: \"kubernetes.io/projected/ab47f1de-d715-4510-b398-2c7b18a2f23b-kube-api-access-lsv64\") pod \"community-operators-lrlcj\" (UID: \"ab47f1de-d715-4510-b398-2c7b18a2f23b\") " pod="openshift-marketplace/community-operators-lrlcj" Feb 02 10:53:34 crc kubenswrapper[4901]: I0202 10:53:34.727193 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab47f1de-d715-4510-b398-2c7b18a2f23b-catalog-content\") pod \"community-operators-lrlcj\" (UID: \"ab47f1de-d715-4510-b398-2c7b18a2f23b\") " pod="openshift-marketplace/community-operators-lrlcj" Feb 02 10:53:34 crc kubenswrapper[4901]: I0202 10:53:34.828237 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsv64\" (UniqueName: \"kubernetes.io/projected/ab47f1de-d715-4510-b398-2c7b18a2f23b-kube-api-access-lsv64\") pod \"community-operators-lrlcj\" (UID: \"ab47f1de-d715-4510-b398-2c7b18a2f23b\") " pod="openshift-marketplace/community-operators-lrlcj" Feb 02 10:53:34 crc kubenswrapper[4901]: I0202 10:53:34.828291 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ab47f1de-d715-4510-b398-2c7b18a2f23b-catalog-content\") pod \"community-operators-lrlcj\" (UID: \"ab47f1de-d715-4510-b398-2c7b18a2f23b\") " pod="openshift-marketplace/community-operators-lrlcj" Feb 02 10:53:34 crc kubenswrapper[4901]: I0202 10:53:34.828326 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab47f1de-d715-4510-b398-2c7b18a2f23b-utilities\") pod \"community-operators-lrlcj\" (UID: \"ab47f1de-d715-4510-b398-2c7b18a2f23b\") " pod="openshift-marketplace/community-operators-lrlcj" Feb 02 10:53:34 crc kubenswrapper[4901]: I0202 10:53:34.829075 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab47f1de-d715-4510-b398-2c7b18a2f23b-utilities\") pod \"community-operators-lrlcj\" (UID: \"ab47f1de-d715-4510-b398-2c7b18a2f23b\") " pod="openshift-marketplace/community-operators-lrlcj" Feb 02 10:53:34 crc kubenswrapper[4901]: I0202 10:53:34.829100 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab47f1de-d715-4510-b398-2c7b18a2f23b-catalog-content\") pod \"community-operators-lrlcj\" (UID: \"ab47f1de-d715-4510-b398-2c7b18a2f23b\") " pod="openshift-marketplace/community-operators-lrlcj" Feb 02 10:53:34 crc kubenswrapper[4901]: I0202 10:53:34.848744 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsv64\" (UniqueName: \"kubernetes.io/projected/ab47f1de-d715-4510-b398-2c7b18a2f23b-kube-api-access-lsv64\") pod \"community-operators-lrlcj\" (UID: \"ab47f1de-d715-4510-b398-2c7b18a2f23b\") " pod="openshift-marketplace/community-operators-lrlcj" Feb 02 10:53:34 crc kubenswrapper[4901]: I0202 10:53:34.891344 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lrlcj" Feb 02 10:53:35 crc kubenswrapper[4901]: I0202 10:53:35.636142 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79955696d6-vd4rg" event={"ID":"42ecefa7-b29d-4178-82c0-5520874c1d1a","Type":"ContainerStarted","Data":"e7db8b8665357fce36ae48cd047120db04b29a594e21fc38d369d2d82c15e320"} Feb 02 10:53:35 crc kubenswrapper[4901]: I0202 10:53:35.636742 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79955696d6-vd4rg" Feb 02 10:53:35 crc kubenswrapper[4901]: I0202 10:53:35.660973 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79955696d6-vd4rg" podStartSLOduration=33.576241792 podStartE2EDuration="35.660949203s" podCreationTimestamp="2026-02-02 10:53:00 +0000 UTC" firstStartedPulling="2026-02-02 10:53:33.351680213 +0000 UTC m=+900.370020319" lastFinishedPulling="2026-02-02 10:53:35.436387634 +0000 UTC m=+902.454727730" observedRunningTime="2026-02-02 10:53:35.655147856 +0000 UTC m=+902.673487982" watchObservedRunningTime="2026-02-02 10:53:35.660949203 +0000 UTC m=+902.679289299" Feb 02 10:53:35 crc kubenswrapper[4901]: W0202 10:53:35.689760 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab47f1de_d715_4510_b398_2c7b18a2f23b.slice/crio-422c4b21167f1e7d431aceda93a608793d08e58f40137228730725ddc0970705 WatchSource:0}: Error finding container 422c4b21167f1e7d431aceda93a608793d08e58f40137228730725ddc0970705: Status 404 returned error can't find the container with id 422c4b21167f1e7d431aceda93a608793d08e58f40137228730725ddc0970705 Feb 02 10:53:35 crc kubenswrapper[4901]: I0202 10:53:35.695199 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lrlcj"] Feb 02 10:53:36 crc kubenswrapper[4901]: I0202 10:53:36.652111 4901 generic.go:334] "Generic (PLEG): container finished" podID="ab47f1de-d715-4510-b398-2c7b18a2f23b" containerID="ffb50bee54fe1f47ffca3e163c0e155fb7347bebc82c6ec66632e4f3cba88492" exitCode=0 Feb 02 10:53:36 crc kubenswrapper[4901]: I0202 10:53:36.654018 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrlcj" event={"ID":"ab47f1de-d715-4510-b398-2c7b18a2f23b","Type":"ContainerDied","Data":"ffb50bee54fe1f47ffca3e163c0e155fb7347bebc82c6ec66632e4f3cba88492"} Feb 02 10:53:36 crc kubenswrapper[4901]: I0202 10:53:36.654175 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrlcj" event={"ID":"ab47f1de-d715-4510-b398-2c7b18a2f23b","Type":"ContainerStarted","Data":"422c4b21167f1e7d431aceda93a608793d08e58f40137228730725ddc0970705"} Feb 02 10:53:36 crc kubenswrapper[4901]: I0202 10:53:36.992731 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dhlwss" Feb 02 10:53:37 crc kubenswrapper[4901]: I0202 10:53:37.660040 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrlcj" event={"ID":"ab47f1de-d715-4510-b398-2c7b18a2f23b","Type":"ContainerStarted","Data":"33a3b7de25d4c5f6ed56fb54806971d592934a497ef3b630df13eb13457f0537"} Feb 02 10:53:38 crc kubenswrapper[4901]: I0202 10:53:38.667732 4901 generic.go:334] "Generic 
(PLEG): container finished" podID="ab47f1de-d715-4510-b398-2c7b18a2f23b" containerID="33a3b7de25d4c5f6ed56fb54806971d592934a497ef3b630df13eb13457f0537" exitCode=0 Feb 02 10:53:38 crc kubenswrapper[4901]: I0202 10:53:38.667789 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrlcj" event={"ID":"ab47f1de-d715-4510-b398-2c7b18a2f23b","Type":"ContainerDied","Data":"33a3b7de25d4c5f6ed56fb54806971d592934a497ef3b630df13eb13457f0537"} Feb 02 10:53:39 crc kubenswrapper[4901]: I0202 10:53:39.691531 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrlcj" event={"ID":"ab47f1de-d715-4510-b398-2c7b18a2f23b","Type":"ContainerStarted","Data":"01b000502076360d72e074f48d61292cd0a08d32013efbcf24016763976b3349"} Feb 02 10:53:39 crc kubenswrapper[4901]: I0202 10:53:39.718609 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lrlcj" podStartSLOduration=3.281153501 podStartE2EDuration="5.718556143s" podCreationTimestamp="2026-02-02 10:53:34 +0000 UTC" firstStartedPulling="2026-02-02 10:53:36.656924141 +0000 UTC m=+903.675264237" lastFinishedPulling="2026-02-02 10:53:39.094326783 +0000 UTC m=+906.112666879" observedRunningTime="2026-02-02 10:53:39.713484735 +0000 UTC m=+906.731824851" watchObservedRunningTime="2026-02-02 10:53:39.718556143 +0000 UTC m=+906.736896259" Feb 02 10:53:41 crc kubenswrapper[4901]: I0202 10:53:41.369473 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-827zp" Feb 02 10:53:41 crc kubenswrapper[4901]: I0202 10:53:41.410639 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-vccwc" Feb 02 10:53:41 crc kubenswrapper[4901]: I0202 10:53:41.441425 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-dxjpt" Feb 02 10:53:41 crc kubenswrapper[4901]: I0202 10:53:41.530192 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-84dbcd4d6-strlk" Feb 02 10:53:41 crc kubenswrapper[4901]: I0202 10:53:41.580558 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-vtjmm" Feb 02 10:53:41 crc kubenswrapper[4901]: I0202 10:53:41.598801 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-564965969-njrpb" Feb 02 10:53:42 crc kubenswrapper[4901]: I0202 10:53:42.885303 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79955696d6-vd4rg" Feb 02 10:53:44 crc kubenswrapper[4901]: I0202 10:53:44.893635 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lrlcj" Feb 02 10:53:44 crc kubenswrapper[4901]: I0202 10:53:44.894260 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lrlcj" Feb 02 10:53:44 crc kubenswrapper[4901]: I0202 10:53:44.970822 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lrlcj" Feb 02 10:53:45 crc kubenswrapper[4901]: 
Feb 02 10:53:45 crc kubenswrapper[4901]: I0202 10:53:45.795454 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lrlcj"
Feb 02 10:53:45 crc kubenswrapper[4901]: I0202 10:53:45.848744 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lrlcj"]
Feb 02 10:53:47 crc kubenswrapper[4901]: I0202 10:53:47.773491 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lrlcj" podUID="ab47f1de-d715-4510-b398-2c7b18a2f23b" containerName="registry-server" containerID="cri-o://01b000502076360d72e074f48d61292cd0a08d32013efbcf24016763976b3349" gracePeriod=2
Feb 02 10:53:48 crc kubenswrapper[4901]: I0202 10:53:48.714404 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lrlcj"
Feb 02 10:53:48 crc kubenswrapper[4901]: I0202 10:53:48.785221 4901 generic.go:334] "Generic (PLEG): container finished" podID="ab47f1de-d715-4510-b398-2c7b18a2f23b" containerID="01b000502076360d72e074f48d61292cd0a08d32013efbcf24016763976b3349" exitCode=0
Feb 02 10:53:48 crc kubenswrapper[4901]: I0202 10:53:48.785263 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrlcj" event={"ID":"ab47f1de-d715-4510-b398-2c7b18a2f23b","Type":"ContainerDied","Data":"01b000502076360d72e074f48d61292cd0a08d32013efbcf24016763976b3349"}
Feb 02 10:53:48 crc kubenswrapper[4901]: I0202 10:53:48.785290 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrlcj" event={"ID":"ab47f1de-d715-4510-b398-2c7b18a2f23b","Type":"ContainerDied","Data":"422c4b21167f1e7d431aceda93a608793d08e58f40137228730725ddc0970705"}
Feb 02 10:53:48 crc kubenswrapper[4901]: I0202 10:53:48.785309 4901 scope.go:117] "RemoveContainer" containerID="01b000502076360d72e074f48d61292cd0a08d32013efbcf24016763976b3349"
Feb 02 10:53:48 crc kubenswrapper[4901]: I0202 10:53:48.785418 4901 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-lrlcj" Feb 02 10:53:48 crc kubenswrapper[4901]: I0202 10:53:48.800995 4901 scope.go:117] "RemoveContainer" containerID="33a3b7de25d4c5f6ed56fb54806971d592934a497ef3b630df13eb13457f0537" Feb 02 10:53:48 crc kubenswrapper[4901]: I0202 10:53:48.814930 4901 scope.go:117] "RemoveContainer" containerID="ffb50bee54fe1f47ffca3e163c0e155fb7347bebc82c6ec66632e4f3cba88492" Feb 02 10:53:48 crc kubenswrapper[4901]: I0202 10:53:48.854549 4901 scope.go:117] "RemoveContainer" containerID="01b000502076360d72e074f48d61292cd0a08d32013efbcf24016763976b3349" Feb 02 10:53:48 crc kubenswrapper[4901]: E0202 10:53:48.855138 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01b000502076360d72e074f48d61292cd0a08d32013efbcf24016763976b3349\": container with ID starting with 01b000502076360d72e074f48d61292cd0a08d32013efbcf24016763976b3349 not found: ID does not exist" containerID="01b000502076360d72e074f48d61292cd0a08d32013efbcf24016763976b3349" Feb 02 10:53:48 crc kubenswrapper[4901]: I0202 10:53:48.855183 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01b000502076360d72e074f48d61292cd0a08d32013efbcf24016763976b3349"} err="failed to get container status \"01b000502076360d72e074f48d61292cd0a08d32013efbcf24016763976b3349\": rpc error: code = NotFound desc = could not find container \"01b000502076360d72e074f48d61292cd0a08d32013efbcf24016763976b3349\": container with ID starting with 01b000502076360d72e074f48d61292cd0a08d32013efbcf24016763976b3349 not found: ID does not exist" Feb 02 10:53:48 crc kubenswrapper[4901]: I0202 10:53:48.855209 4901 scope.go:117] "RemoveContainer" containerID="33a3b7de25d4c5f6ed56fb54806971d592934a497ef3b630df13eb13457f0537" Feb 02 10:53:48 crc kubenswrapper[4901]: E0202 10:53:48.855620 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33a3b7de25d4c5f6ed56fb54806971d592934a497ef3b630df13eb13457f0537\": container with ID starting with 33a3b7de25d4c5f6ed56fb54806971d592934a497ef3b630df13eb13457f0537 not found: ID does not exist" containerID="33a3b7de25d4c5f6ed56fb54806971d592934a497ef3b630df13eb13457f0537" Feb 02 10:53:48 crc kubenswrapper[4901]: I0202 10:53:48.855653 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33a3b7de25d4c5f6ed56fb54806971d592934a497ef3b630df13eb13457f0537"} err="failed to get container status \"33a3b7de25d4c5f6ed56fb54806971d592934a497ef3b630df13eb13457f0537\": rpc error: code = NotFound desc = could not find container \"33a3b7de25d4c5f6ed56fb54806971d592934a497ef3b630df13eb13457f0537\": container with ID starting with 33a3b7de25d4c5f6ed56fb54806971d592934a497ef3b630df13eb13457f0537 not found: ID does not exist" Feb 02 10:53:48 crc kubenswrapper[4901]: I0202 10:53:48.855670 4901 scope.go:117] "RemoveContainer" containerID="ffb50bee54fe1f47ffca3e163c0e155fb7347bebc82c6ec66632e4f3cba88492" Feb 02 10:53:48 crc kubenswrapper[4901]: E0202 10:53:48.855909 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffb50bee54fe1f47ffca3e163c0e155fb7347bebc82c6ec66632e4f3cba88492\": container with ID starting with ffb50bee54fe1f47ffca3e163c0e155fb7347bebc82c6ec66632e4f3cba88492 not found: ID does not exist" containerID="ffb50bee54fe1f47ffca3e163c0e155fb7347bebc82c6ec66632e4f3cba88492" 
Feb 02 10:53:48 crc kubenswrapper[4901]: I0202 10:53:48.855945 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffb50bee54fe1f47ffca3e163c0e155fb7347bebc82c6ec66632e4f3cba88492"} err="failed to get container status \"ffb50bee54fe1f47ffca3e163c0e155fb7347bebc82c6ec66632e4f3cba88492\": rpc error: code = NotFound desc = could not find container \"ffb50bee54fe1f47ffca3e163c0e155fb7347bebc82c6ec66632e4f3cba88492\": container with ID starting with ffb50bee54fe1f47ffca3e163c0e155fb7347bebc82c6ec66632e4f3cba88492 not found: ID does not exist" Feb 02 10:53:48 crc kubenswrapper[4901]: I0202 10:53:48.865112 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab47f1de-d715-4510-b398-2c7b18a2f23b-utilities\") pod \"ab47f1de-d715-4510-b398-2c7b18a2f23b\" (UID: \"ab47f1de-d715-4510-b398-2c7b18a2f23b\") " Feb 02 10:53:48 crc kubenswrapper[4901]: I0202 10:53:48.865221 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsv64\" (UniqueName: \"kubernetes.io/projected/ab47f1de-d715-4510-b398-2c7b18a2f23b-kube-api-access-lsv64\") pod \"ab47f1de-d715-4510-b398-2c7b18a2f23b\" (UID: \"ab47f1de-d715-4510-b398-2c7b18a2f23b\") " Feb 02 10:53:48 crc kubenswrapper[4901]: I0202 10:53:48.865311 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab47f1de-d715-4510-b398-2c7b18a2f23b-catalog-content\") pod \"ab47f1de-d715-4510-b398-2c7b18a2f23b\" (UID: \"ab47f1de-d715-4510-b398-2c7b18a2f23b\") " Feb 02 10:53:48 crc kubenswrapper[4901]: I0202 10:53:48.866420 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab47f1de-d715-4510-b398-2c7b18a2f23b-utilities" (OuterVolumeSpecName: "utilities") pod "ab47f1de-d715-4510-b398-2c7b18a2f23b" (UID: "ab47f1de-d715-4510-b398-2c7b18a2f23b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:53:48 crc kubenswrapper[4901]: I0202 10:53:48.874785 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab47f1de-d715-4510-b398-2c7b18a2f23b-kube-api-access-lsv64" (OuterVolumeSpecName: "kube-api-access-lsv64") pod "ab47f1de-d715-4510-b398-2c7b18a2f23b" (UID: "ab47f1de-d715-4510-b398-2c7b18a2f23b"). InnerVolumeSpecName "kube-api-access-lsv64". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:53:48 crc kubenswrapper[4901]: I0202 10:53:48.949236 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab47f1de-d715-4510-b398-2c7b18a2f23b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ab47f1de-d715-4510-b398-2c7b18a2f23b" (UID: "ab47f1de-d715-4510-b398-2c7b18a2f23b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:53:48 crc kubenswrapper[4901]: I0202 10:53:48.967229 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab47f1de-d715-4510-b398-2c7b18a2f23b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:48 crc kubenswrapper[4901]: I0202 10:53:48.967278 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab47f1de-d715-4510-b398-2c7b18a2f23b-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:48 crc kubenswrapper[4901]: I0202 10:53:48.967293 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsv64\" (UniqueName: \"kubernetes.io/projected/ab47f1de-d715-4510-b398-2c7b18a2f23b-kube-api-access-lsv64\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:49 crc kubenswrapper[4901]: I0202 10:53:49.115645 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lrlcj"] Feb 02 10:53:49 crc kubenswrapper[4901]: I0202 10:53:49.125864 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lrlcj"] Feb 02 10:53:49 crc kubenswrapper[4901]: I0202 10:53:49.686252 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab47f1de-d715-4510-b398-2c7b18a2f23b" path="/var/lib/kubelet/pods/ab47f1de-d715-4510-b398-2c7b18a2f23b/volumes" Feb 02 10:53:52 crc kubenswrapper[4901]: I0202 10:53:52.173818 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zbmd6"] Feb 02 10:53:52 crc kubenswrapper[4901]: E0202 10:53:52.174474 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab47f1de-d715-4510-b398-2c7b18a2f23b" containerName="extract-utilities" Feb 02 10:53:52 crc kubenswrapper[4901]: I0202 10:53:52.174486 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab47f1de-d715-4510-b398-2c7b18a2f23b" containerName="extract-utilities" Feb 02 10:53:52 crc kubenswrapper[4901]: E0202 10:53:52.174507 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab47f1de-d715-4510-b398-2c7b18a2f23b" containerName="registry-server" Feb 02 10:53:52 crc kubenswrapper[4901]: I0202 10:53:52.174513 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab47f1de-d715-4510-b398-2c7b18a2f23b" containerName="registry-server" Feb 02 10:53:52 crc kubenswrapper[4901]: E0202 10:53:52.174527 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab47f1de-d715-4510-b398-2c7b18a2f23b" containerName="extract-content" Feb 02 10:53:52 crc kubenswrapper[4901]: I0202 10:53:52.174533 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab47f1de-d715-4510-b398-2c7b18a2f23b" containerName="extract-content" Feb 02 10:53:52 crc kubenswrapper[4901]: I0202 10:53:52.174687 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab47f1de-d715-4510-b398-2c7b18a2f23b" containerName="registry-server" Feb 02 10:53:52 crc kubenswrapper[4901]: I0202 10:53:52.176361 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zbmd6" Feb 02 10:53:52 crc kubenswrapper[4901]: I0202 10:53:52.212947 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zbmd6"] Feb 02 10:53:52 crc kubenswrapper[4901]: I0202 10:53:52.321916 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22dp6\" (UniqueName: \"kubernetes.io/projected/ee96d54f-a561-4687-964a-623c5ff8ab88-kube-api-access-22dp6\") pod \"certified-operators-zbmd6\" (UID: \"ee96d54f-a561-4687-964a-623c5ff8ab88\") " pod="openshift-marketplace/certified-operators-zbmd6" Feb 02 10:53:52 crc kubenswrapper[4901]: I0202 10:53:52.322006 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee96d54f-a561-4687-964a-623c5ff8ab88-catalog-content\") pod \"certified-operators-zbmd6\" (UID: \"ee96d54f-a561-4687-964a-623c5ff8ab88\") " pod="openshift-marketplace/certified-operators-zbmd6" Feb 02 10:53:52 crc kubenswrapper[4901]: I0202 10:53:52.322036 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee96d54f-a561-4687-964a-623c5ff8ab88-utilities\") pod \"certified-operators-zbmd6\" (UID: \"ee96d54f-a561-4687-964a-623c5ff8ab88\") " pod="openshift-marketplace/certified-operators-zbmd6" Feb 02 10:53:52 crc kubenswrapper[4901]: I0202 10:53:52.423522 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee96d54f-a561-4687-964a-623c5ff8ab88-catalog-content\") pod \"certified-operators-zbmd6\" (UID: \"ee96d54f-a561-4687-964a-623c5ff8ab88\") " pod="openshift-marketplace/certified-operators-zbmd6" Feb 02 10:53:52 crc kubenswrapper[4901]: I0202 10:53:52.423590 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee96d54f-a561-4687-964a-623c5ff8ab88-utilities\") pod \"certified-operators-zbmd6\" (UID: \"ee96d54f-a561-4687-964a-623c5ff8ab88\") " pod="openshift-marketplace/certified-operators-zbmd6" Feb 02 10:53:52 crc kubenswrapper[4901]: I0202 10:53:52.423671 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22dp6\" (UniqueName: \"kubernetes.io/projected/ee96d54f-a561-4687-964a-623c5ff8ab88-kube-api-access-22dp6\") pod \"certified-operators-zbmd6\" (UID: \"ee96d54f-a561-4687-964a-623c5ff8ab88\") " pod="openshift-marketplace/certified-operators-zbmd6" Feb 02 10:53:52 crc kubenswrapper[4901]: I0202 10:53:52.424469 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee96d54f-a561-4687-964a-623c5ff8ab88-catalog-content\") pod \"certified-operators-zbmd6\" (UID: \"ee96d54f-a561-4687-964a-623c5ff8ab88\") " pod="openshift-marketplace/certified-operators-zbmd6" Feb 02 10:53:52 crc kubenswrapper[4901]: I0202 10:53:52.424697 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee96d54f-a561-4687-964a-623c5ff8ab88-utilities\") pod \"certified-operators-zbmd6\" (UID: \"ee96d54f-a561-4687-964a-623c5ff8ab88\") " pod="openshift-marketplace/certified-operators-zbmd6" Feb 02 10:53:52 crc kubenswrapper[4901]: I0202 10:53:52.443767 4901 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-22dp6\" (UniqueName: \"kubernetes.io/projected/ee96d54f-a561-4687-964a-623c5ff8ab88-kube-api-access-22dp6\") pod \"certified-operators-zbmd6\" (UID: \"ee96d54f-a561-4687-964a-623c5ff8ab88\") " pod="openshift-marketplace/certified-operators-zbmd6" Feb 02 10:53:52 crc kubenswrapper[4901]: I0202 10:53:52.513247 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zbmd6" Feb 02 10:53:52 crc kubenswrapper[4901]: I0202 10:53:52.823262 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zbmd6"] Feb 02 10:53:53 crc kubenswrapper[4901]: I0202 10:53:53.825288 4901 generic.go:334] "Generic (PLEG): container finished" podID="ee96d54f-a561-4687-964a-623c5ff8ab88" containerID="ad77005c4f0e1214519aa6d7660b1aa2431d3e3f88900c984ec31e4c06d93771" exitCode=0 Feb 02 10:53:53 crc kubenswrapper[4901]: I0202 10:53:53.825384 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zbmd6" event={"ID":"ee96d54f-a561-4687-964a-623c5ff8ab88","Type":"ContainerDied","Data":"ad77005c4f0e1214519aa6d7660b1aa2431d3e3f88900c984ec31e4c06d93771"} Feb 02 10:53:53 crc kubenswrapper[4901]: I0202 10:53:53.825683 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zbmd6" event={"ID":"ee96d54f-a561-4687-964a-623c5ff8ab88","Type":"ContainerStarted","Data":"e62d4e8011d7d64db5155969f751d3dbcd8133e94d992ff872eca3624fe2e3a4"} Feb 02 10:53:53 crc kubenswrapper[4901]: I0202 10:53:53.827700 4901 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 10:53:54 crc kubenswrapper[4901]: I0202 10:53:54.838993 4901 generic.go:334] "Generic (PLEG): container finished" podID="ee96d54f-a561-4687-964a-623c5ff8ab88" containerID="1a1fdb9a25c29cccb1fb6ca28ee065045483928c3870c5d7485ee904e0539330" exitCode=0 Feb 02 10:53:54 crc kubenswrapper[4901]: I0202 10:53:54.839074 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zbmd6" event={"ID":"ee96d54f-a561-4687-964a-623c5ff8ab88","Type":"ContainerDied","Data":"1a1fdb9a25c29cccb1fb6ca28ee065045483928c3870c5d7485ee904e0539330"} Feb 02 10:53:55 crc kubenswrapper[4901]: I0202 10:53:55.918991 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zbmd6" event={"ID":"ee96d54f-a561-4687-964a-623c5ff8ab88","Type":"ContainerStarted","Data":"66b7f71c389d7542e54089f39c6e8f60a03fb1324c6c405ef120ac3572d31787"} Feb 02 10:53:55 crc kubenswrapper[4901]: I0202 10:53:55.957097 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zbmd6" podStartSLOduration=2.561697648 podStartE2EDuration="3.957078511s" podCreationTimestamp="2026-02-02 10:53:52 +0000 UTC" firstStartedPulling="2026-02-02 10:53:53.827461598 +0000 UTC m=+920.845801694" lastFinishedPulling="2026-02-02 10:53:55.222842451 +0000 UTC m=+922.241182557" observedRunningTime="2026-02-02 10:53:55.955823199 +0000 UTC m=+922.974163285" watchObservedRunningTime="2026-02-02 10:53:55.957078511 +0000 UTC m=+922.975418607" Feb 02 10:54:00 crc kubenswrapper[4901]: I0202 10:54:00.150639 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-qw8r8"] Feb 02 10:54:00 crc kubenswrapper[4901]: I0202 10:54:00.152789 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-qw8r8" Feb 02 10:54:00 crc kubenswrapper[4901]: I0202 10:54:00.155597 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 02 10:54:00 crc kubenswrapper[4901]: I0202 10:54:00.159931 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 02 10:54:00 crc kubenswrapper[4901]: I0202 10:54:00.160191 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 02 10:54:00 crc kubenswrapper[4901]: I0202 10:54:00.160310 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-qf8x5" Feb 02 10:54:00 crc kubenswrapper[4901]: I0202 10:54:00.172608 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-qw8r8"] Feb 02 10:54:00 crc kubenswrapper[4901]: I0202 10:54:00.206082 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-c7dhq"] Feb 02 10:54:00 crc kubenswrapper[4901]: I0202 10:54:00.220325 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-c7dhq" Feb 02 10:54:00 crc kubenswrapper[4901]: I0202 10:54:00.224800 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 02 10:54:00 crc kubenswrapper[4901]: I0202 10:54:00.234191 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-c7dhq"] Feb 02 10:54:00 crc kubenswrapper[4901]: I0202 10:54:00.238995 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lhs6\" (UniqueName: \"kubernetes.io/projected/feade5e7-72de-4496-a324-967c6727f8d7-kube-api-access-7lhs6\") pod \"dnsmasq-dns-675f4bcbfc-qw8r8\" (UID: \"feade5e7-72de-4496-a324-967c6727f8d7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-qw8r8" Feb 02 10:54:00 crc kubenswrapper[4901]: I0202 10:54:00.239096 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/feade5e7-72de-4496-a324-967c6727f8d7-config\") pod \"dnsmasq-dns-675f4bcbfc-qw8r8\" (UID: \"feade5e7-72de-4496-a324-967c6727f8d7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-qw8r8" Feb 02 10:54:00 crc kubenswrapper[4901]: I0202 10:54:00.340492 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvbkn\" (UniqueName: \"kubernetes.io/projected/26df7306-345d-425d-8f8f-bd026582c25f-kube-api-access-cvbkn\") pod \"dnsmasq-dns-78dd6ddcc-c7dhq\" (UID: \"26df7306-345d-425d-8f8f-bd026582c25f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-c7dhq" Feb 02 10:54:00 crc kubenswrapper[4901]: I0202 10:54:00.340626 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lhs6\" (UniqueName: \"kubernetes.io/projected/feade5e7-72de-4496-a324-967c6727f8d7-kube-api-access-7lhs6\") pod \"dnsmasq-dns-675f4bcbfc-qw8r8\" (UID: \"feade5e7-72de-4496-a324-967c6727f8d7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-qw8r8" Feb 02 10:54:00 crc kubenswrapper[4901]: I0202 10:54:00.340653 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26df7306-345d-425d-8f8f-bd026582c25f-config\") pod \"dnsmasq-dns-78dd6ddcc-c7dhq\" (UID: \"26df7306-345d-425d-8f8f-bd026582c25f\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-c7dhq" Feb 02 10:54:00 crc kubenswrapper[4901]: I0202 10:54:00.340694 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/feade5e7-72de-4496-a324-967c6727f8d7-config\") pod \"dnsmasq-dns-675f4bcbfc-qw8r8\" (UID: \"feade5e7-72de-4496-a324-967c6727f8d7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-qw8r8" Feb 02 10:54:00 crc kubenswrapper[4901]: I0202 10:54:00.340724 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26df7306-345d-425d-8f8f-bd026582c25f-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-c7dhq\" (UID: \"26df7306-345d-425d-8f8f-bd026582c25f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-c7dhq" Feb 02 10:54:00 crc kubenswrapper[4901]: I0202 10:54:00.341837 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/feade5e7-72de-4496-a324-967c6727f8d7-config\") pod \"dnsmasq-dns-675f4bcbfc-qw8r8\" (UID: \"feade5e7-72de-4496-a324-967c6727f8d7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-qw8r8" Feb 02 10:54:00 crc kubenswrapper[4901]: I0202 10:54:00.358368 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lhs6\" (UniqueName: \"kubernetes.io/projected/feade5e7-72de-4496-a324-967c6727f8d7-kube-api-access-7lhs6\") pod \"dnsmasq-dns-675f4bcbfc-qw8r8\" (UID: \"feade5e7-72de-4496-a324-967c6727f8d7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-qw8r8" Feb 02 10:54:00 crc kubenswrapper[4901]: I0202 10:54:00.441607 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26df7306-345d-425d-8f8f-bd026582c25f-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-c7dhq\" (UID: \"26df7306-345d-425d-8f8f-bd026582c25f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-c7dhq" Feb 02 10:54:00 crc kubenswrapper[4901]: I0202 10:54:00.441672 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvbkn\" (UniqueName: \"kubernetes.io/projected/26df7306-345d-425d-8f8f-bd026582c25f-kube-api-access-cvbkn\") pod \"dnsmasq-dns-78dd6ddcc-c7dhq\" (UID: \"26df7306-345d-425d-8f8f-bd026582c25f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-c7dhq" Feb 02 10:54:00 crc kubenswrapper[4901]: I0202 10:54:00.441735 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26df7306-345d-425d-8f8f-bd026582c25f-config\") pod \"dnsmasq-dns-78dd6ddcc-c7dhq\" (UID: \"26df7306-345d-425d-8f8f-bd026582c25f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-c7dhq" Feb 02 10:54:00 crc kubenswrapper[4901]: I0202 10:54:00.442471 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26df7306-345d-425d-8f8f-bd026582c25f-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-c7dhq\" (UID: \"26df7306-345d-425d-8f8f-bd026582c25f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-c7dhq" Feb 02 10:54:00 crc kubenswrapper[4901]: I0202 10:54:00.442552 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26df7306-345d-425d-8f8f-bd026582c25f-config\") pod \"dnsmasq-dns-78dd6ddcc-c7dhq\" (UID: \"26df7306-345d-425d-8f8f-bd026582c25f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-c7dhq" Feb 02 10:54:00 crc kubenswrapper[4901]: I0202 10:54:00.458642 4901 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-cvbkn\" (UniqueName: \"kubernetes.io/projected/26df7306-345d-425d-8f8f-bd026582c25f-kube-api-access-cvbkn\") pod \"dnsmasq-dns-78dd6ddcc-c7dhq\" (UID: \"26df7306-345d-425d-8f8f-bd026582c25f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-c7dhq" Feb 02 10:54:00 crc kubenswrapper[4901]: I0202 10:54:00.495450 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-qw8r8" Feb 02 10:54:00 crc kubenswrapper[4901]: I0202 10:54:00.577762 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-c7dhq" Feb 02 10:54:01 crc kubenswrapper[4901]: I0202 10:54:01.029152 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-qw8r8"] Feb 02 10:54:01 crc kubenswrapper[4901]: I0202 10:54:01.094536 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-c7dhq"] Feb 02 10:54:01 crc kubenswrapper[4901]: W0202 10:54:01.105275 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26df7306_345d_425d_8f8f_bd026582c25f.slice/crio-a058889e48c466d3fb241efee3f839b018d2c76c5e603c52790e57deceb03def WatchSource:0}: Error finding container a058889e48c466d3fb241efee3f839b018d2c76c5e603c52790e57deceb03def: Status 404 returned error can't find the container with id a058889e48c466d3fb241efee3f839b018d2c76c5e603c52790e57deceb03def Feb 02 10:54:01 crc kubenswrapper[4901]: I0202 10:54:01.971156 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-qw8r8" event={"ID":"feade5e7-72de-4496-a324-967c6727f8d7","Type":"ContainerStarted","Data":"6003bd986245257103abe24be4f7293f43240cb3772fe66336bd336ef6c6d809"} Feb 02 10:54:01 crc kubenswrapper[4901]: I0202 10:54:01.972703 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-c7dhq" event={"ID":"26df7306-345d-425d-8f8f-bd026582c25f","Type":"ContainerStarted","Data":"a058889e48c466d3fb241efee3f839b018d2c76c5e603c52790e57deceb03def"} Feb 02 10:54:02 crc kubenswrapper[4901]: I0202 10:54:02.514293 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zbmd6" Feb 02 10:54:02 crc kubenswrapper[4901]: I0202 10:54:02.514593 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zbmd6" Feb 02 10:54:02 crc kubenswrapper[4901]: I0202 10:54:02.579235 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zbmd6" Feb 02 10:54:03 crc kubenswrapper[4901]: I0202 10:54:03.086164 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-qw8r8"] Feb 02 10:54:03 crc kubenswrapper[4901]: I0202 10:54:03.120944 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-6v2gd"] Feb 02 10:54:03 crc kubenswrapper[4901]: I0202 10:54:03.122958 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-6v2gd" Feb 02 10:54:03 crc kubenswrapper[4901]: I0202 10:54:03.136855 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zbmd6" Feb 02 10:54:03 crc kubenswrapper[4901]: I0202 10:54:03.144557 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-6v2gd"] Feb 02 10:54:03 crc kubenswrapper[4901]: I0202 10:54:03.183949 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d5bfda5-826b-4d0d-81a6-87d5589f0c78-dns-svc\") pod \"dnsmasq-dns-666b6646f7-6v2gd\" (UID: \"9d5bfda5-826b-4d0d-81a6-87d5589f0c78\") " pod="openstack/dnsmasq-dns-666b6646f7-6v2gd" Feb 02 10:54:03 crc kubenswrapper[4901]: I0202 10:54:03.183994 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wpkm\" (UniqueName: \"kubernetes.io/projected/9d5bfda5-826b-4d0d-81a6-87d5589f0c78-kube-api-access-2wpkm\") pod \"dnsmasq-dns-666b6646f7-6v2gd\" (UID: \"9d5bfda5-826b-4d0d-81a6-87d5589f0c78\") " pod="openstack/dnsmasq-dns-666b6646f7-6v2gd" Feb 02 10:54:03 crc kubenswrapper[4901]: I0202 10:54:03.184065 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d5bfda5-826b-4d0d-81a6-87d5589f0c78-config\") pod \"dnsmasq-dns-666b6646f7-6v2gd\" (UID: \"9d5bfda5-826b-4d0d-81a6-87d5589f0c78\") " pod="openstack/dnsmasq-dns-666b6646f7-6v2gd" Feb 02 10:54:03 crc kubenswrapper[4901]: I0202 10:54:03.264135 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zbmd6"] Feb 02 10:54:03 crc kubenswrapper[4901]: I0202 10:54:03.284860 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d5bfda5-826b-4d0d-81a6-87d5589f0c78-dns-svc\") pod \"dnsmasq-dns-666b6646f7-6v2gd\" (UID: \"9d5bfda5-826b-4d0d-81a6-87d5589f0c78\") " pod="openstack/dnsmasq-dns-666b6646f7-6v2gd" Feb 02 10:54:03 crc kubenswrapper[4901]: I0202 10:54:03.284909 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wpkm\" (UniqueName: \"kubernetes.io/projected/9d5bfda5-826b-4d0d-81a6-87d5589f0c78-kube-api-access-2wpkm\") pod \"dnsmasq-dns-666b6646f7-6v2gd\" (UID: \"9d5bfda5-826b-4d0d-81a6-87d5589f0c78\") " pod="openstack/dnsmasq-dns-666b6646f7-6v2gd" Feb 02 10:54:03 crc kubenswrapper[4901]: I0202 10:54:03.284965 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d5bfda5-826b-4d0d-81a6-87d5589f0c78-config\") pod \"dnsmasq-dns-666b6646f7-6v2gd\" (UID: \"9d5bfda5-826b-4d0d-81a6-87d5589f0c78\") " pod="openstack/dnsmasq-dns-666b6646f7-6v2gd" Feb 02 10:54:03 crc kubenswrapper[4901]: I0202 10:54:03.285946 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d5bfda5-826b-4d0d-81a6-87d5589f0c78-config\") pod \"dnsmasq-dns-666b6646f7-6v2gd\" (UID: \"9d5bfda5-826b-4d0d-81a6-87d5589f0c78\") " pod="openstack/dnsmasq-dns-666b6646f7-6v2gd" Feb 02 10:54:03 crc kubenswrapper[4901]: I0202 10:54:03.286211 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/9d5bfda5-826b-4d0d-81a6-87d5589f0c78-dns-svc\") pod \"dnsmasq-dns-666b6646f7-6v2gd\" (UID: \"9d5bfda5-826b-4d0d-81a6-87d5589f0c78\") " pod="openstack/dnsmasq-dns-666b6646f7-6v2gd" Feb 02 10:54:03 crc kubenswrapper[4901]: I0202 10:54:03.304470 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wpkm\" (UniqueName: \"kubernetes.io/projected/9d5bfda5-826b-4d0d-81a6-87d5589f0c78-kube-api-access-2wpkm\") pod \"dnsmasq-dns-666b6646f7-6v2gd\" (UID: \"9d5bfda5-826b-4d0d-81a6-87d5589f0c78\") " pod="openstack/dnsmasq-dns-666b6646f7-6v2gd" Feb 02 10:54:03 crc kubenswrapper[4901]: I0202 10:54:03.460530 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-6v2gd" Feb 02 10:54:03 crc kubenswrapper[4901]: I0202 10:54:03.481025 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-c7dhq"] Feb 02 10:54:03 crc kubenswrapper[4901]: I0202 10:54:03.514182 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-f9hjt"] Feb 02 10:54:03 crc kubenswrapper[4901]: I0202 10:54:03.515668 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-f9hjt" Feb 02 10:54:03 crc kubenswrapper[4901]: I0202 10:54:03.545891 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-f9hjt"] Feb 02 10:54:03 crc kubenswrapper[4901]: I0202 10:54:03.595799 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd4qc\" (UniqueName: \"kubernetes.io/projected/984f7ae8-8c3c-41ec-85a4-dd0b31764800-kube-api-access-bd4qc\") pod \"dnsmasq-dns-57d769cc4f-f9hjt\" (UID: \"984f7ae8-8c3c-41ec-85a4-dd0b31764800\") " pod="openstack/dnsmasq-dns-57d769cc4f-f9hjt" Feb 02 10:54:03 crc kubenswrapper[4901]: I0202 10:54:03.595862 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/984f7ae8-8c3c-41ec-85a4-dd0b31764800-config\") pod \"dnsmasq-dns-57d769cc4f-f9hjt\" (UID: \"984f7ae8-8c3c-41ec-85a4-dd0b31764800\") " pod="openstack/dnsmasq-dns-57d769cc4f-f9hjt" Feb 02 10:54:03 crc kubenswrapper[4901]: I0202 10:54:03.595901 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/984f7ae8-8c3c-41ec-85a4-dd0b31764800-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-f9hjt\" (UID: \"984f7ae8-8c3c-41ec-85a4-dd0b31764800\") " pod="openstack/dnsmasq-dns-57d769cc4f-f9hjt" Feb 02 10:54:03 crc kubenswrapper[4901]: I0202 10:54:03.706298 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd4qc\" (UniqueName: \"kubernetes.io/projected/984f7ae8-8c3c-41ec-85a4-dd0b31764800-kube-api-access-bd4qc\") pod \"dnsmasq-dns-57d769cc4f-f9hjt\" (UID: \"984f7ae8-8c3c-41ec-85a4-dd0b31764800\") " pod="openstack/dnsmasq-dns-57d769cc4f-f9hjt" Feb 02 10:54:03 crc kubenswrapper[4901]: I0202 10:54:03.706428 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/984f7ae8-8c3c-41ec-85a4-dd0b31764800-config\") pod \"dnsmasq-dns-57d769cc4f-f9hjt\" (UID: \"984f7ae8-8c3c-41ec-85a4-dd0b31764800\") " pod="openstack/dnsmasq-dns-57d769cc4f-f9hjt" Feb 02 10:54:03 crc kubenswrapper[4901]: I0202 10:54:03.706528 4901 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/984f7ae8-8c3c-41ec-85a4-dd0b31764800-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-f9hjt\" (UID: \"984f7ae8-8c3c-41ec-85a4-dd0b31764800\") " pod="openstack/dnsmasq-dns-57d769cc4f-f9hjt" Feb 02 10:54:03 crc kubenswrapper[4901]: I0202 10:54:03.707945 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/984f7ae8-8c3c-41ec-85a4-dd0b31764800-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-f9hjt\" (UID: \"984f7ae8-8c3c-41ec-85a4-dd0b31764800\") " pod="openstack/dnsmasq-dns-57d769cc4f-f9hjt" Feb 02 10:54:03 crc kubenswrapper[4901]: I0202 10:54:03.708942 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/984f7ae8-8c3c-41ec-85a4-dd0b31764800-config\") pod \"dnsmasq-dns-57d769cc4f-f9hjt\" (UID: \"984f7ae8-8c3c-41ec-85a4-dd0b31764800\") " pod="openstack/dnsmasq-dns-57d769cc4f-f9hjt" Feb 02 10:54:03 crc kubenswrapper[4901]: I0202 10:54:03.748356 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd4qc\" (UniqueName: \"kubernetes.io/projected/984f7ae8-8c3c-41ec-85a4-dd0b31764800-kube-api-access-bd4qc\") pod \"dnsmasq-dns-57d769cc4f-f9hjt\" (UID: \"984f7ae8-8c3c-41ec-85a4-dd0b31764800\") " pod="openstack/dnsmasq-dns-57d769cc4f-f9hjt" Feb 02 10:54:03 crc kubenswrapper[4901]: I0202 10:54:03.905621 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-f9hjt" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.106265 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-6v2gd"] Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.250777 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.253188 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.257915 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.258066 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.258120 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-brdmq" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.258155 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.258303 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.258447 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.258500 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.267807 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.321716 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhr9n\" (UniqueName: \"kubernetes.io/projected/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-kube-api-access-zhr9n\") pod \"rabbitmq-server-0\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " pod="openstack/rabbitmq-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.321766 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " pod="openstack/rabbitmq-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.321792 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " pod="openstack/rabbitmq-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.321811 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-pod-info\") pod \"rabbitmq-server-0\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " pod="openstack/rabbitmq-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.321851 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " pod="openstack/rabbitmq-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.321871 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " pod="openstack/rabbitmq-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.321886 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " pod="openstack/rabbitmq-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.321943 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " pod="openstack/rabbitmq-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.321977 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-server-conf\") pod \"rabbitmq-server-0\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " pod="openstack/rabbitmq-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.322000 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-config-data\") pod \"rabbitmq-server-0\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " pod="openstack/rabbitmq-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.322020 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " pod="openstack/rabbitmq-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.399370 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-f9hjt"] Feb 02 10:54:04 crc kubenswrapper[4901]: W0202 10:54:04.401433 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod984f7ae8_8c3c_41ec_85a4_dd0b31764800.slice/crio-3eb28519d6186cbcf52a3bf4c06c2f236c704456af8edd924a88007a3c138d0f WatchSource:0}: Error finding container 3eb28519d6186cbcf52a3bf4c06c2f236c704456af8edd924a88007a3c138d0f: Status 404 returned error can't find the container with id 3eb28519d6186cbcf52a3bf4c06c2f236c704456af8edd924a88007a3c138d0f Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.423217 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " pod="openstack/rabbitmq-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.423285 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-server-conf\") pod \"rabbitmq-server-0\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " 
pod="openstack/rabbitmq-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.423325 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-config-data\") pod \"rabbitmq-server-0\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " pod="openstack/rabbitmq-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.423352 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " pod="openstack/rabbitmq-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.423378 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhr9n\" (UniqueName: \"kubernetes.io/projected/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-kube-api-access-zhr9n\") pod \"rabbitmq-server-0\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " pod="openstack/rabbitmq-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.423399 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " pod="openstack/rabbitmq-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.423418 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " pod="openstack/rabbitmq-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.423443 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-pod-info\") pod \"rabbitmq-server-0\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " pod="openstack/rabbitmq-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.423477 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " pod="openstack/rabbitmq-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.423499 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " pod="openstack/rabbitmq-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.423524 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " pod="openstack/rabbitmq-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.426524 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-config-data\") pod \"rabbitmq-server-0\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " pod="openstack/rabbitmq-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.426673 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " pod="openstack/rabbitmq-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.427201 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " pod="openstack/rabbitmq-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.427481 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.427586 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-server-conf\") pod \"rabbitmq-server-0\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " pod="openstack/rabbitmq-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.428589 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " pod="openstack/rabbitmq-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.432046 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-pod-info\") pod \"rabbitmq-server-0\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " pod="openstack/rabbitmq-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.435132 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " pod="openstack/rabbitmq-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.435130 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " pod="openstack/rabbitmq-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.449087 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhr9n\" (UniqueName: \"kubernetes.io/projected/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-kube-api-access-zhr9n\") pod \"rabbitmq-server-0\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " pod="openstack/rabbitmq-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.454794 4901 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " pod="openstack/rabbitmq-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.459686 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " pod="openstack/rabbitmq-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.587418 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.630038 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.631687 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.636950 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.637156 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.637450 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mm74n" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.637619 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.637724 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.637864 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.644856 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.669693 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.844865 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqnvw\" (UniqueName: \"kubernetes.io/projected/942c6932-383e-432a-b927-ff9ec4ac81cb-kube-api-access-mqnvw\") pod \"rabbitmq-cell1-server-0\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.845318 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/942c6932-383e-432a-b927-ff9ec4ac81cb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.845343 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/942c6932-383e-432a-b927-ff9ec4ac81cb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.845374 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/942c6932-383e-432a-b927-ff9ec4ac81cb-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.845400 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/942c6932-383e-432a-b927-ff9ec4ac81cb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.845453 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.845488 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/942c6932-383e-432a-b927-ff9ec4ac81cb-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.845511 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/942c6932-383e-432a-b927-ff9ec4ac81cb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.845558 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/942c6932-383e-432a-b927-ff9ec4ac81cb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.846179 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/942c6932-383e-432a-b927-ff9ec4ac81cb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.846348 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/942c6932-383e-432a-b927-ff9ec4ac81cb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.948780 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/942c6932-383e-432a-b927-ff9ec4ac81cb-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.948826 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/942c6932-383e-432a-b927-ff9ec4ac81cb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.948914 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.948959 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/942c6932-383e-432a-b927-ff9ec4ac81cb-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.948991 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/942c6932-383e-432a-b927-ff9ec4ac81cb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.949040 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/942c6932-383e-432a-b927-ff9ec4ac81cb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.949111 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/942c6932-383e-432a-b927-ff9ec4ac81cb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.949155 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/942c6932-383e-432a-b927-ff9ec4ac81cb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.949192 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/942c6932-383e-432a-b927-ff9ec4ac81cb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.949209 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqnvw\" (UniqueName: \"kubernetes.io/projected/942c6932-383e-432a-b927-ff9ec4ac81cb-kube-api-access-mqnvw\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"942c6932-383e-432a-b927-ff9ec4ac81cb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.949229 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/942c6932-383e-432a-b927-ff9ec4ac81cb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.949484 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.949647 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/942c6932-383e-432a-b927-ff9ec4ac81cb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.949855 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/942c6932-383e-432a-b927-ff9ec4ac81cb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.949919 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/942c6932-383e-432a-b927-ff9ec4ac81cb-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.950692 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/942c6932-383e-432a-b927-ff9ec4ac81cb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.951245 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/942c6932-383e-432a-b927-ff9ec4ac81cb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.958187 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/942c6932-383e-432a-b927-ff9ec4ac81cb-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.958747 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/942c6932-383e-432a-b927-ff9ec4ac81cb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.958905 4901 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/942c6932-383e-432a-b927-ff9ec4ac81cb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.965403 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/942c6932-383e-432a-b927-ff9ec4ac81cb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.967962 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqnvw\" (UniqueName: \"kubernetes.io/projected/942c6932-383e-432a-b927-ff9ec4ac81cb-kube-api-access-mqnvw\") pod \"rabbitmq-cell1-server-0\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.972094 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:54:04 crc kubenswrapper[4901]: I0202 10:54:04.986001 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.092891 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-6v2gd" event={"ID":"9d5bfda5-826b-4d0d-81a6-87d5589f0c78","Type":"ContainerStarted","Data":"a7cafc663f56fa467669b1758494d5e8ed7c6a706f6550b4c150cbe483c9a56e"} Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.200703 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-f9hjt" event={"ID":"984f7ae8-8c3c-41ec-85a4-dd0b31764800","Type":"ContainerStarted","Data":"3eb28519d6186cbcf52a3bf4c06c2f236c704456af8edd924a88007a3c138d0f"} Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.200944 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zbmd6" podUID="ee96d54f-a561-4687-964a-623c5ff8ab88" containerName="registry-server" containerID="cri-o://66b7f71c389d7542e54089f39c6e8f60a03fb1324c6c405ef120ac3572d31787" gracePeriod=2 Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.252641 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.761751 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.763305 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.770362 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.771177 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-289tc" Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.771496 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.772474 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.776978 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 02 10:54:05 crc kubenswrapper[4901]: W0202 10:54:05.778585 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod942c6932_383e_432a_b927_ff9ec4ac81cb.slice/crio-42360ba26c45137b286d35f556b3f1f16ab183c27aa68712ac5f83e021c40657 WatchSource:0}: Error finding container 42360ba26c45137b286d35f556b3f1f16ab183c27aa68712ac5f83e021c40657: Status 404 returned error can't find the container with id 42360ba26c45137b286d35f556b3f1f16ab183c27aa68712ac5f83e021c40657 Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.781512 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.801757 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.858969 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zbmd6" Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.873406 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee96d54f-a561-4687-964a-623c5ff8ab88-utilities\") pod \"ee96d54f-a561-4687-964a-623c5ff8ab88\" (UID: \"ee96d54f-a561-4687-964a-623c5ff8ab88\") " Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.873489 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22dp6\" (UniqueName: \"kubernetes.io/projected/ee96d54f-a561-4687-964a-623c5ff8ab88-kube-api-access-22dp6\") pod \"ee96d54f-a561-4687-964a-623c5ff8ab88\" (UID: \"ee96d54f-a561-4687-964a-623c5ff8ab88\") " Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.873513 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee96d54f-a561-4687-964a-623c5ff8ab88-catalog-content\") pod \"ee96d54f-a561-4687-964a-623c5ff8ab88\" (UID: \"ee96d54f-a561-4687-964a-623c5ff8ab88\") " Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.873849 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/93ecc7a4-4c23-488f-8d75-8fee0246afe4-config-data-default\") pod \"openstack-galera-0\" (UID: \"93ecc7a4-4c23-488f-8d75-8fee0246afe4\") " pod="openstack/openstack-galera-0" Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.873891 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93ecc7a4-4c23-488f-8d75-8fee0246afe4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"93ecc7a4-4c23-488f-8d75-8fee0246afe4\") " pod="openstack/openstack-galera-0" Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.873920 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93ecc7a4-4c23-488f-8d75-8fee0246afe4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"93ecc7a4-4c23-488f-8d75-8fee0246afe4\") " pod="openstack/openstack-galera-0" Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.873949 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txz2f\" (UniqueName: \"kubernetes.io/projected/93ecc7a4-4c23-488f-8d75-8fee0246afe4-kube-api-access-txz2f\") pod \"openstack-galera-0\" (UID: \"93ecc7a4-4c23-488f-8d75-8fee0246afe4\") " pod="openstack/openstack-galera-0" Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.873969 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"93ecc7a4-4c23-488f-8d75-8fee0246afe4\") " pod="openstack/openstack-galera-0" Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.874009 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/93ecc7a4-4c23-488f-8d75-8fee0246afe4-kolla-config\") pod \"openstack-galera-0\" (UID: \"93ecc7a4-4c23-488f-8d75-8fee0246afe4\") " pod="openstack/openstack-galera-0" Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.874023 4901 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/93ecc7a4-4c23-488f-8d75-8fee0246afe4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"93ecc7a4-4c23-488f-8d75-8fee0246afe4\") " pod="openstack/openstack-galera-0" Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.874046 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/93ecc7a4-4c23-488f-8d75-8fee0246afe4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"93ecc7a4-4c23-488f-8d75-8fee0246afe4\") " pod="openstack/openstack-galera-0" Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.876657 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee96d54f-a561-4687-964a-623c5ff8ab88-utilities" (OuterVolumeSpecName: "utilities") pod "ee96d54f-a561-4687-964a-623c5ff8ab88" (UID: "ee96d54f-a561-4687-964a-623c5ff8ab88"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.881113 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee96d54f-a561-4687-964a-623c5ff8ab88-kube-api-access-22dp6" (OuterVolumeSpecName: "kube-api-access-22dp6") pod "ee96d54f-a561-4687-964a-623c5ff8ab88" (UID: "ee96d54f-a561-4687-964a-623c5ff8ab88"). InnerVolumeSpecName "kube-api-access-22dp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.942770 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee96d54f-a561-4687-964a-623c5ff8ab88-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee96d54f-a561-4687-964a-623c5ff8ab88" (UID: "ee96d54f-a561-4687-964a-623c5ff8ab88"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.975639 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/93ecc7a4-4c23-488f-8d75-8fee0246afe4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"93ecc7a4-4c23-488f-8d75-8fee0246afe4\") " pod="openstack/openstack-galera-0" Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.975709 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/93ecc7a4-4c23-488f-8d75-8fee0246afe4-config-data-default\") pod \"openstack-galera-0\" (UID: \"93ecc7a4-4c23-488f-8d75-8fee0246afe4\") " pod="openstack/openstack-galera-0" Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.975741 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93ecc7a4-4c23-488f-8d75-8fee0246afe4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"93ecc7a4-4c23-488f-8d75-8fee0246afe4\") " pod="openstack/openstack-galera-0" Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.975777 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93ecc7a4-4c23-488f-8d75-8fee0246afe4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"93ecc7a4-4c23-488f-8d75-8fee0246afe4\") " pod="openstack/openstack-galera-0" Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.975806 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txz2f\" (UniqueName: \"kubernetes.io/projected/93ecc7a4-4c23-488f-8d75-8fee0246afe4-kube-api-access-txz2f\") pod \"openstack-galera-0\" (UID: \"93ecc7a4-4c23-488f-8d75-8fee0246afe4\") " pod="openstack/openstack-galera-0" Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.975828 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"93ecc7a4-4c23-488f-8d75-8fee0246afe4\") " pod="openstack/openstack-galera-0" Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.975943 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/93ecc7a4-4c23-488f-8d75-8fee0246afe4-kolla-config\") pod \"openstack-galera-0\" (UID: \"93ecc7a4-4c23-488f-8d75-8fee0246afe4\") " pod="openstack/openstack-galera-0" Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.975960 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/93ecc7a4-4c23-488f-8d75-8fee0246afe4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"93ecc7a4-4c23-488f-8d75-8fee0246afe4\") " pod="openstack/openstack-galera-0" Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.975999 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee96d54f-a561-4687-964a-623c5ff8ab88-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.976011 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22dp6\" (UniqueName: \"kubernetes.io/projected/ee96d54f-a561-4687-964a-623c5ff8ab88-kube-api-access-22dp6\") on node \"crc\" 
DevicePath \"\"" Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.976023 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee96d54f-a561-4687-964a-623c5ff8ab88-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.976257 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/93ecc7a4-4c23-488f-8d75-8fee0246afe4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"93ecc7a4-4c23-488f-8d75-8fee0246afe4\") " pod="openstack/openstack-galera-0" Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.976708 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"93ecc7a4-4c23-488f-8d75-8fee0246afe4\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/openstack-galera-0" Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.977153 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/93ecc7a4-4c23-488f-8d75-8fee0246afe4-kolla-config\") pod \"openstack-galera-0\" (UID: \"93ecc7a4-4c23-488f-8d75-8fee0246afe4\") " pod="openstack/openstack-galera-0" Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.977222 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/93ecc7a4-4c23-488f-8d75-8fee0246afe4-config-data-default\") pod \"openstack-galera-0\" (UID: \"93ecc7a4-4c23-488f-8d75-8fee0246afe4\") " pod="openstack/openstack-galera-0" Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.978392 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93ecc7a4-4c23-488f-8d75-8fee0246afe4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"93ecc7a4-4c23-488f-8d75-8fee0246afe4\") " pod="openstack/openstack-galera-0" Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.981500 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/93ecc7a4-4c23-488f-8d75-8fee0246afe4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"93ecc7a4-4c23-488f-8d75-8fee0246afe4\") " pod="openstack/openstack-galera-0" Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.984718 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93ecc7a4-4c23-488f-8d75-8fee0246afe4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"93ecc7a4-4c23-488f-8d75-8fee0246afe4\") " pod="openstack/openstack-galera-0" Feb 02 10:54:05 crc kubenswrapper[4901]: I0202 10:54:05.993506 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txz2f\" (UniqueName: \"kubernetes.io/projected/93ecc7a4-4c23-488f-8d75-8fee0246afe4-kube-api-access-txz2f\") pod \"openstack-galera-0\" (UID: \"93ecc7a4-4c23-488f-8d75-8fee0246afe4\") " pod="openstack/openstack-galera-0" Feb 02 10:54:06 crc kubenswrapper[4901]: I0202 10:54:06.004863 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"93ecc7a4-4c23-488f-8d75-8fee0246afe4\") " 
pod="openstack/openstack-galera-0" Feb 02 10:54:06 crc kubenswrapper[4901]: I0202 10:54:06.161093 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 02 10:54:06 crc kubenswrapper[4901]: I0202 10:54:06.231619 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"942c6932-383e-432a-b927-ff9ec4ac81cb","Type":"ContainerStarted","Data":"42360ba26c45137b286d35f556b3f1f16ab183c27aa68712ac5f83e021c40657"} Feb 02 10:54:06 crc kubenswrapper[4901]: I0202 10:54:06.238809 4901 generic.go:334] "Generic (PLEG): container finished" podID="ee96d54f-a561-4687-964a-623c5ff8ab88" containerID="66b7f71c389d7542e54089f39c6e8f60a03fb1324c6c405ef120ac3572d31787" exitCode=0 Feb 02 10:54:06 crc kubenswrapper[4901]: I0202 10:54:06.238930 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zbmd6" Feb 02 10:54:06 crc kubenswrapper[4901]: I0202 10:54:06.238923 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zbmd6" event={"ID":"ee96d54f-a561-4687-964a-623c5ff8ab88","Type":"ContainerDied","Data":"66b7f71c389d7542e54089f39c6e8f60a03fb1324c6c405ef120ac3572d31787"} Feb 02 10:54:06 crc kubenswrapper[4901]: I0202 10:54:06.239088 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zbmd6" event={"ID":"ee96d54f-a561-4687-964a-623c5ff8ab88","Type":"ContainerDied","Data":"e62d4e8011d7d64db5155969f751d3dbcd8133e94d992ff872eca3624fe2e3a4"} Feb 02 10:54:06 crc kubenswrapper[4901]: I0202 10:54:06.239142 4901 scope.go:117] "RemoveContainer" containerID="66b7f71c389d7542e54089f39c6e8f60a03fb1324c6c405ef120ac3572d31787" Feb 02 10:54:06 crc kubenswrapper[4901]: I0202 10:54:06.240978 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba","Type":"ContainerStarted","Data":"4f4405fae4718e8fe058b899df307f15ba42dffc224ee3c84fbfec9dc833150f"} Feb 02 10:54:06 crc kubenswrapper[4901]: I0202 10:54:06.284443 4901 scope.go:117] "RemoveContainer" containerID="1a1fdb9a25c29cccb1fb6ca28ee065045483928c3870c5d7485ee904e0539330" Feb 02 10:54:06 crc kubenswrapper[4901]: I0202 10:54:06.292921 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zbmd6"] Feb 02 10:54:06 crc kubenswrapper[4901]: I0202 10:54:06.301647 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zbmd6"] Feb 02 10:54:06 crc kubenswrapper[4901]: E0202 10:54:06.338807 4901 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee96d54f_a561_4687_964a_623c5ff8ab88.slice\": RecentStats: unable to find data in memory cache]" Feb 02 10:54:06 crc kubenswrapper[4901]: I0202 10:54:06.484070 4901 scope.go:117] "RemoveContainer" containerID="ad77005c4f0e1214519aa6d7660b1aa2431d3e3f88900c984ec31e4c06d93771" Feb 02 10:54:06 crc kubenswrapper[4901]: I0202 10:54:06.524188 4901 scope.go:117] "RemoveContainer" containerID="66b7f71c389d7542e54089f39c6e8f60a03fb1324c6c405ef120ac3572d31787" Feb 02 10:54:06 crc kubenswrapper[4901]: E0202 10:54:06.526607 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"66b7f71c389d7542e54089f39c6e8f60a03fb1324c6c405ef120ac3572d31787\": container with ID starting with 66b7f71c389d7542e54089f39c6e8f60a03fb1324c6c405ef120ac3572d31787 not found: ID does not exist" containerID="66b7f71c389d7542e54089f39c6e8f60a03fb1324c6c405ef120ac3572d31787" Feb 02 10:54:06 crc kubenswrapper[4901]: I0202 10:54:06.526641 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66b7f71c389d7542e54089f39c6e8f60a03fb1324c6c405ef120ac3572d31787"} err="failed to get container status \"66b7f71c389d7542e54089f39c6e8f60a03fb1324c6c405ef120ac3572d31787\": rpc error: code = NotFound desc = could not find container \"66b7f71c389d7542e54089f39c6e8f60a03fb1324c6c405ef120ac3572d31787\": container with ID starting with 66b7f71c389d7542e54089f39c6e8f60a03fb1324c6c405ef120ac3572d31787 not found: ID does not exist" Feb 02 10:54:06 crc kubenswrapper[4901]: I0202 10:54:06.526663 4901 scope.go:117] "RemoveContainer" containerID="1a1fdb9a25c29cccb1fb6ca28ee065045483928c3870c5d7485ee904e0539330" Feb 02 10:54:06 crc kubenswrapper[4901]: E0202 10:54:06.530309 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a1fdb9a25c29cccb1fb6ca28ee065045483928c3870c5d7485ee904e0539330\": container with ID starting with 1a1fdb9a25c29cccb1fb6ca28ee065045483928c3870c5d7485ee904e0539330 not found: ID does not exist" containerID="1a1fdb9a25c29cccb1fb6ca28ee065045483928c3870c5d7485ee904e0539330" Feb 02 10:54:06 crc kubenswrapper[4901]: I0202 10:54:06.530340 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a1fdb9a25c29cccb1fb6ca28ee065045483928c3870c5d7485ee904e0539330"} err="failed to get container status \"1a1fdb9a25c29cccb1fb6ca28ee065045483928c3870c5d7485ee904e0539330\": rpc error: code = NotFound desc = could not find container \"1a1fdb9a25c29cccb1fb6ca28ee065045483928c3870c5d7485ee904e0539330\": container with ID starting with 1a1fdb9a25c29cccb1fb6ca28ee065045483928c3870c5d7485ee904e0539330 not found: ID does not exist" Feb 02 10:54:06 crc kubenswrapper[4901]: I0202 10:54:06.530358 4901 scope.go:117] "RemoveContainer" containerID="ad77005c4f0e1214519aa6d7660b1aa2431d3e3f88900c984ec31e4c06d93771" Feb 02 10:54:06 crc kubenswrapper[4901]: E0202 10:54:06.537746 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad77005c4f0e1214519aa6d7660b1aa2431d3e3f88900c984ec31e4c06d93771\": container with ID starting with ad77005c4f0e1214519aa6d7660b1aa2431d3e3f88900c984ec31e4c06d93771 not found: ID does not exist" containerID="ad77005c4f0e1214519aa6d7660b1aa2431d3e3f88900c984ec31e4c06d93771" Feb 02 10:54:06 crc kubenswrapper[4901]: I0202 10:54:06.537801 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad77005c4f0e1214519aa6d7660b1aa2431d3e3f88900c984ec31e4c06d93771"} err="failed to get container status \"ad77005c4f0e1214519aa6d7660b1aa2431d3e3f88900c984ec31e4c06d93771\": rpc error: code = NotFound desc = could not find container \"ad77005c4f0e1214519aa6d7660b1aa2431d3e3f88900c984ec31e4c06d93771\": container with ID starting with ad77005c4f0e1214519aa6d7660b1aa2431d3e3f88900c984ec31e4c06d93771 not found: ID does not exist" Feb 02 10:54:06 crc kubenswrapper[4901]: I0202 10:54:06.870388 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 
10:54:07.187829 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 02 10:54:07 crc kubenswrapper[4901]: E0202 10:54:07.188695 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee96d54f-a561-4687-964a-623c5ff8ab88" containerName="extract-utilities" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.188718 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee96d54f-a561-4687-964a-623c5ff8ab88" containerName="extract-utilities" Feb 02 10:54:07 crc kubenswrapper[4901]: E0202 10:54:07.188747 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee96d54f-a561-4687-964a-623c5ff8ab88" containerName="registry-server" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.188757 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee96d54f-a561-4687-964a-623c5ff8ab88" containerName="registry-server" Feb 02 10:54:07 crc kubenswrapper[4901]: E0202 10:54:07.188770 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee96d54f-a561-4687-964a-623c5ff8ab88" containerName="extract-content" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.188804 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee96d54f-a561-4687-964a-623c5ff8ab88" containerName="extract-content" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.189028 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee96d54f-a561-4687-964a-623c5ff8ab88" containerName="registry-server" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.190077 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.201393 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.202440 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.202595 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-8tk68" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.202617 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.208070 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.275005 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"93ecc7a4-4c23-488f-8d75-8fee0246afe4","Type":"ContainerStarted","Data":"836cb4507f5dffcdf4345fe074ef971f87399e2e7a9ee7655c9e8591e3722e5a"} Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.327706 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c56659df-71f1-4dbb-819f-b71277070b0e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"c56659df-71f1-4dbb-819f-b71277070b0e\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.327807 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c56659df-71f1-4dbb-819f-b71277070b0e-kolla-config\") pod 
\"openstack-cell1-galera-0\" (UID: \"c56659df-71f1-4dbb-819f-b71277070b0e\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.327853 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c56659df-71f1-4dbb-819f-b71277070b0e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"c56659df-71f1-4dbb-819f-b71277070b0e\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.327879 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"c56659df-71f1-4dbb-819f-b71277070b0e\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.327896 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c56659df-71f1-4dbb-819f-b71277070b0e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"c56659df-71f1-4dbb-819f-b71277070b0e\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.327938 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtcxw\" (UniqueName: \"kubernetes.io/projected/c56659df-71f1-4dbb-819f-b71277070b0e-kube-api-access-qtcxw\") pod \"openstack-cell1-galera-0\" (UID: \"c56659df-71f1-4dbb-819f-b71277070b0e\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.327974 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c56659df-71f1-4dbb-819f-b71277070b0e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"c56659df-71f1-4dbb-819f-b71277070b0e\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.328055 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c56659df-71f1-4dbb-819f-b71277070b0e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"c56659df-71f1-4dbb-819f-b71277070b0e\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.429586 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c56659df-71f1-4dbb-819f-b71277070b0e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"c56659df-71f1-4dbb-819f-b71277070b0e\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.429650 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c56659df-71f1-4dbb-819f-b71277070b0e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"c56659df-71f1-4dbb-819f-b71277070b0e\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.429746 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c56659df-71f1-4dbb-819f-b71277070b0e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"c56659df-71f1-4dbb-819f-b71277070b0e\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.429800 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c56659df-71f1-4dbb-819f-b71277070b0e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"c56659df-71f1-4dbb-819f-b71277070b0e\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.429827 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"c56659df-71f1-4dbb-819f-b71277070b0e\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.429864 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtcxw\" (UniqueName: \"kubernetes.io/projected/c56659df-71f1-4dbb-819f-b71277070b0e-kube-api-access-qtcxw\") pod \"openstack-cell1-galera-0\" (UID: \"c56659df-71f1-4dbb-819f-b71277070b0e\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.429916 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c56659df-71f1-4dbb-819f-b71277070b0e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"c56659df-71f1-4dbb-819f-b71277070b0e\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.429969 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c56659df-71f1-4dbb-819f-b71277070b0e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"c56659df-71f1-4dbb-819f-b71277070b0e\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.431888 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c56659df-71f1-4dbb-819f-b71277070b0e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"c56659df-71f1-4dbb-819f-b71277070b0e\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.432281 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"c56659df-71f1-4dbb-819f-b71277070b0e\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-cell1-galera-0" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.435130 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c56659df-71f1-4dbb-819f-b71277070b0e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"c56659df-71f1-4dbb-819f-b71277070b0e\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.438881 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c56659df-71f1-4dbb-819f-b71277070b0e-kolla-config\") pod 
\"openstack-cell1-galera-0\" (UID: \"c56659df-71f1-4dbb-819f-b71277070b0e\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.439319 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c56659df-71f1-4dbb-819f-b71277070b0e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"c56659df-71f1-4dbb-819f-b71277070b0e\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.448650 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c56659df-71f1-4dbb-819f-b71277070b0e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"c56659df-71f1-4dbb-819f-b71277070b0e\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.458332 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c56659df-71f1-4dbb-819f-b71277070b0e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"c56659df-71f1-4dbb-819f-b71277070b0e\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.472800 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.477689 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtcxw\" (UniqueName: \"kubernetes.io/projected/c56659df-71f1-4dbb-819f-b71277070b0e-kube-api-access-qtcxw\") pod \"openstack-cell1-galera-0\" (UID: \"c56659df-71f1-4dbb-819f-b71277070b0e\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.491230 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.491339 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.497684 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-779sl" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.497914 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.498109 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.500283 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"c56659df-71f1-4dbb-819f-b71277070b0e\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.533010 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.634088 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7551d06-91c3-4652-b042-cf8080c36ce2-combined-ca-bundle\") pod \"memcached-0\" (UID: \"e7551d06-91c3-4652-b042-cf8080c36ce2\") " pod="openstack/memcached-0" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.634226 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7551d06-91c3-4652-b042-cf8080c36ce2-memcached-tls-certs\") pod \"memcached-0\" (UID: \"e7551d06-91c3-4652-b042-cf8080c36ce2\") " pod="openstack/memcached-0" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.634260 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e7551d06-91c3-4652-b042-cf8080c36ce2-kolla-config\") pod \"memcached-0\" (UID: \"e7551d06-91c3-4652-b042-cf8080c36ce2\") " pod="openstack/memcached-0" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.634319 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e7551d06-91c3-4652-b042-cf8080c36ce2-config-data\") pod \"memcached-0\" (UID: \"e7551d06-91c3-4652-b042-cf8080c36ce2\") " pod="openstack/memcached-0" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.634347 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6fm5\" (UniqueName: \"kubernetes.io/projected/e7551d06-91c3-4652-b042-cf8080c36ce2-kube-api-access-f6fm5\") pod \"memcached-0\" (UID: \"e7551d06-91c3-4652-b042-cf8080c36ce2\") " pod="openstack/memcached-0" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.716695 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee96d54f-a561-4687-964a-623c5ff8ab88" path="/var/lib/kubelet/pods/ee96d54f-a561-4687-964a-623c5ff8ab88/volumes" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.737703 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7551d06-91c3-4652-b042-cf8080c36ce2-memcached-tls-certs\") pod \"memcached-0\" (UID: \"e7551d06-91c3-4652-b042-cf8080c36ce2\") " pod="openstack/memcached-0" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.737786 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e7551d06-91c3-4652-b042-cf8080c36ce2-kolla-config\") pod \"memcached-0\" (UID: \"e7551d06-91c3-4652-b042-cf8080c36ce2\") " pod="openstack/memcached-0" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.737878 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e7551d06-91c3-4652-b042-cf8080c36ce2-config-data\") pod \"memcached-0\" (UID: \"e7551d06-91c3-4652-b042-cf8080c36ce2\") " pod="openstack/memcached-0" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.737920 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6fm5\" (UniqueName: \"kubernetes.io/projected/e7551d06-91c3-4652-b042-cf8080c36ce2-kube-api-access-f6fm5\") pod \"memcached-0\" 
(UID: \"e7551d06-91c3-4652-b042-cf8080c36ce2\") " pod="openstack/memcached-0" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.737953 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7551d06-91c3-4652-b042-cf8080c36ce2-combined-ca-bundle\") pod \"memcached-0\" (UID: \"e7551d06-91c3-4652-b042-cf8080c36ce2\") " pod="openstack/memcached-0" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.740832 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e7551d06-91c3-4652-b042-cf8080c36ce2-kolla-config\") pod \"memcached-0\" (UID: \"e7551d06-91c3-4652-b042-cf8080c36ce2\") " pod="openstack/memcached-0" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.741779 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e7551d06-91c3-4652-b042-cf8080c36ce2-config-data\") pod \"memcached-0\" (UID: \"e7551d06-91c3-4652-b042-cf8080c36ce2\") " pod="openstack/memcached-0" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.748880 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7551d06-91c3-4652-b042-cf8080c36ce2-combined-ca-bundle\") pod \"memcached-0\" (UID: \"e7551d06-91c3-4652-b042-cf8080c36ce2\") " pod="openstack/memcached-0" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.756019 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7551d06-91c3-4652-b042-cf8080c36ce2-memcached-tls-certs\") pod \"memcached-0\" (UID: \"e7551d06-91c3-4652-b042-cf8080c36ce2\") " pod="openstack/memcached-0" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.772306 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6fm5\" (UniqueName: \"kubernetes.io/projected/e7551d06-91c3-4652-b042-cf8080c36ce2-kube-api-access-f6fm5\") pod \"memcached-0\" (UID: \"e7551d06-91c3-4652-b042-cf8080c36ce2\") " pod="openstack/memcached-0" Feb 02 10:54:07 crc kubenswrapper[4901]: I0202 10:54:07.893729 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 02 10:54:09 crc kubenswrapper[4901]: I0202 10:54:09.232436 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bdrnz"] Feb 02 10:54:09 crc kubenswrapper[4901]: I0202 10:54:09.234151 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bdrnz" Feb 02 10:54:09 crc kubenswrapper[4901]: I0202 10:54:09.251016 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 10:54:09 crc kubenswrapper[4901]: I0202 10:54:09.252075 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 02 10:54:09 crc kubenswrapper[4901]: I0202 10:54:09.255123 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-lrqjm" Feb 02 10:54:09 crc kubenswrapper[4901]: I0202 10:54:09.263054 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bdrnz"] Feb 02 10:54:09 crc kubenswrapper[4901]: I0202 10:54:09.272097 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 10:54:09 crc kubenswrapper[4901]: I0202 10:54:09.388126 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9t4h\" (UniqueName: \"kubernetes.io/projected/de467848-25a5-4ed7-b072-e11de1d42561-kube-api-access-h9t4h\") pod \"redhat-marketplace-bdrnz\" (UID: \"de467848-25a5-4ed7-b072-e11de1d42561\") " pod="openshift-marketplace/redhat-marketplace-bdrnz" Feb 02 10:54:09 crc kubenswrapper[4901]: I0202 10:54:09.388209 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de467848-25a5-4ed7-b072-e11de1d42561-utilities\") pod \"redhat-marketplace-bdrnz\" (UID: \"de467848-25a5-4ed7-b072-e11de1d42561\") " pod="openshift-marketplace/redhat-marketplace-bdrnz" Feb 02 10:54:09 crc kubenswrapper[4901]: I0202 10:54:09.388257 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de467848-25a5-4ed7-b072-e11de1d42561-catalog-content\") pod \"redhat-marketplace-bdrnz\" (UID: \"de467848-25a5-4ed7-b072-e11de1d42561\") " pod="openshift-marketplace/redhat-marketplace-bdrnz" Feb 02 10:54:09 crc kubenswrapper[4901]: I0202 10:54:09.388300 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b58hw\" (UniqueName: \"kubernetes.io/projected/e251c5c1-5608-4bf0-8bfa-c056084f0bc0-kube-api-access-b58hw\") pod \"kube-state-metrics-0\" (UID: \"e251c5c1-5608-4bf0-8bfa-c056084f0bc0\") " pod="openstack/kube-state-metrics-0" Feb 02 10:54:09 crc kubenswrapper[4901]: I0202 10:54:09.490216 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9t4h\" (UniqueName: \"kubernetes.io/projected/de467848-25a5-4ed7-b072-e11de1d42561-kube-api-access-h9t4h\") pod \"redhat-marketplace-bdrnz\" (UID: \"de467848-25a5-4ed7-b072-e11de1d42561\") " pod="openshift-marketplace/redhat-marketplace-bdrnz" Feb 02 10:54:09 crc kubenswrapper[4901]: I0202 10:54:09.490310 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de467848-25a5-4ed7-b072-e11de1d42561-utilities\") pod \"redhat-marketplace-bdrnz\" (UID: \"de467848-25a5-4ed7-b072-e11de1d42561\") " pod="openshift-marketplace/redhat-marketplace-bdrnz" Feb 02 10:54:09 crc kubenswrapper[4901]: I0202 10:54:09.490356 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de467848-25a5-4ed7-b072-e11de1d42561-catalog-content\") pod \"redhat-marketplace-bdrnz\" (UID: \"de467848-25a5-4ed7-b072-e11de1d42561\") " pod="openshift-marketplace/redhat-marketplace-bdrnz" Feb 02 10:54:09 crc kubenswrapper[4901]: I0202 10:54:09.490430 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-b58hw\" (UniqueName: \"kubernetes.io/projected/e251c5c1-5608-4bf0-8bfa-c056084f0bc0-kube-api-access-b58hw\") pod \"kube-state-metrics-0\" (UID: \"e251c5c1-5608-4bf0-8bfa-c056084f0bc0\") " pod="openstack/kube-state-metrics-0" Feb 02 10:54:09 crc kubenswrapper[4901]: I0202 10:54:09.490923 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de467848-25a5-4ed7-b072-e11de1d42561-utilities\") pod \"redhat-marketplace-bdrnz\" (UID: \"de467848-25a5-4ed7-b072-e11de1d42561\") " pod="openshift-marketplace/redhat-marketplace-bdrnz" Feb 02 10:54:09 crc kubenswrapper[4901]: I0202 10:54:09.491005 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de467848-25a5-4ed7-b072-e11de1d42561-catalog-content\") pod \"redhat-marketplace-bdrnz\" (UID: \"de467848-25a5-4ed7-b072-e11de1d42561\") " pod="openshift-marketplace/redhat-marketplace-bdrnz" Feb 02 10:54:09 crc kubenswrapper[4901]: I0202 10:54:09.513260 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9t4h\" (UniqueName: \"kubernetes.io/projected/de467848-25a5-4ed7-b072-e11de1d42561-kube-api-access-h9t4h\") pod \"redhat-marketplace-bdrnz\" (UID: \"de467848-25a5-4ed7-b072-e11de1d42561\") " pod="openshift-marketplace/redhat-marketplace-bdrnz" Feb 02 10:54:09 crc kubenswrapper[4901]: I0202 10:54:09.516575 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b58hw\" (UniqueName: \"kubernetes.io/projected/e251c5c1-5608-4bf0-8bfa-c056084f0bc0-kube-api-access-b58hw\") pod \"kube-state-metrics-0\" (UID: \"e251c5c1-5608-4bf0-8bfa-c056084f0bc0\") " pod="openstack/kube-state-metrics-0" Feb 02 10:54:09 crc kubenswrapper[4901]: I0202 10:54:09.569314 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bdrnz" Feb 02 10:54:09 crc kubenswrapper[4901]: I0202 10:54:09.586674 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 02 10:54:12 crc kubenswrapper[4901]: I0202 10:54:12.425084 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5mfk4"] Feb 02 10:54:12 crc kubenswrapper[4901]: I0202 10:54:12.432860 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5mfk4" Feb 02 10:54:12 crc kubenswrapper[4901]: I0202 10:54:12.446512 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5mfk4"] Feb 02 10:54:12 crc kubenswrapper[4901]: I0202 10:54:12.457290 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-247cv\" (UniqueName: \"kubernetes.io/projected/715a873e-0260-4254-86a6-203a2e08e36d-kube-api-access-247cv\") pod \"redhat-operators-5mfk4\" (UID: \"715a873e-0260-4254-86a6-203a2e08e36d\") " pod="openshift-marketplace/redhat-operators-5mfk4" Feb 02 10:54:12 crc kubenswrapper[4901]: I0202 10:54:12.464773 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/715a873e-0260-4254-86a6-203a2e08e36d-utilities\") pod \"redhat-operators-5mfk4\" (UID: \"715a873e-0260-4254-86a6-203a2e08e36d\") " pod="openshift-marketplace/redhat-operators-5mfk4" Feb 02 10:54:12 crc kubenswrapper[4901]: I0202 10:54:12.465091 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/715a873e-0260-4254-86a6-203a2e08e36d-catalog-content\") pod \"redhat-operators-5mfk4\" (UID: \"715a873e-0260-4254-86a6-203a2e08e36d\") " pod="openshift-marketplace/redhat-operators-5mfk4" Feb 02 10:54:12 crc kubenswrapper[4901]: I0202 10:54:12.566759 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-247cv\" (UniqueName: \"kubernetes.io/projected/715a873e-0260-4254-86a6-203a2e08e36d-kube-api-access-247cv\") pod \"redhat-operators-5mfk4\" (UID: \"715a873e-0260-4254-86a6-203a2e08e36d\") " pod="openshift-marketplace/redhat-operators-5mfk4" Feb 02 10:54:12 crc kubenswrapper[4901]: I0202 10:54:12.566890 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/715a873e-0260-4254-86a6-203a2e08e36d-utilities\") pod \"redhat-operators-5mfk4\" (UID: \"715a873e-0260-4254-86a6-203a2e08e36d\") " pod="openshift-marketplace/redhat-operators-5mfk4" Feb 02 10:54:12 crc kubenswrapper[4901]: I0202 10:54:12.566963 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/715a873e-0260-4254-86a6-203a2e08e36d-catalog-content\") pod \"redhat-operators-5mfk4\" (UID: \"715a873e-0260-4254-86a6-203a2e08e36d\") " pod="openshift-marketplace/redhat-operators-5mfk4" Feb 02 10:54:12 crc kubenswrapper[4901]: I0202 10:54:12.567558 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/715a873e-0260-4254-86a6-203a2e08e36d-catalog-content\") pod \"redhat-operators-5mfk4\" (UID: \"715a873e-0260-4254-86a6-203a2e08e36d\") " pod="openshift-marketplace/redhat-operators-5mfk4" Feb 02 10:54:12 crc kubenswrapper[4901]: I0202 10:54:12.567713 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/715a873e-0260-4254-86a6-203a2e08e36d-utilities\") pod \"redhat-operators-5mfk4\" (UID: \"715a873e-0260-4254-86a6-203a2e08e36d\") " pod="openshift-marketplace/redhat-operators-5mfk4" Feb 02 10:54:12 crc kubenswrapper[4901]: I0202 10:54:12.604032 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-247cv\" (UniqueName: \"kubernetes.io/projected/715a873e-0260-4254-86a6-203a2e08e36d-kube-api-access-247cv\") pod \"redhat-operators-5mfk4\" (UID: \"715a873e-0260-4254-86a6-203a2e08e36d\") " pod="openshift-marketplace/redhat-operators-5mfk4" Feb 02 10:54:12 crc kubenswrapper[4901]: I0202 10:54:12.756224 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5mfk4" Feb 02 10:54:12 crc kubenswrapper[4901]: I0202 10:54:12.957161 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-rtv7m"] Feb 02 10:54:12 crc kubenswrapper[4901]: I0202 10:54:12.958818 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rtv7m" Feb 02 10:54:12 crc kubenswrapper[4901]: I0202 10:54:12.961895 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-nhqz5" Feb 02 10:54:12 crc kubenswrapper[4901]: I0202 10:54:12.962330 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 02 10:54:12 crc kubenswrapper[4901]: I0202 10:54:12.962624 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 02 10:54:12 crc kubenswrapper[4901]: I0202 10:54:12.973369 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rtv7m"] Feb 02 10:54:12 crc kubenswrapper[4901]: I0202 10:54:12.974528 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7-scripts\") pod \"ovn-controller-rtv7m\" (UID: \"bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7\") " pod="openstack/ovn-controller-rtv7m" Feb 02 10:54:12 crc kubenswrapper[4901]: I0202 10:54:12.974600 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7-var-run-ovn\") pod \"ovn-controller-rtv7m\" (UID: \"bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7\") " pod="openstack/ovn-controller-rtv7m" Feb 02 10:54:12 crc kubenswrapper[4901]: I0202 10:54:12.974641 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7-ovn-controller-tls-certs\") pod \"ovn-controller-rtv7m\" (UID: \"bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7\") " pod="openstack/ovn-controller-rtv7m" Feb 02 10:54:12 crc kubenswrapper[4901]: I0202 10:54:12.974719 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7-var-run\") pod \"ovn-controller-rtv7m\" (UID: \"bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7\") " pod="openstack/ovn-controller-rtv7m" Feb 02 10:54:12 crc kubenswrapper[4901]: I0202 10:54:12.974744 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7-combined-ca-bundle\") pod \"ovn-controller-rtv7m\" (UID: \"bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7\") " pod="openstack/ovn-controller-rtv7m" Feb 02 10:54:12 crc kubenswrapper[4901]: I0202 10:54:12.974805 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7-var-log-ovn\") pod \"ovn-controller-rtv7m\" (UID: \"bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7\") " pod="openstack/ovn-controller-rtv7m" Feb 02 10:54:12 crc kubenswrapper[4901]: I0202 10:54:12.974862 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwb88\" (UniqueName: \"kubernetes.io/projected/bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7-kube-api-access-lwb88\") pod \"ovn-controller-rtv7m\" (UID: \"bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7\") " pod="openstack/ovn-controller-rtv7m" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.056397 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-ckvwh"] Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.061516 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-ckvwh" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.076594 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8984971d-92d8-4bf2-b07f-f8af49f67ece-scripts\") pod \"ovn-controller-ovs-ckvwh\" (UID: \"8984971d-92d8-4bf2-b07f-f8af49f67ece\") " pod="openstack/ovn-controller-ovs-ckvwh" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.076670 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7-var-run\") pod \"ovn-controller-rtv7m\" (UID: \"bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7\") " pod="openstack/ovn-controller-rtv7m" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.076697 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7-combined-ca-bundle\") pod \"ovn-controller-rtv7m\" (UID: \"bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7\") " pod="openstack/ovn-controller-rtv7m" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.076746 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7-var-log-ovn\") pod \"ovn-controller-rtv7m\" (UID: \"bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7\") " pod="openstack/ovn-controller-rtv7m" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.076788 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjd6t\" (UniqueName: \"kubernetes.io/projected/8984971d-92d8-4bf2-b07f-f8af49f67ece-kube-api-access-kjd6t\") pod \"ovn-controller-ovs-ckvwh\" (UID: \"8984971d-92d8-4bf2-b07f-f8af49f67ece\") " pod="openstack/ovn-controller-ovs-ckvwh" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.076821 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8984971d-92d8-4bf2-b07f-f8af49f67ece-var-log\") pod \"ovn-controller-ovs-ckvwh\" (UID: \"8984971d-92d8-4bf2-b07f-f8af49f67ece\") " pod="openstack/ovn-controller-ovs-ckvwh" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.076847 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwb88\" (UniqueName: \"kubernetes.io/projected/bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7-kube-api-access-lwb88\") pod 
\"ovn-controller-rtv7m\" (UID: \"bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7\") " pod="openstack/ovn-controller-rtv7m" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.076887 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8984971d-92d8-4bf2-b07f-f8af49f67ece-etc-ovs\") pod \"ovn-controller-ovs-ckvwh\" (UID: \"8984971d-92d8-4bf2-b07f-f8af49f67ece\") " pod="openstack/ovn-controller-ovs-ckvwh" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.076920 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7-scripts\") pod \"ovn-controller-rtv7m\" (UID: \"bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7\") " pod="openstack/ovn-controller-rtv7m" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.076944 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7-var-run-ovn\") pod \"ovn-controller-rtv7m\" (UID: \"bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7\") " pod="openstack/ovn-controller-rtv7m" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.076978 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7-ovn-controller-tls-certs\") pod \"ovn-controller-rtv7m\" (UID: \"bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7\") " pod="openstack/ovn-controller-rtv7m" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.077008 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8984971d-92d8-4bf2-b07f-f8af49f67ece-var-lib\") pod \"ovn-controller-ovs-ckvwh\" (UID: \"8984971d-92d8-4bf2-b07f-f8af49f67ece\") " pod="openstack/ovn-controller-ovs-ckvwh" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.077033 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8984971d-92d8-4bf2-b07f-f8af49f67ece-var-run\") pod \"ovn-controller-ovs-ckvwh\" (UID: \"8984971d-92d8-4bf2-b07f-f8af49f67ece\") " pod="openstack/ovn-controller-ovs-ckvwh" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.077835 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7-var-run\") pod \"ovn-controller-rtv7m\" (UID: \"bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7\") " pod="openstack/ovn-controller-rtv7m" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.077992 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7-var-run-ovn\") pod \"ovn-controller-rtv7m\" (UID: \"bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7\") " pod="openstack/ovn-controller-rtv7m" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.078306 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7-var-log-ovn\") pod \"ovn-controller-rtv7m\" (UID: \"bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7\") " pod="openstack/ovn-controller-rtv7m" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.080183 4901 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7-scripts\") pod \"ovn-controller-rtv7m\" (UID: \"bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7\") " pod="openstack/ovn-controller-rtv7m" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.081492 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-ckvwh"] Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.083388 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7-ovn-controller-tls-certs\") pod \"ovn-controller-rtv7m\" (UID: \"bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7\") " pod="openstack/ovn-controller-rtv7m" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.089679 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7-combined-ca-bundle\") pod \"ovn-controller-rtv7m\" (UID: \"bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7\") " pod="openstack/ovn-controller-rtv7m" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.116381 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwb88\" (UniqueName: \"kubernetes.io/projected/bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7-kube-api-access-lwb88\") pod \"ovn-controller-rtv7m\" (UID: \"bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7\") " pod="openstack/ovn-controller-rtv7m" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.178960 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8984971d-92d8-4bf2-b07f-f8af49f67ece-etc-ovs\") pod \"ovn-controller-ovs-ckvwh\" (UID: \"8984971d-92d8-4bf2-b07f-f8af49f67ece\") " pod="openstack/ovn-controller-ovs-ckvwh" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.179037 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8984971d-92d8-4bf2-b07f-f8af49f67ece-var-lib\") pod \"ovn-controller-ovs-ckvwh\" (UID: \"8984971d-92d8-4bf2-b07f-f8af49f67ece\") " pod="openstack/ovn-controller-ovs-ckvwh" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.179097 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8984971d-92d8-4bf2-b07f-f8af49f67ece-var-run\") pod \"ovn-controller-ovs-ckvwh\" (UID: \"8984971d-92d8-4bf2-b07f-f8af49f67ece\") " pod="openstack/ovn-controller-ovs-ckvwh" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.179160 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8984971d-92d8-4bf2-b07f-f8af49f67ece-scripts\") pod \"ovn-controller-ovs-ckvwh\" (UID: \"8984971d-92d8-4bf2-b07f-f8af49f67ece\") " pod="openstack/ovn-controller-ovs-ckvwh" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.179240 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjd6t\" (UniqueName: \"kubernetes.io/projected/8984971d-92d8-4bf2-b07f-f8af49f67ece-kube-api-access-kjd6t\") pod \"ovn-controller-ovs-ckvwh\" (UID: \"8984971d-92d8-4bf2-b07f-f8af49f67ece\") " pod="openstack/ovn-controller-ovs-ckvwh" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.179265 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-log\" (UniqueName: \"kubernetes.io/host-path/8984971d-92d8-4bf2-b07f-f8af49f67ece-var-log\") pod \"ovn-controller-ovs-ckvwh\" (UID: \"8984971d-92d8-4bf2-b07f-f8af49f67ece\") " pod="openstack/ovn-controller-ovs-ckvwh" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.179268 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8984971d-92d8-4bf2-b07f-f8af49f67ece-etc-ovs\") pod \"ovn-controller-ovs-ckvwh\" (UID: \"8984971d-92d8-4bf2-b07f-f8af49f67ece\") " pod="openstack/ovn-controller-ovs-ckvwh" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.179576 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8984971d-92d8-4bf2-b07f-f8af49f67ece-var-lib\") pod \"ovn-controller-ovs-ckvwh\" (UID: \"8984971d-92d8-4bf2-b07f-f8af49f67ece\") " pod="openstack/ovn-controller-ovs-ckvwh" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.179709 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8984971d-92d8-4bf2-b07f-f8af49f67ece-var-run\") pod \"ovn-controller-ovs-ckvwh\" (UID: \"8984971d-92d8-4bf2-b07f-f8af49f67ece\") " pod="openstack/ovn-controller-ovs-ckvwh" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.179877 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8984971d-92d8-4bf2-b07f-f8af49f67ece-var-log\") pod \"ovn-controller-ovs-ckvwh\" (UID: \"8984971d-92d8-4bf2-b07f-f8af49f67ece\") " pod="openstack/ovn-controller-ovs-ckvwh" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.186013 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8984971d-92d8-4bf2-b07f-f8af49f67ece-scripts\") pod \"ovn-controller-ovs-ckvwh\" (UID: \"8984971d-92d8-4bf2-b07f-f8af49f67ece\") " pod="openstack/ovn-controller-ovs-ckvwh" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.206284 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjd6t\" (UniqueName: \"kubernetes.io/projected/8984971d-92d8-4bf2-b07f-f8af49f67ece-kube-api-access-kjd6t\") pod \"ovn-controller-ovs-ckvwh\" (UID: \"8984971d-92d8-4bf2-b07f-f8af49f67ece\") " pod="openstack/ovn-controller-ovs-ckvwh" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.285199 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rtv7m" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.454710 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-ckvwh" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.820401 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.822040 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.824970 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.825316 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.825985 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-kn7q7" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.826182 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.830876 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.833044 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.906946 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5d81b635-e8be-4199-827b-02ea68a3de3b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"5d81b635-e8be-4199-827b-02ea68a3de3b\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.907045 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d81b635-e8be-4199-827b-02ea68a3de3b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"5d81b635-e8be-4199-827b-02ea68a3de3b\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.907223 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"5d81b635-e8be-4199-827b-02ea68a3de3b\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.907315 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d81b635-e8be-4199-827b-02ea68a3de3b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5d81b635-e8be-4199-827b-02ea68a3de3b\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.907353 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d81b635-e8be-4199-827b-02ea68a3de3b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5d81b635-e8be-4199-827b-02ea68a3de3b\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.907450 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtmxl\" (UniqueName: \"kubernetes.io/projected/5d81b635-e8be-4199-827b-02ea68a3de3b-kube-api-access-mtmxl\") pod \"ovsdbserver-nb-0\" (UID: \"5d81b635-e8be-4199-827b-02ea68a3de3b\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.907483 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5d81b635-e8be-4199-827b-02ea68a3de3b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"5d81b635-e8be-4199-827b-02ea68a3de3b\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:54:13 crc kubenswrapper[4901]: I0202 10:54:13.907526 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d81b635-e8be-4199-827b-02ea68a3de3b-config\") pod \"ovsdbserver-nb-0\" (UID: \"5d81b635-e8be-4199-827b-02ea68a3de3b\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:54:14 crc kubenswrapper[4901]: I0202 10:54:14.010348 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"5d81b635-e8be-4199-827b-02ea68a3de3b\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:54:14 crc kubenswrapper[4901]: I0202 10:54:14.010712 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d81b635-e8be-4199-827b-02ea68a3de3b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5d81b635-e8be-4199-827b-02ea68a3de3b\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:54:14 crc kubenswrapper[4901]: I0202 10:54:14.010802 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d81b635-e8be-4199-827b-02ea68a3de3b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5d81b635-e8be-4199-827b-02ea68a3de3b\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:54:14 crc kubenswrapper[4901]: I0202 10:54:14.010939 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtmxl\" (UniqueName: \"kubernetes.io/projected/5d81b635-e8be-4199-827b-02ea68a3de3b-kube-api-access-mtmxl\") pod \"ovsdbserver-nb-0\" (UID: \"5d81b635-e8be-4199-827b-02ea68a3de3b\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:54:14 crc kubenswrapper[4901]: I0202 10:54:14.011021 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"5d81b635-e8be-4199-827b-02ea68a3de3b\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-nb-0" Feb 02 10:54:14 crc kubenswrapper[4901]: I0202 10:54:14.012165 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5d81b635-e8be-4199-827b-02ea68a3de3b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"5d81b635-e8be-4199-827b-02ea68a3de3b\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:54:14 crc kubenswrapper[4901]: I0202 10:54:14.011063 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5d81b635-e8be-4199-827b-02ea68a3de3b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"5d81b635-e8be-4199-827b-02ea68a3de3b\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:54:14 crc kubenswrapper[4901]: I0202 10:54:14.012392 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d81b635-e8be-4199-827b-02ea68a3de3b-config\") pod \"ovsdbserver-nb-0\" (UID: \"5d81b635-e8be-4199-827b-02ea68a3de3b\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:54:14 crc kubenswrapper[4901]: I0202 10:54:14.013586 4901 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5d81b635-e8be-4199-827b-02ea68a3de3b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"5d81b635-e8be-4199-827b-02ea68a3de3b\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:54:14 crc kubenswrapper[4901]: I0202 10:54:14.014932 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d81b635-e8be-4199-827b-02ea68a3de3b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"5d81b635-e8be-4199-827b-02ea68a3de3b\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:54:14 crc kubenswrapper[4901]: I0202 10:54:14.014002 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5d81b635-e8be-4199-827b-02ea68a3de3b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"5d81b635-e8be-4199-827b-02ea68a3de3b\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:54:14 crc kubenswrapper[4901]: I0202 10:54:14.013401 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d81b635-e8be-4199-827b-02ea68a3de3b-config\") pod \"ovsdbserver-nb-0\" (UID: \"5d81b635-e8be-4199-827b-02ea68a3de3b\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:54:14 crc kubenswrapper[4901]: I0202 10:54:14.022014 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d81b635-e8be-4199-827b-02ea68a3de3b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5d81b635-e8be-4199-827b-02ea68a3de3b\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:54:14 crc kubenswrapper[4901]: I0202 10:54:14.023217 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d81b635-e8be-4199-827b-02ea68a3de3b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"5d81b635-e8be-4199-827b-02ea68a3de3b\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:54:14 crc kubenswrapper[4901]: I0202 10:54:14.029224 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d81b635-e8be-4199-827b-02ea68a3de3b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5d81b635-e8be-4199-827b-02ea68a3de3b\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:54:14 crc kubenswrapper[4901]: I0202 10:54:14.029534 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtmxl\" (UniqueName: \"kubernetes.io/projected/5d81b635-e8be-4199-827b-02ea68a3de3b-kube-api-access-mtmxl\") pod \"ovsdbserver-nb-0\" (UID: \"5d81b635-e8be-4199-827b-02ea68a3de3b\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:54:14 crc kubenswrapper[4901]: I0202 10:54:14.040226 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"5d81b635-e8be-4199-827b-02ea68a3de3b\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:54:14 crc kubenswrapper[4901]: I0202 10:54:14.156204 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 02 10:54:17 crc kubenswrapper[4901]: I0202 10:54:17.062484 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 02 10:54:17 crc kubenswrapper[4901]: I0202 10:54:17.064414 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 02 10:54:17 crc kubenswrapper[4901]: I0202 10:54:17.067523 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 02 10:54:17 crc kubenswrapper[4901]: I0202 10:54:17.067817 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 02 10:54:17 crc kubenswrapper[4901]: I0202 10:54:17.067905 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 02 10:54:17 crc kubenswrapper[4901]: I0202 10:54:17.068366 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-5phpg" Feb 02 10:54:17 crc kubenswrapper[4901]: I0202 10:54:17.071202 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 02 10:54:17 crc kubenswrapper[4901]: I0202 10:54:17.267840 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kp2n\" (UniqueName: \"kubernetes.io/projected/867ad59f-d6b4-42da-90e8-9ac943b12aaf-kube-api-access-5kp2n\") pod \"ovsdbserver-sb-0\" (UID: \"867ad59f-d6b4-42da-90e8-9ac943b12aaf\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:54:17 crc kubenswrapper[4901]: I0202 10:54:17.268360 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/867ad59f-d6b4-42da-90e8-9ac943b12aaf-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"867ad59f-d6b4-42da-90e8-9ac943b12aaf\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:54:17 crc kubenswrapper[4901]: I0202 10:54:17.268400 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/867ad59f-d6b4-42da-90e8-9ac943b12aaf-config\") pod \"ovsdbserver-sb-0\" (UID: \"867ad59f-d6b4-42da-90e8-9ac943b12aaf\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:54:17 crc kubenswrapper[4901]: I0202 10:54:17.268434 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/867ad59f-d6b4-42da-90e8-9ac943b12aaf-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"867ad59f-d6b4-42da-90e8-9ac943b12aaf\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:54:17 crc kubenswrapper[4901]: I0202 10:54:17.268521 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"867ad59f-d6b4-42da-90e8-9ac943b12aaf\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:54:17 crc kubenswrapper[4901]: I0202 10:54:17.268782 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/867ad59f-d6b4-42da-90e8-9ac943b12aaf-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"867ad59f-d6b4-42da-90e8-9ac943b12aaf\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:54:17 crc 
kubenswrapper[4901]: I0202 10:54:17.269027 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/867ad59f-d6b4-42da-90e8-9ac943b12aaf-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"867ad59f-d6b4-42da-90e8-9ac943b12aaf\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:54:17 crc kubenswrapper[4901]: I0202 10:54:17.269144 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/867ad59f-d6b4-42da-90e8-9ac943b12aaf-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"867ad59f-d6b4-42da-90e8-9ac943b12aaf\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:54:17 crc kubenswrapper[4901]: I0202 10:54:17.371071 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/867ad59f-d6b4-42da-90e8-9ac943b12aaf-config\") pod \"ovsdbserver-sb-0\" (UID: \"867ad59f-d6b4-42da-90e8-9ac943b12aaf\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:54:17 crc kubenswrapper[4901]: I0202 10:54:17.371178 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/867ad59f-d6b4-42da-90e8-9ac943b12aaf-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"867ad59f-d6b4-42da-90e8-9ac943b12aaf\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:54:17 crc kubenswrapper[4901]: I0202 10:54:17.371226 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"867ad59f-d6b4-42da-90e8-9ac943b12aaf\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:54:17 crc kubenswrapper[4901]: I0202 10:54:17.371268 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/867ad59f-d6b4-42da-90e8-9ac943b12aaf-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"867ad59f-d6b4-42da-90e8-9ac943b12aaf\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:54:17 crc kubenswrapper[4901]: I0202 10:54:17.371322 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/867ad59f-d6b4-42da-90e8-9ac943b12aaf-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"867ad59f-d6b4-42da-90e8-9ac943b12aaf\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:54:17 crc kubenswrapper[4901]: I0202 10:54:17.371387 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/867ad59f-d6b4-42da-90e8-9ac943b12aaf-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"867ad59f-d6b4-42da-90e8-9ac943b12aaf\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:54:17 crc kubenswrapper[4901]: I0202 10:54:17.371422 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kp2n\" (UniqueName: \"kubernetes.io/projected/867ad59f-d6b4-42da-90e8-9ac943b12aaf-kube-api-access-5kp2n\") pod \"ovsdbserver-sb-0\" (UID: \"867ad59f-d6b4-42da-90e8-9ac943b12aaf\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:54:17 crc kubenswrapper[4901]: I0202 10:54:17.371476 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/867ad59f-d6b4-42da-90e8-9ac943b12aaf-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"867ad59f-d6b4-42da-90e8-9ac943b12aaf\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:54:17 crc kubenswrapper[4901]: I0202 10:54:17.371649 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"867ad59f-d6b4-42da-90e8-9ac943b12aaf\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-sb-0" Feb 02 10:54:17 crc kubenswrapper[4901]: I0202 10:54:17.372109 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/867ad59f-d6b4-42da-90e8-9ac943b12aaf-config\") pod \"ovsdbserver-sb-0\" (UID: \"867ad59f-d6b4-42da-90e8-9ac943b12aaf\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:54:17 crc kubenswrapper[4901]: I0202 10:54:17.372295 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/867ad59f-d6b4-42da-90e8-9ac943b12aaf-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"867ad59f-d6b4-42da-90e8-9ac943b12aaf\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:54:17 crc kubenswrapper[4901]: I0202 10:54:17.372928 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/867ad59f-d6b4-42da-90e8-9ac943b12aaf-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"867ad59f-d6b4-42da-90e8-9ac943b12aaf\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:54:17 crc kubenswrapper[4901]: I0202 10:54:17.381357 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/867ad59f-d6b4-42da-90e8-9ac943b12aaf-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"867ad59f-d6b4-42da-90e8-9ac943b12aaf\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:54:17 crc kubenswrapper[4901]: I0202 10:54:17.386764 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/867ad59f-d6b4-42da-90e8-9ac943b12aaf-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"867ad59f-d6b4-42da-90e8-9ac943b12aaf\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:54:17 crc kubenswrapper[4901]: I0202 10:54:17.410423 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/867ad59f-d6b4-42da-90e8-9ac943b12aaf-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"867ad59f-d6b4-42da-90e8-9ac943b12aaf\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:54:17 crc kubenswrapper[4901]: I0202 10:54:17.439329 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kp2n\" (UniqueName: \"kubernetes.io/projected/867ad59f-d6b4-42da-90e8-9ac943b12aaf-kube-api-access-5kp2n\") pod \"ovsdbserver-sb-0\" (UID: \"867ad59f-d6b4-42da-90e8-9ac943b12aaf\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:54:17 crc kubenswrapper[4901]: I0202 10:54:17.494927 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"867ad59f-d6b4-42da-90e8-9ac943b12aaf\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:54:17 crc kubenswrapper[4901]: I0202 10:54:17.691669 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 02 10:54:23 crc kubenswrapper[4901]: E0202 10:54:23.102558 4901 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Feb 02 10:54:23 crc kubenswrapper[4901]: E0202 10:54:23.103340 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mqnvw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(942c6932-383e-432a-b927-ff9ec4ac81cb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 10:54:23 crc kubenswrapper[4901]: E0202 10:54:23.104642 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/rabbitmq-cell1-server-0" podUID="942c6932-383e-432a-b927-ff9ec4ac81cb" Feb 02 10:54:23 crc kubenswrapper[4901]: E0202 10:54:23.482122 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="942c6932-383e-432a-b927-ff9ec4ac81cb" Feb 02 10:54:30 crc kubenswrapper[4901]: E0202 10:54:30.026367 4901 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Feb 02 10:54:30 crc kubenswrapper[4901]: E0202 10:54:30.027465 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-txz2f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(93ecc7a4-4c23-488f-8d75-8fee0246afe4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 10:54:30 crc kubenswrapper[4901]: E0202 10:54:30.028720 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="93ecc7a4-4c23-488f-8d75-8fee0246afe4" Feb 02 10:54:30 crc kubenswrapper[4901]: E0202 10:54:30.569820 4901 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="93ecc7a4-4c23-488f-8d75-8fee0246afe4" Feb 02 10:54:30 crc kubenswrapper[4901]: E0202 10:54:30.998928 4901 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 02 10:54:31 crc kubenswrapper[4901]: E0202 10:54:30.999806 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cvbkn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-c7dhq_openstack(26df7306-345d-425d-8f8f-bd026582c25f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 10:54:31 crc kubenswrapper[4901]: E0202 10:54:31.001117 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-c7dhq" podUID="26df7306-345d-425d-8f8f-bd026582c25f" Feb 02 10:54:31 crc kubenswrapper[4901]: E0202 10:54:31.028351 4901 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 02 10:54:31 crc kubenswrapper[4901]: E0202 10:54:31.028573 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7lhs6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-qw8r8_openstack(feade5e7-72de-4496-a324-967c6727f8d7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 10:54:31 crc kubenswrapper[4901]: E0202 10:54:31.030631 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-qw8r8" podUID="feade5e7-72de-4496-a324-967c6727f8d7" Feb 02 10:54:31 crc kubenswrapper[4901]: E0202 10:54:31.031150 4901 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 02 10:54:31 crc kubenswrapper[4901]: E0202 10:54:31.031357 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bd4qc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-f9hjt_openstack(984f7ae8-8c3c-41ec-85a4-dd0b31764800): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 10:54:31 crc kubenswrapper[4901]: E0202 10:54:31.033329 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-f9hjt" podUID="984f7ae8-8c3c-41ec-85a4-dd0b31764800" Feb 02 10:54:31 crc kubenswrapper[4901]: E0202 10:54:31.041925 4901 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 02 10:54:31 crc kubenswrapper[4901]: E0202 10:54:31.042120 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2wpkm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-6v2gd_openstack(9d5bfda5-826b-4d0d-81a6-87d5589f0c78): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 10:54:31 crc kubenswrapper[4901]: E0202 10:54:31.043784 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-6v2gd" podUID="9d5bfda5-826b-4d0d-81a6-87d5589f0c78" Feb 02 10:54:31 crc kubenswrapper[4901]: E0202 10:54:31.581957 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-6v2gd" podUID="9d5bfda5-826b-4d0d-81a6-87d5589f0c78" Feb 02 10:54:31 crc kubenswrapper[4901]: E0202 10:54:31.582919 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-f9hjt" podUID="984f7ae8-8c3c-41ec-85a4-dd0b31764800" Feb 02 10:54:31 crc kubenswrapper[4901]: I0202 10:54:31.940845 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-qw8r8" Feb 02 10:54:31 crc kubenswrapper[4901]: I0202 10:54:31.946873 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-c7dhq" Feb 02 10:54:32 crc kubenswrapper[4901]: I0202 10:54:32.078665 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 02 10:54:32 crc kubenswrapper[4901]: I0202 10:54:32.108374 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/feade5e7-72de-4496-a324-967c6727f8d7-config\") pod \"feade5e7-72de-4496-a324-967c6727f8d7\" (UID: \"feade5e7-72de-4496-a324-967c6727f8d7\") " Feb 02 10:54:32 crc kubenswrapper[4901]: I0202 10:54:32.108499 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lhs6\" (UniqueName: \"kubernetes.io/projected/feade5e7-72de-4496-a324-967c6727f8d7-kube-api-access-7lhs6\") pod \"feade5e7-72de-4496-a324-967c6727f8d7\" (UID: \"feade5e7-72de-4496-a324-967c6727f8d7\") " Feb 02 10:54:32 crc kubenswrapper[4901]: I0202 10:54:32.108625 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvbkn\" (UniqueName: \"kubernetes.io/projected/26df7306-345d-425d-8f8f-bd026582c25f-kube-api-access-cvbkn\") pod \"26df7306-345d-425d-8f8f-bd026582c25f\" (UID: \"26df7306-345d-425d-8f8f-bd026582c25f\") " Feb 02 10:54:32 crc kubenswrapper[4901]: I0202 10:54:32.108736 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26df7306-345d-425d-8f8f-bd026582c25f-config\") pod \"26df7306-345d-425d-8f8f-bd026582c25f\" (UID: \"26df7306-345d-425d-8f8f-bd026582c25f\") " Feb 02 10:54:32 crc kubenswrapper[4901]: I0202 10:54:32.108860 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26df7306-345d-425d-8f8f-bd026582c25f-dns-svc\") pod \"26df7306-345d-425d-8f8f-bd026582c25f\" (UID: \"26df7306-345d-425d-8f8f-bd026582c25f\") " Feb 02 10:54:32 crc kubenswrapper[4901]: I0202 10:54:32.109966 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26df7306-345d-425d-8f8f-bd026582c25f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "26df7306-345d-425d-8f8f-bd026582c25f" (UID: "26df7306-345d-425d-8f8f-bd026582c25f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:32 crc kubenswrapper[4901]: I0202 10:54:32.110597 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26df7306-345d-425d-8f8f-bd026582c25f-config" (OuterVolumeSpecName: "config") pod "26df7306-345d-425d-8f8f-bd026582c25f" (UID: "26df7306-345d-425d-8f8f-bd026582c25f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:32 crc kubenswrapper[4901]: I0202 10:54:32.112668 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/feade5e7-72de-4496-a324-967c6727f8d7-config" (OuterVolumeSpecName: "config") pod "feade5e7-72de-4496-a324-967c6727f8d7" (UID: "feade5e7-72de-4496-a324-967c6727f8d7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:32 crc kubenswrapper[4901]: I0202 10:54:32.119606 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/feade5e7-72de-4496-a324-967c6727f8d7-kube-api-access-7lhs6" (OuterVolumeSpecName: "kube-api-access-7lhs6") pod "feade5e7-72de-4496-a324-967c6727f8d7" (UID: "feade5e7-72de-4496-a324-967c6727f8d7"). InnerVolumeSpecName "kube-api-access-7lhs6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:32 crc kubenswrapper[4901]: I0202 10:54:32.124733 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26df7306-345d-425d-8f8f-bd026582c25f-kube-api-access-cvbkn" (OuterVolumeSpecName: "kube-api-access-cvbkn") pod "26df7306-345d-425d-8f8f-bd026582c25f" (UID: "26df7306-345d-425d-8f8f-bd026582c25f"). InnerVolumeSpecName "kube-api-access-cvbkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:32 crc kubenswrapper[4901]: I0202 10:54:32.129779 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bdrnz"] Feb 02 10:54:32 crc kubenswrapper[4901]: I0202 10:54:32.177354 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rtv7m"] Feb 02 10:54:32 crc kubenswrapper[4901]: I0202 10:54:32.185793 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 10:54:32 crc kubenswrapper[4901]: I0202 10:54:32.202914 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 02 10:54:32 crc kubenswrapper[4901]: I0202 10:54:32.210744 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/feade5e7-72de-4496-a324-967c6727f8d7-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:32 crc kubenswrapper[4901]: I0202 10:54:32.210787 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lhs6\" (UniqueName: \"kubernetes.io/projected/feade5e7-72de-4496-a324-967c6727f8d7-kube-api-access-7lhs6\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:32 crc kubenswrapper[4901]: I0202 10:54:32.210802 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvbkn\" (UniqueName: \"kubernetes.io/projected/26df7306-345d-425d-8f8f-bd026582c25f-kube-api-access-cvbkn\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:32 crc kubenswrapper[4901]: I0202 10:54:32.210813 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26df7306-345d-425d-8f8f-bd026582c25f-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:32 crc kubenswrapper[4901]: I0202 10:54:32.210822 4901 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26df7306-345d-425d-8f8f-bd026582c25f-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:32 crc kubenswrapper[4901]: I0202 10:54:32.220117 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5mfk4"] Feb 02 10:54:32 crc kubenswrapper[4901]: W0202 10:54:32.230910 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbdaf682c_0830_4cd9_ba3d_5eb615ac9cb7.slice/crio-a0d1cc851ac4fef8079cd6a4cbe2fa833ca9d356b7819480728763af3f223d22 WatchSource:0}: Error finding container a0d1cc851ac4fef8079cd6a4cbe2fa833ca9d356b7819480728763af3f223d22: Status 404 returned error can't 
find the container with id a0d1cc851ac4fef8079cd6a4cbe2fa833ca9d356b7819480728763af3f223d22 Feb 02 10:54:32 crc kubenswrapper[4901]: W0202 10:54:32.233407 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod715a873e_0260_4254_86a6_203a2e08e36d.slice/crio-5a2b7a98bc01d8c201dab8c8f8967d9ecee61c7760bc309116ce45482347cec7 WatchSource:0}: Error finding container 5a2b7a98bc01d8c201dab8c8f8967d9ecee61c7760bc309116ce45482347cec7: Status 404 returned error can't find the container with id 5a2b7a98bc01d8c201dab8c8f8967d9ecee61c7760bc309116ce45482347cec7 Feb 02 10:54:32 crc kubenswrapper[4901]: I0202 10:54:32.306890 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 02 10:54:32 crc kubenswrapper[4901]: I0202 10:54:32.589039 4901 generic.go:334] "Generic (PLEG): container finished" podID="715a873e-0260-4254-86a6-203a2e08e36d" containerID="68fd3de3c14e0fa8fb682b22d2d7a2d89c07ff9e40187faa2e1fe0f3525bee11" exitCode=0 Feb 02 10:54:32 crc kubenswrapper[4901]: I0202 10:54:32.589161 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5mfk4" event={"ID":"715a873e-0260-4254-86a6-203a2e08e36d","Type":"ContainerDied","Data":"68fd3de3c14e0fa8fb682b22d2d7a2d89c07ff9e40187faa2e1fe0f3525bee11"} Feb 02 10:54:32 crc kubenswrapper[4901]: I0202 10:54:32.589234 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5mfk4" event={"ID":"715a873e-0260-4254-86a6-203a2e08e36d","Type":"ContainerStarted","Data":"5a2b7a98bc01d8c201dab8c8f8967d9ecee61c7760bc309116ce45482347cec7"} Feb 02 10:54:32 crc kubenswrapper[4901]: I0202 10:54:32.598882 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c56659df-71f1-4dbb-819f-b71277070b0e","Type":"ContainerStarted","Data":"59774274818ad0fb673f622c180b0d0542aae50a446b2b43b00eb88f2ea664cc"} Feb 02 10:54:32 crc kubenswrapper[4901]: I0202 10:54:32.602708 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"e7551d06-91c3-4652-b042-cf8080c36ce2","Type":"ContainerStarted","Data":"9ac00e1ba0a8e015ed51cf00ffbaee47ab41f1a735b9e725a751bd2341f9a76c"} Feb 02 10:54:32 crc kubenswrapper[4901]: I0202 10:54:32.622911 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e251c5c1-5608-4bf0-8bfa-c056084f0bc0","Type":"ContainerStarted","Data":"9b85bfe6b1bb8263cd7c1ef1c158b9b5e0eaf1897bf376396954c00fb954d798"} Feb 02 10:54:32 crc kubenswrapper[4901]: I0202 10:54:32.626373 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rtv7m" event={"ID":"bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7","Type":"ContainerStarted","Data":"a0d1cc851ac4fef8079cd6a4cbe2fa833ca9d356b7819480728763af3f223d22"} Feb 02 10:54:32 crc kubenswrapper[4901]: I0202 10:54:32.637430 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-qw8r8" event={"ID":"feade5e7-72de-4496-a324-967c6727f8d7","Type":"ContainerDied","Data":"6003bd986245257103abe24be4f7293f43240cb3772fe66336bd336ef6c6d809"} Feb 02 10:54:32 crc kubenswrapper[4901]: I0202 10:54:32.637479 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-qw8r8" Feb 02 10:54:32 crc kubenswrapper[4901]: I0202 10:54:32.641903 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-c7dhq" event={"ID":"26df7306-345d-425d-8f8f-bd026582c25f","Type":"ContainerDied","Data":"a058889e48c466d3fb241efee3f839b018d2c76c5e603c52790e57deceb03def"} Feb 02 10:54:32 crc kubenswrapper[4901]: I0202 10:54:32.642086 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-c7dhq" Feb 02 10:54:32 crc kubenswrapper[4901]: I0202 10:54:32.652369 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"867ad59f-d6b4-42da-90e8-9ac943b12aaf","Type":"ContainerStarted","Data":"ebc257d9c63b5cdc92c2d0f49aaedc53d6f4d300a7aadb7e5ab5d7d63e8e1c91"} Feb 02 10:54:32 crc kubenswrapper[4901]: I0202 10:54:32.659487 4901 generic.go:334] "Generic (PLEG): container finished" podID="de467848-25a5-4ed7-b072-e11de1d42561" containerID="afa28b34028940dceb40e7ea298a8cfe9baf2744c49f7c430acfea296305723c" exitCode=0 Feb 02 10:54:32 crc kubenswrapper[4901]: I0202 10:54:32.659545 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bdrnz" event={"ID":"de467848-25a5-4ed7-b072-e11de1d42561","Type":"ContainerDied","Data":"afa28b34028940dceb40e7ea298a8cfe9baf2744c49f7c430acfea296305723c"} Feb 02 10:54:32 crc kubenswrapper[4901]: I0202 10:54:32.659599 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bdrnz" event={"ID":"de467848-25a5-4ed7-b072-e11de1d42561","Type":"ContainerStarted","Data":"6c32f46b7040f93c8c7867be3aec3f55f94dbdaaef75d5fbac1f0d3eab6aa011"} Feb 02 10:54:32 crc kubenswrapper[4901]: I0202 10:54:32.749943 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-qw8r8"] Feb 02 10:54:32 crc kubenswrapper[4901]: I0202 10:54:32.755922 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-qw8r8"] Feb 02 10:54:32 crc kubenswrapper[4901]: I0202 10:54:32.804304 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-c7dhq"] Feb 02 10:54:32 crc kubenswrapper[4901]: I0202 10:54:32.824786 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-c7dhq"] Feb 02 10:54:33 crc kubenswrapper[4901]: I0202 10:54:33.357574 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-ckvwh"] Feb 02 10:54:33 crc kubenswrapper[4901]: I0202 10:54:33.423629 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 02 10:54:33 crc kubenswrapper[4901]: W0202 10:54:33.529220 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d81b635_e8be_4199_827b_02ea68a3de3b.slice/crio-cb7e3bd2546767c28668d44d8a5506743c425893aadc7df03c9df47dbfe8635f WatchSource:0}: Error finding container cb7e3bd2546767c28668d44d8a5506743c425893aadc7df03c9df47dbfe8635f: Status 404 returned error can't find the container with id cb7e3bd2546767c28668d44d8a5506743c425893aadc7df03c9df47dbfe8635f Feb 02 10:54:33 crc kubenswrapper[4901]: I0202 10:54:33.671382 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-ckvwh" 
event={"ID":"8984971d-92d8-4bf2-b07f-f8af49f67ece","Type":"ContainerStarted","Data":"83d2bbd4b64f2dfea791578d29c9a9bfb9385e1b4f9f827068d8d6fa021f7594"} Feb 02 10:54:33 crc kubenswrapper[4901]: I0202 10:54:33.674940 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"5d81b635-e8be-4199-827b-02ea68a3de3b","Type":"ContainerStarted","Data":"cb7e3bd2546767c28668d44d8a5506743c425893aadc7df03c9df47dbfe8635f"} Feb 02 10:54:33 crc kubenswrapper[4901]: I0202 10:54:33.701841 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26df7306-345d-425d-8f8f-bd026582c25f" path="/var/lib/kubelet/pods/26df7306-345d-425d-8f8f-bd026582c25f/volumes" Feb 02 10:54:33 crc kubenswrapper[4901]: I0202 10:54:33.702387 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="feade5e7-72de-4496-a324-967c6727f8d7" path="/var/lib/kubelet/pods/feade5e7-72de-4496-a324-967c6727f8d7/volumes" Feb 02 10:54:33 crc kubenswrapper[4901]: I0202 10:54:33.702978 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c56659df-71f1-4dbb-819f-b71277070b0e","Type":"ContainerStarted","Data":"e0c468f897e7500e36cdb9e24e055127baf2122f5be9edb772ea540d55904ad6"} Feb 02 10:54:33 crc kubenswrapper[4901]: I0202 10:54:33.703034 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba","Type":"ContainerStarted","Data":"a9e601667ecf3eb63fc82c38dd3226c81472105dfa7af588e3ddfba7431490ab"} Feb 02 10:54:36 crc kubenswrapper[4901]: I0202 10:54:36.714268 4901 generic.go:334] "Generic (PLEG): container finished" podID="c56659df-71f1-4dbb-819f-b71277070b0e" containerID="e0c468f897e7500e36cdb9e24e055127baf2122f5be9edb772ea540d55904ad6" exitCode=0 Feb 02 10:54:36 crc kubenswrapper[4901]: I0202 10:54:36.714328 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c56659df-71f1-4dbb-819f-b71277070b0e","Type":"ContainerDied","Data":"e0c468f897e7500e36cdb9e24e055127baf2122f5be9edb772ea540d55904ad6"} Feb 02 10:54:37 crc kubenswrapper[4901]: I0202 10:54:37.723817 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5mfk4" event={"ID":"715a873e-0260-4254-86a6-203a2e08e36d","Type":"ContainerStarted","Data":"44319e661ca4e5ae8790f27189efdb3e72aaf71087648ce53b30d9e8aabd905b"} Feb 02 10:54:37 crc kubenswrapper[4901]: I0202 10:54:37.727469 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"867ad59f-d6b4-42da-90e8-9ac943b12aaf","Type":"ContainerStarted","Data":"3c4430a6f51d008d77a302a462478bcdb46d6aca37c6f5f538ae288f9da9d318"} Feb 02 10:54:37 crc kubenswrapper[4901]: I0202 10:54:37.732984 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e251c5c1-5608-4bf0-8bfa-c056084f0bc0","Type":"ContainerStarted","Data":"31f7c790d300e6854aab92a4f38bb256be6a94388f51d165d89e6e9de91654d9"} Feb 02 10:54:37 crc kubenswrapper[4901]: I0202 10:54:37.733083 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 02 10:54:37 crc kubenswrapper[4901]: I0202 10:54:37.739025 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c56659df-71f1-4dbb-819f-b71277070b0e","Type":"ContainerStarted","Data":"0bd4cf4464cdbacd78c4c4a53507959913301ac68dd4e19ebcd163dd5b1ca9d6"} Feb 02 
10:54:37 crc kubenswrapper[4901]: I0202 10:54:37.741259 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"e7551d06-91c3-4652-b042-cf8080c36ce2","Type":"ContainerStarted","Data":"bf689261a37b7400392cdb8b4a65ef8722235ee399aed5a57924fd2bc764206f"} Feb 02 10:54:37 crc kubenswrapper[4901]: I0202 10:54:37.741522 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 02 10:54:37 crc kubenswrapper[4901]: I0202 10:54:37.749900 4901 generic.go:334] "Generic (PLEG): container finished" podID="de467848-25a5-4ed7-b072-e11de1d42561" containerID="6af36ab9f589e4d8b6f00a9eac87f4e5c30cd53cc3d9a7eeeaf7f8ff91359711" exitCode=0 Feb 02 10:54:37 crc kubenswrapper[4901]: I0202 10:54:37.750238 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bdrnz" event={"ID":"de467848-25a5-4ed7-b072-e11de1d42561","Type":"ContainerDied","Data":"6af36ab9f589e4d8b6f00a9eac87f4e5c30cd53cc3d9a7eeeaf7f8ff91359711"} Feb 02 10:54:37 crc kubenswrapper[4901]: I0202 10:54:37.754635 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"5d81b635-e8be-4199-827b-02ea68a3de3b","Type":"ContainerStarted","Data":"88d696b09170f26ed03568cf071142670b0fe649617bd7eac3d83f0375d109db"} Feb 02 10:54:37 crc kubenswrapper[4901]: I0202 10:54:37.775079 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=23.649212568 podStartE2EDuration="28.7750528s" podCreationTimestamp="2026-02-02 10:54:09 +0000 UTC" firstStartedPulling="2026-02-02 10:54:32.217297704 +0000 UTC m=+959.235637800" lastFinishedPulling="2026-02-02 10:54:37.343137916 +0000 UTC m=+964.361478032" observedRunningTime="2026-02-02 10:54:37.773224445 +0000 UTC m=+964.791564541" watchObservedRunningTime="2026-02-02 10:54:37.7750528 +0000 UTC m=+964.793392896" Feb 02 10:54:37 crc kubenswrapper[4901]: I0202 10:54:37.828886 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=26.536447331 podStartE2EDuration="30.828842385s" podCreationTimestamp="2026-02-02 10:54:07 +0000 UTC" firstStartedPulling="2026-02-02 10:54:32.217211522 +0000 UTC m=+959.235551618" lastFinishedPulling="2026-02-02 10:54:36.509606576 +0000 UTC m=+963.527946672" observedRunningTime="2026-02-02 10:54:37.788331673 +0000 UTC m=+964.806671779" watchObservedRunningTime="2026-02-02 10:54:37.828842385 +0000 UTC m=+964.847182471" Feb 02 10:54:37 crc kubenswrapper[4901]: I0202 10:54:37.835110 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=31.426524281 podStartE2EDuration="31.835094801s" podCreationTimestamp="2026-02-02 10:54:06 +0000 UTC" firstStartedPulling="2026-02-02 10:54:32.089507331 +0000 UTC m=+959.107847437" lastFinishedPulling="2026-02-02 10:54:32.498077861 +0000 UTC m=+959.516417957" observedRunningTime="2026-02-02 10:54:37.808119007 +0000 UTC m=+964.826459103" watchObservedRunningTime="2026-02-02 10:54:37.835094801 +0000 UTC m=+964.853434887" Feb 02 10:54:38 crc kubenswrapper[4901]: I0202 10:54:38.768438 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bdrnz" event={"ID":"de467848-25a5-4ed7-b072-e11de1d42561","Type":"ContainerStarted","Data":"d98e983887b15fdd1bcaa596956cbc381eb56e53d9dbbbaeffb5fc578ed2f2e8"} Feb 02 10:54:38 crc kubenswrapper[4901]: I0202 10:54:38.771336 
4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"942c6932-383e-432a-b927-ff9ec4ac81cb","Type":"ContainerStarted","Data":"db41cd112888fcb37424d3a85cf77cf5ae370c5f0dce98ed34bcb98fcdcecad2"} Feb 02 10:54:38 crc kubenswrapper[4901]: I0202 10:54:38.774733 4901 generic.go:334] "Generic (PLEG): container finished" podID="8984971d-92d8-4bf2-b07f-f8af49f67ece" containerID="1f7ce951d12f2954179ab828d13dd6528489f90d6eab4a507910343b728f43dd" exitCode=0 Feb 02 10:54:38 crc kubenswrapper[4901]: I0202 10:54:38.774810 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-ckvwh" event={"ID":"8984971d-92d8-4bf2-b07f-f8af49f67ece","Type":"ContainerDied","Data":"1f7ce951d12f2954179ab828d13dd6528489f90d6eab4a507910343b728f43dd"} Feb 02 10:54:38 crc kubenswrapper[4901]: I0202 10:54:38.777374 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rtv7m" event={"ID":"bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7","Type":"ContainerStarted","Data":"b5b268704b13f5119786f05cff5d50cddd238b9f6fd1a5aa4c4e93c935ea4077"} Feb 02 10:54:38 crc kubenswrapper[4901]: I0202 10:54:38.778464 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-rtv7m" Feb 02 10:54:38 crc kubenswrapper[4901]: I0202 10:54:38.780493 4901 generic.go:334] "Generic (PLEG): container finished" podID="715a873e-0260-4254-86a6-203a2e08e36d" containerID="44319e661ca4e5ae8790f27189efdb3e72aaf71087648ce53b30d9e8aabd905b" exitCode=0 Feb 02 10:54:38 crc kubenswrapper[4901]: I0202 10:54:38.781694 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5mfk4" event={"ID":"715a873e-0260-4254-86a6-203a2e08e36d","Type":"ContainerDied","Data":"44319e661ca4e5ae8790f27189efdb3e72aaf71087648ce53b30d9e8aabd905b"} Feb 02 10:54:38 crc kubenswrapper[4901]: I0202 10:54:38.788947 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bdrnz" podStartSLOduration=24.187768727 podStartE2EDuration="29.788928498s" podCreationTimestamp="2026-02-02 10:54:09 +0000 UTC" firstStartedPulling="2026-02-02 10:54:32.663504666 +0000 UTC m=+959.681844752" lastFinishedPulling="2026-02-02 10:54:38.264664427 +0000 UTC m=+965.283004523" observedRunningTime="2026-02-02 10:54:38.786452597 +0000 UTC m=+965.804792693" watchObservedRunningTime="2026-02-02 10:54:38.788928498 +0000 UTC m=+965.807268594" Feb 02 10:54:38 crc kubenswrapper[4901]: I0202 10:54:38.807986 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-rtv7m" podStartSLOduration=21.804203923 podStartE2EDuration="26.807961704s" podCreationTimestamp="2026-02-02 10:54:12 +0000 UTC" firstStartedPulling="2026-02-02 10:54:32.238061473 +0000 UTC m=+959.256401579" lastFinishedPulling="2026-02-02 10:54:37.241819264 +0000 UTC m=+964.260159360" observedRunningTime="2026-02-02 10:54:38.804087317 +0000 UTC m=+965.822427423" watchObservedRunningTime="2026-02-02 10:54:38.807961704 +0000 UTC m=+965.826301820" Feb 02 10:54:39 crc kubenswrapper[4901]: I0202 10:54:39.571139 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bdrnz" Feb 02 10:54:39 crc kubenswrapper[4901]: I0202 10:54:39.571548 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bdrnz" Feb 02 10:54:39 crc kubenswrapper[4901]: I0202 10:54:39.790915 4901 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-ckvwh" event={"ID":"8984971d-92d8-4bf2-b07f-f8af49f67ece","Type":"ContainerStarted","Data":"2d1883850cb75ae8d5d9d9c69012e9a93f5eb46a232290734044eb40f1726f8a"} Feb 02 10:54:39 crc kubenswrapper[4901]: I0202 10:54:39.791415 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-ckvwh" Feb 02 10:54:39 crc kubenswrapper[4901]: I0202 10:54:39.791438 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-ckvwh" Feb 02 10:54:39 crc kubenswrapper[4901]: I0202 10:54:39.791450 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-ckvwh" event={"ID":"8984971d-92d8-4bf2-b07f-f8af49f67ece","Type":"ContainerStarted","Data":"3a0eab0cc4cc87871c4f18abf0b51ced9427c82e397d16b55333f3f2e1e2fb9f"} Feb 02 10:54:39 crc kubenswrapper[4901]: I0202 10:54:39.792514 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"5d81b635-e8be-4199-827b-02ea68a3de3b","Type":"ContainerStarted","Data":"b09fd1a880b07edd4a1b0e51b12ebc64d454db5780fc00855abe37831bd77c6e"} Feb 02 10:54:39 crc kubenswrapper[4901]: I0202 10:54:39.795193 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5mfk4" event={"ID":"715a873e-0260-4254-86a6-203a2e08e36d","Type":"ContainerStarted","Data":"42b825a2fdb927d280c3ca2c5589711db380c64abea2cb7b58c3cb2a1dfbbe35"} Feb 02 10:54:39 crc kubenswrapper[4901]: I0202 10:54:39.797452 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"867ad59f-d6b4-42da-90e8-9ac943b12aaf","Type":"ContainerStarted","Data":"256dc0424bad2760235141d4e3072c4e0e507fce0bc2f53db47dac74944b8893"} Feb 02 10:54:39 crc kubenswrapper[4901]: I0202 10:54:39.821168 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-ckvwh" podStartSLOduration=23.55102082 podStartE2EDuration="26.821143966s" podCreationTimestamp="2026-02-02 10:54:13 +0000 UTC" firstStartedPulling="2026-02-02 10:54:33.528486082 +0000 UTC m=+960.546826178" lastFinishedPulling="2026-02-02 10:54:36.798609228 +0000 UTC m=+963.816949324" observedRunningTime="2026-02-02 10:54:39.815236957 +0000 UTC m=+966.833577093" watchObservedRunningTime="2026-02-02 10:54:39.821143966 +0000 UTC m=+966.839484072" Feb 02 10:54:39 crc kubenswrapper[4901]: I0202 10:54:39.846321 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=22.16818952 podStartE2EDuration="27.846300504s" podCreationTimestamp="2026-02-02 10:54:12 +0000 UTC" firstStartedPulling="2026-02-02 10:54:33.540015391 +0000 UTC m=+960.558355487" lastFinishedPulling="2026-02-02 10:54:39.218126375 +0000 UTC m=+966.236466471" observedRunningTime="2026-02-02 10:54:39.840755456 +0000 UTC m=+966.859095552" watchObservedRunningTime="2026-02-02 10:54:39.846300504 +0000 UTC m=+966.864640610" Feb 02 10:54:39 crc kubenswrapper[4901]: I0202 10:54:39.869257 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=16.977996615 podStartE2EDuration="23.869236227s" podCreationTimestamp="2026-02-02 10:54:16 +0000 UTC" firstStartedPulling="2026-02-02 10:54:32.319356745 +0000 UTC m=+959.337696841" lastFinishedPulling="2026-02-02 10:54:39.210596337 +0000 UTC m=+966.228936453" observedRunningTime="2026-02-02 
10:54:39.863033853 +0000 UTC m=+966.881373959" watchObservedRunningTime="2026-02-02 10:54:39.869236227 +0000 UTC m=+966.887576323" Feb 02 10:54:39 crc kubenswrapper[4901]: I0202 10:54:39.885402 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5mfk4" podStartSLOduration=20.889110813 podStartE2EDuration="27.885384021s" podCreationTimestamp="2026-02-02 10:54:12 +0000 UTC" firstStartedPulling="2026-02-02 10:54:32.591771432 +0000 UTC m=+959.610111558" lastFinishedPulling="2026-02-02 10:54:39.58804466 +0000 UTC m=+966.606384766" observedRunningTime="2026-02-02 10:54:39.881243098 +0000 UTC m=+966.899583194" watchObservedRunningTime="2026-02-02 10:54:39.885384021 +0000 UTC m=+966.903724117" Feb 02 10:54:40 crc kubenswrapper[4901]: I0202 10:54:40.622164 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-bdrnz" podUID="de467848-25a5-4ed7-b072-e11de1d42561" containerName="registry-server" probeResult="failure" output=< Feb 02 10:54:40 crc kubenswrapper[4901]: timeout: failed to connect service ":50051" within 1s Feb 02 10:54:40 crc kubenswrapper[4901]: > Feb 02 10:54:41 crc kubenswrapper[4901]: I0202 10:54:41.157377 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 02 10:54:41 crc kubenswrapper[4901]: I0202 10:54:41.202262 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 02 10:54:41 crc kubenswrapper[4901]: I0202 10:54:41.692921 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 02 10:54:41 crc kubenswrapper[4901]: I0202 10:54:41.733364 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 02 10:54:41 crc kubenswrapper[4901]: I0202 10:54:41.829222 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 02 10:54:41 crc kubenswrapper[4901]: I0202 10:54:41.829614 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 02 10:54:42 crc kubenswrapper[4901]: I0202 10:54:42.748078 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 02 10:54:42 crc kubenswrapper[4901]: I0202 10:54:42.756857 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5mfk4" Feb 02 10:54:42 crc kubenswrapper[4901]: I0202 10:54:42.756985 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5mfk4" Feb 02 10:54:42 crc kubenswrapper[4901]: I0202 10:54:42.896045 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 02 10:54:42 crc kubenswrapper[4901]: I0202 10:54:42.901098 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.113103 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-f9hjt"] Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.157471 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-zchj8"] Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.164308 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-zchj8" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.173804 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.174543 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-zchj8"] Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.250634 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3-config\") pod \"dnsmasq-dns-6bc7876d45-zchj8\" (UID: \"00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3\") " pod="openstack/dnsmasq-dns-6bc7876d45-zchj8" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.250740 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-zchj8\" (UID: \"00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3\") " pod="openstack/dnsmasq-dns-6bc7876d45-zchj8" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.250782 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4762\" (UniqueName: \"kubernetes.io/projected/00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3-kube-api-access-l4762\") pod \"dnsmasq-dns-6bc7876d45-zchj8\" (UID: \"00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3\") " pod="openstack/dnsmasq-dns-6bc7876d45-zchj8" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.250836 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-zchj8\" (UID: \"00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3\") " pod="openstack/dnsmasq-dns-6bc7876d45-zchj8" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.265453 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-nt5zf"] Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.266392 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-nt5zf" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.272991 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.295205 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-nt5zf"] Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.352154 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/0c692cfe-56e1-4bba-b3e7-bf1d713d343d-ovs-rundir\") pod \"ovn-controller-metrics-nt5zf\" (UID: \"0c692cfe-56e1-4bba-b3e7-bf1d713d343d\") " pod="openstack/ovn-controller-metrics-nt5zf" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.352222 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3-config\") pod \"dnsmasq-dns-6bc7876d45-zchj8\" (UID: \"00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3\") " pod="openstack/dnsmasq-dns-6bc7876d45-zchj8" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.352273 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c692cfe-56e1-4bba-b3e7-bf1d713d343d-config\") pod \"ovn-controller-metrics-nt5zf\" (UID: \"0c692cfe-56e1-4bba-b3e7-bf1d713d343d\") " pod="openstack/ovn-controller-metrics-nt5zf" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.352299 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgmn9\" (UniqueName: \"kubernetes.io/projected/0c692cfe-56e1-4bba-b3e7-bf1d713d343d-kube-api-access-lgmn9\") pod \"ovn-controller-metrics-nt5zf\" (UID: \"0c692cfe-56e1-4bba-b3e7-bf1d713d343d\") " pod="openstack/ovn-controller-metrics-nt5zf" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.352332 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0c692cfe-56e1-4bba-b3e7-bf1d713d343d-ovn-rundir\") pod \"ovn-controller-metrics-nt5zf\" (UID: \"0c692cfe-56e1-4bba-b3e7-bf1d713d343d\") " pod="openstack/ovn-controller-metrics-nt5zf" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.352354 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-zchj8\" (UID: \"00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3\") " pod="openstack/dnsmasq-dns-6bc7876d45-zchj8" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.352384 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4762\" (UniqueName: \"kubernetes.io/projected/00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3-kube-api-access-l4762\") pod \"dnsmasq-dns-6bc7876d45-zchj8\" (UID: \"00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3\") " pod="openstack/dnsmasq-dns-6bc7876d45-zchj8" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.352416 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c692cfe-56e1-4bba-b3e7-bf1d713d343d-combined-ca-bundle\") pod \"ovn-controller-metrics-nt5zf\" (UID: \"0c692cfe-56e1-4bba-b3e7-bf1d713d343d\") 
" pod="openstack/ovn-controller-metrics-nt5zf" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.352440 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-zchj8\" (UID: \"00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3\") " pod="openstack/dnsmasq-dns-6bc7876d45-zchj8" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.352469 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c692cfe-56e1-4bba-b3e7-bf1d713d343d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nt5zf\" (UID: \"0c692cfe-56e1-4bba-b3e7-bf1d713d343d\") " pod="openstack/ovn-controller-metrics-nt5zf" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.353370 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3-config\") pod \"dnsmasq-dns-6bc7876d45-zchj8\" (UID: \"00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3\") " pod="openstack/dnsmasq-dns-6bc7876d45-zchj8" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.354107 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-zchj8\" (UID: \"00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3\") " pod="openstack/dnsmasq-dns-6bc7876d45-zchj8" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.355188 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-zchj8\" (UID: \"00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3\") " pod="openstack/dnsmasq-dns-6bc7876d45-zchj8" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.416917 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4762\" (UniqueName: \"kubernetes.io/projected/00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3-kube-api-access-l4762\") pod \"dnsmasq-dns-6bc7876d45-zchj8\" (UID: \"00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3\") " pod="openstack/dnsmasq-dns-6bc7876d45-zchj8" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.422645 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-6v2gd"] Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.458230 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c692cfe-56e1-4bba-b3e7-bf1d713d343d-config\") pod \"ovn-controller-metrics-nt5zf\" (UID: \"0c692cfe-56e1-4bba-b3e7-bf1d713d343d\") " pod="openstack/ovn-controller-metrics-nt5zf" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.458272 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgmn9\" (UniqueName: \"kubernetes.io/projected/0c692cfe-56e1-4bba-b3e7-bf1d713d343d-kube-api-access-lgmn9\") pod \"ovn-controller-metrics-nt5zf\" (UID: \"0c692cfe-56e1-4bba-b3e7-bf1d713d343d\") " pod="openstack/ovn-controller-metrics-nt5zf" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.458297 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0c692cfe-56e1-4bba-b3e7-bf1d713d343d-ovn-rundir\") pod 
\"ovn-controller-metrics-nt5zf\" (UID: \"0c692cfe-56e1-4bba-b3e7-bf1d713d343d\") " pod="openstack/ovn-controller-metrics-nt5zf" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.458355 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c692cfe-56e1-4bba-b3e7-bf1d713d343d-combined-ca-bundle\") pod \"ovn-controller-metrics-nt5zf\" (UID: \"0c692cfe-56e1-4bba-b3e7-bf1d713d343d\") " pod="openstack/ovn-controller-metrics-nt5zf" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.458390 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c692cfe-56e1-4bba-b3e7-bf1d713d343d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nt5zf\" (UID: \"0c692cfe-56e1-4bba-b3e7-bf1d713d343d\") " pod="openstack/ovn-controller-metrics-nt5zf" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.458421 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/0c692cfe-56e1-4bba-b3e7-bf1d713d343d-ovs-rundir\") pod \"ovn-controller-metrics-nt5zf\" (UID: \"0c692cfe-56e1-4bba-b3e7-bf1d713d343d\") " pod="openstack/ovn-controller-metrics-nt5zf" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.458723 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/0c692cfe-56e1-4bba-b3e7-bf1d713d343d-ovs-rundir\") pod \"ovn-controller-metrics-nt5zf\" (UID: \"0c692cfe-56e1-4bba-b3e7-bf1d713d343d\") " pod="openstack/ovn-controller-metrics-nt5zf" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.459365 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c692cfe-56e1-4bba-b3e7-bf1d713d343d-config\") pod \"ovn-controller-metrics-nt5zf\" (UID: \"0c692cfe-56e1-4bba-b3e7-bf1d713d343d\") " pod="openstack/ovn-controller-metrics-nt5zf" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.462622 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0c692cfe-56e1-4bba-b3e7-bf1d713d343d-ovn-rundir\") pod \"ovn-controller-metrics-nt5zf\" (UID: \"0c692cfe-56e1-4bba-b3e7-bf1d713d343d\") " pod="openstack/ovn-controller-metrics-nt5zf" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.467160 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c692cfe-56e1-4bba-b3e7-bf1d713d343d-combined-ca-bundle\") pod \"ovn-controller-metrics-nt5zf\" (UID: \"0c692cfe-56e1-4bba-b3e7-bf1d713d343d\") " pod="openstack/ovn-controller-metrics-nt5zf" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.476816 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-twwlw"] Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.478486 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-twwlw" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.486107 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.490684 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-zchj8" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.498049 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c692cfe-56e1-4bba-b3e7-bf1d713d343d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nt5zf\" (UID: \"0c692cfe-56e1-4bba-b3e7-bf1d713d343d\") " pod="openstack/ovn-controller-metrics-nt5zf" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.507187 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.517154 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.523518 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.523888 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.524054 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-r48qg" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.524232 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.524781 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgmn9\" (UniqueName: \"kubernetes.io/projected/0c692cfe-56e1-4bba-b3e7-bf1d713d343d-kube-api-access-lgmn9\") pod \"ovn-controller-metrics-nt5zf\" (UID: \"0c692cfe-56e1-4bba-b3e7-bf1d713d343d\") " pod="openstack/ovn-controller-metrics-nt5zf" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.536354 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.545416 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-twwlw"] Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.559813 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/52d82257-56bf-48c9-b8e1-8cdab6278855-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"52d82257-56bf-48c9-b8e1-8cdab6278855\") " pod="openstack/ovn-northd-0" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.559845 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5d5168b9-9cea-4a4c-9a44-7fd4779bf302-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-twwlw\" (UID: \"5d5168b9-9cea-4a4c-9a44-7fd4779bf302\") " pod="openstack/dnsmasq-dns-8554648995-twwlw" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.559873 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52d82257-56bf-48c9-b8e1-8cdab6278855-config\") pod \"ovn-northd-0\" (UID: \"52d82257-56bf-48c9-b8e1-8cdab6278855\") " pod="openstack/ovn-northd-0" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.559975 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/52d82257-56bf-48c9-b8e1-8cdab6278855-scripts\") pod \"ovn-northd-0\" (UID: \"52d82257-56bf-48c9-b8e1-8cdab6278855\") " pod="openstack/ovn-northd-0" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.560151 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/52d82257-56bf-48c9-b8e1-8cdab6278855-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"52d82257-56bf-48c9-b8e1-8cdab6278855\") " pod="openstack/ovn-northd-0" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.560226 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/52d82257-56bf-48c9-b8e1-8cdab6278855-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"52d82257-56bf-48c9-b8e1-8cdab6278855\") " pod="openstack/ovn-northd-0" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.560456 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ndt8\" (UniqueName: \"kubernetes.io/projected/52d82257-56bf-48c9-b8e1-8cdab6278855-kube-api-access-6ndt8\") pod \"ovn-northd-0\" (UID: \"52d82257-56bf-48c9-b8e1-8cdab6278855\") " pod="openstack/ovn-northd-0" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.560637 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5d5168b9-9cea-4a4c-9a44-7fd4779bf302-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-twwlw\" (UID: \"5d5168b9-9cea-4a4c-9a44-7fd4779bf302\") " pod="openstack/dnsmasq-dns-8554648995-twwlw" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.560710 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d5168b9-9cea-4a4c-9a44-7fd4779bf302-dns-svc\") pod \"dnsmasq-dns-8554648995-twwlw\" (UID: \"5d5168b9-9cea-4a4c-9a44-7fd4779bf302\") " pod="openstack/dnsmasq-dns-8554648995-twwlw" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.560795 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52d82257-56bf-48c9-b8e1-8cdab6278855-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"52d82257-56bf-48c9-b8e1-8cdab6278855\") " pod="openstack/ovn-northd-0" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.560913 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7xrz\" (UniqueName: \"kubernetes.io/projected/5d5168b9-9cea-4a4c-9a44-7fd4779bf302-kube-api-access-k7xrz\") pod \"dnsmasq-dns-8554648995-twwlw\" (UID: \"5d5168b9-9cea-4a4c-9a44-7fd4779bf302\") " pod="openstack/dnsmasq-dns-8554648995-twwlw" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.560947 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d5168b9-9cea-4a4c-9a44-7fd4779bf302-config\") pod \"dnsmasq-dns-8554648995-twwlw\" (UID: \"5d5168b9-9cea-4a4c-9a44-7fd4779bf302\") " pod="openstack/dnsmasq-dns-8554648995-twwlw" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.613729 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-nt5zf" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.666496 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52d82257-56bf-48c9-b8e1-8cdab6278855-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"52d82257-56bf-48c9-b8e1-8cdab6278855\") " pod="openstack/ovn-northd-0" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.668371 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7xrz\" (UniqueName: \"kubernetes.io/projected/5d5168b9-9cea-4a4c-9a44-7fd4779bf302-kube-api-access-k7xrz\") pod \"dnsmasq-dns-8554648995-twwlw\" (UID: \"5d5168b9-9cea-4a4c-9a44-7fd4779bf302\") " pod="openstack/dnsmasq-dns-8554648995-twwlw" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.668453 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d5168b9-9cea-4a4c-9a44-7fd4779bf302-config\") pod \"dnsmasq-dns-8554648995-twwlw\" (UID: \"5d5168b9-9cea-4a4c-9a44-7fd4779bf302\") " pod="openstack/dnsmasq-dns-8554648995-twwlw" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.668590 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/52d82257-56bf-48c9-b8e1-8cdab6278855-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"52d82257-56bf-48c9-b8e1-8cdab6278855\") " pod="openstack/ovn-northd-0" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.668621 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5d5168b9-9cea-4a4c-9a44-7fd4779bf302-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-twwlw\" (UID: \"5d5168b9-9cea-4a4c-9a44-7fd4779bf302\") " pod="openstack/dnsmasq-dns-8554648995-twwlw" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.668683 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52d82257-56bf-48c9-b8e1-8cdab6278855-config\") pod \"ovn-northd-0\" (UID: \"52d82257-56bf-48c9-b8e1-8cdab6278855\") " pod="openstack/ovn-northd-0" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.668708 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/52d82257-56bf-48c9-b8e1-8cdab6278855-scripts\") pod \"ovn-northd-0\" (UID: \"52d82257-56bf-48c9-b8e1-8cdab6278855\") " pod="openstack/ovn-northd-0" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.668758 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/52d82257-56bf-48c9-b8e1-8cdab6278855-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"52d82257-56bf-48c9-b8e1-8cdab6278855\") " pod="openstack/ovn-northd-0" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.668776 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/52d82257-56bf-48c9-b8e1-8cdab6278855-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"52d82257-56bf-48c9-b8e1-8cdab6278855\") " pod="openstack/ovn-northd-0" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.668934 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ndt8\" (UniqueName: 
\"kubernetes.io/projected/52d82257-56bf-48c9-b8e1-8cdab6278855-kube-api-access-6ndt8\") pod \"ovn-northd-0\" (UID: \"52d82257-56bf-48c9-b8e1-8cdab6278855\") " pod="openstack/ovn-northd-0" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.669026 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5d5168b9-9cea-4a4c-9a44-7fd4779bf302-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-twwlw\" (UID: \"5d5168b9-9cea-4a4c-9a44-7fd4779bf302\") " pod="openstack/dnsmasq-dns-8554648995-twwlw" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.669092 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d5168b9-9cea-4a4c-9a44-7fd4779bf302-dns-svc\") pod \"dnsmasq-dns-8554648995-twwlw\" (UID: \"5d5168b9-9cea-4a4c-9a44-7fd4779bf302\") " pod="openstack/dnsmasq-dns-8554648995-twwlw" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.670672 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d5168b9-9cea-4a4c-9a44-7fd4779bf302-dns-svc\") pod \"dnsmasq-dns-8554648995-twwlw\" (UID: \"5d5168b9-9cea-4a4c-9a44-7fd4779bf302\") " pod="openstack/dnsmasq-dns-8554648995-twwlw" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.671606 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/52d82257-56bf-48c9-b8e1-8cdab6278855-scripts\") pod \"ovn-northd-0\" (UID: \"52d82257-56bf-48c9-b8e1-8cdab6278855\") " pod="openstack/ovn-northd-0" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.671645 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d5168b9-9cea-4a4c-9a44-7fd4779bf302-config\") pod \"dnsmasq-dns-8554648995-twwlw\" (UID: \"5d5168b9-9cea-4a4c-9a44-7fd4779bf302\") " pod="openstack/dnsmasq-dns-8554648995-twwlw" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.672323 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/52d82257-56bf-48c9-b8e1-8cdab6278855-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"52d82257-56bf-48c9-b8e1-8cdab6278855\") " pod="openstack/ovn-northd-0" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.673318 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5d5168b9-9cea-4a4c-9a44-7fd4779bf302-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-twwlw\" (UID: \"5d5168b9-9cea-4a4c-9a44-7fd4779bf302\") " pod="openstack/dnsmasq-dns-8554648995-twwlw" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.674324 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5d5168b9-9cea-4a4c-9a44-7fd4779bf302-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-twwlw\" (UID: \"5d5168b9-9cea-4a4c-9a44-7fd4779bf302\") " pod="openstack/dnsmasq-dns-8554648995-twwlw" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.678929 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/52d82257-56bf-48c9-b8e1-8cdab6278855-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"52d82257-56bf-48c9-b8e1-8cdab6278855\") " pod="openstack/ovn-northd-0" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.681384 4901 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52d82257-56bf-48c9-b8e1-8cdab6278855-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"52d82257-56bf-48c9-b8e1-8cdab6278855\") " pod="openstack/ovn-northd-0" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.681698 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52d82257-56bf-48c9-b8e1-8cdab6278855-config\") pod \"ovn-northd-0\" (UID: \"52d82257-56bf-48c9-b8e1-8cdab6278855\") " pod="openstack/ovn-northd-0" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.696260 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/52d82257-56bf-48c9-b8e1-8cdab6278855-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"52d82257-56bf-48c9-b8e1-8cdab6278855\") " pod="openstack/ovn-northd-0" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.725724 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ndt8\" (UniqueName: \"kubernetes.io/projected/52d82257-56bf-48c9-b8e1-8cdab6278855-kube-api-access-6ndt8\") pod \"ovn-northd-0\" (UID: \"52d82257-56bf-48c9-b8e1-8cdab6278855\") " pod="openstack/ovn-northd-0" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.728015 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7xrz\" (UniqueName: \"kubernetes.io/projected/5d5168b9-9cea-4a4c-9a44-7fd4779bf302-kube-api-access-k7xrz\") pod \"dnsmasq-dns-8554648995-twwlw\" (UID: \"5d5168b9-9cea-4a4c-9a44-7fd4779bf302\") " pod="openstack/dnsmasq-dns-8554648995-twwlw" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.775755 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-f9hjt" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.854233 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5mfk4" podUID="715a873e-0260-4254-86a6-203a2e08e36d" containerName="registry-server" probeResult="failure" output=< Feb 02 10:54:43 crc kubenswrapper[4901]: timeout: failed to connect service ":50051" within 1s Feb 02 10:54:43 crc kubenswrapper[4901]: > Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.867909 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-f9hjt" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.868310 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-f9hjt" event={"ID":"984f7ae8-8c3c-41ec-85a4-dd0b31764800","Type":"ContainerDied","Data":"3eb28519d6186cbcf52a3bf4c06c2f236c704456af8edd924a88007a3c138d0f"} Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.879428 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bd4qc\" (UniqueName: \"kubernetes.io/projected/984f7ae8-8c3c-41ec-85a4-dd0b31764800-kube-api-access-bd4qc\") pod \"984f7ae8-8c3c-41ec-85a4-dd0b31764800\" (UID: \"984f7ae8-8c3c-41ec-85a4-dd0b31764800\") " Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.879506 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/984f7ae8-8c3c-41ec-85a4-dd0b31764800-dns-svc\") pod \"984f7ae8-8c3c-41ec-85a4-dd0b31764800\" (UID: \"984f7ae8-8c3c-41ec-85a4-dd0b31764800\") " Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.879624 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/984f7ae8-8c3c-41ec-85a4-dd0b31764800-config\") pod \"984f7ae8-8c3c-41ec-85a4-dd0b31764800\" (UID: \"984f7ae8-8c3c-41ec-85a4-dd0b31764800\") " Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.880760 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/984f7ae8-8c3c-41ec-85a4-dd0b31764800-config" (OuterVolumeSpecName: "config") pod "984f7ae8-8c3c-41ec-85a4-dd0b31764800" (UID: "984f7ae8-8c3c-41ec-85a4-dd0b31764800"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.881054 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/984f7ae8-8c3c-41ec-85a4-dd0b31764800-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "984f7ae8-8c3c-41ec-85a4-dd0b31764800" (UID: "984f7ae8-8c3c-41ec-85a4-dd0b31764800"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.921915 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/984f7ae8-8c3c-41ec-85a4-dd0b31764800-kube-api-access-bd4qc" (OuterVolumeSpecName: "kube-api-access-bd4qc") pod "984f7ae8-8c3c-41ec-85a4-dd0b31764800" (UID: "984f7ae8-8c3c-41ec-85a4-dd0b31764800"). InnerVolumeSpecName "kube-api-access-bd4qc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.939775 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-twwlw" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.948149 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.986156 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/984f7ae8-8c3c-41ec-85a4-dd0b31764800-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.986194 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bd4qc\" (UniqueName: \"kubernetes.io/projected/984f7ae8-8c3c-41ec-85a4-dd0b31764800-kube-api-access-bd4qc\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:43 crc kubenswrapper[4901]: I0202 10:54:43.986204 4901 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/984f7ae8-8c3c-41ec-85a4-dd0b31764800-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:44 crc kubenswrapper[4901]: I0202 10:54:44.113443 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-6v2gd" Feb 02 10:54:44 crc kubenswrapper[4901]: I0202 10:54:44.188487 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d5bfda5-826b-4d0d-81a6-87d5589f0c78-config\") pod \"9d5bfda5-826b-4d0d-81a6-87d5589f0c78\" (UID: \"9d5bfda5-826b-4d0d-81a6-87d5589f0c78\") " Feb 02 10:54:44 crc kubenswrapper[4901]: I0202 10:54:44.188919 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wpkm\" (UniqueName: \"kubernetes.io/projected/9d5bfda5-826b-4d0d-81a6-87d5589f0c78-kube-api-access-2wpkm\") pod \"9d5bfda5-826b-4d0d-81a6-87d5589f0c78\" (UID: \"9d5bfda5-826b-4d0d-81a6-87d5589f0c78\") " Feb 02 10:54:44 crc kubenswrapper[4901]: I0202 10:54:44.189030 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d5bfda5-826b-4d0d-81a6-87d5589f0c78-dns-svc\") pod \"9d5bfda5-826b-4d0d-81a6-87d5589f0c78\" (UID: \"9d5bfda5-826b-4d0d-81a6-87d5589f0c78\") " Feb 02 10:54:44 crc kubenswrapper[4901]: I0202 10:54:44.191233 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d5bfda5-826b-4d0d-81a6-87d5589f0c78-config" (OuterVolumeSpecName: "config") pod "9d5bfda5-826b-4d0d-81a6-87d5589f0c78" (UID: "9d5bfda5-826b-4d0d-81a6-87d5589f0c78"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:44 crc kubenswrapper[4901]: I0202 10:54:44.191383 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d5bfda5-826b-4d0d-81a6-87d5589f0c78-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9d5bfda5-826b-4d0d-81a6-87d5589f0c78" (UID: "9d5bfda5-826b-4d0d-81a6-87d5589f0c78"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:44 crc kubenswrapper[4901]: I0202 10:54:44.197642 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-zchj8"] Feb 02 10:54:44 crc kubenswrapper[4901]: I0202 10:54:44.210290 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d5bfda5-826b-4d0d-81a6-87d5589f0c78-kube-api-access-2wpkm" (OuterVolumeSpecName: "kube-api-access-2wpkm") pod "9d5bfda5-826b-4d0d-81a6-87d5589f0c78" (UID: "9d5bfda5-826b-4d0d-81a6-87d5589f0c78"). InnerVolumeSpecName "kube-api-access-2wpkm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:44 crc kubenswrapper[4901]: W0202 10:54:44.251280 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00bd3d1b_06ef_4c4f_b2cd_22bc427ea4a3.slice/crio-1cad8a723db83574efbae504f3d0087b5332f958856f59893dbf747df250a744 WatchSource:0}: Error finding container 1cad8a723db83574efbae504f3d0087b5332f958856f59893dbf747df250a744: Status 404 returned error can't find the container with id 1cad8a723db83574efbae504f3d0087b5332f958856f59893dbf747df250a744 Feb 02 10:54:44 crc kubenswrapper[4901]: I0202 10:54:44.290772 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d5bfda5-826b-4d0d-81a6-87d5589f0c78-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:44 crc kubenswrapper[4901]: I0202 10:54:44.290804 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wpkm\" (UniqueName: \"kubernetes.io/projected/9d5bfda5-826b-4d0d-81a6-87d5589f0c78-kube-api-access-2wpkm\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:44 crc kubenswrapper[4901]: I0202 10:54:44.290816 4901 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d5bfda5-826b-4d0d-81a6-87d5589f0c78-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:44 crc kubenswrapper[4901]: I0202 10:54:44.297405 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-nt5zf"] Feb 02 10:54:44 crc kubenswrapper[4901]: I0202 10:54:44.305085 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-f9hjt"] Feb 02 10:54:44 crc kubenswrapper[4901]: I0202 10:54:44.314021 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-f9hjt"] Feb 02 10:54:44 crc kubenswrapper[4901]: I0202 10:54:44.579726 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-twwlw"] Feb 02 10:54:44 crc kubenswrapper[4901]: W0202 10:54:44.596349 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d5168b9_9cea_4a4c_9a44_7fd4779bf302.slice/crio-b2fcb43428d3c789d2f9a8189a0d1e41b067a3783779a075451b2e161cfe6431 WatchSource:0}: Error finding container b2fcb43428d3c789d2f9a8189a0d1e41b067a3783779a075451b2e161cfe6431: Status 404 returned error can't find the container with id b2fcb43428d3c789d2f9a8189a0d1e41b067a3783779a075451b2e161cfe6431 Feb 02 10:54:44 crc kubenswrapper[4901]: I0202 10:54:44.659997 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 02 10:54:44 crc kubenswrapper[4901]: I0202 10:54:44.890648 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-twwlw" event={"ID":"5d5168b9-9cea-4a4c-9a44-7fd4779bf302","Type":"ContainerStarted","Data":"b2fcb43428d3c789d2f9a8189a0d1e41b067a3783779a075451b2e161cfe6431"} Feb 02 10:54:44 crc kubenswrapper[4901]: I0202 10:54:44.892918 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-zchj8" event={"ID":"00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3","Type":"ContainerStarted","Data":"1cad8a723db83574efbae504f3d0087b5332f958856f59893dbf747df250a744"} Feb 02 10:54:44 crc kubenswrapper[4901]: I0202 10:54:44.903522 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-nt5zf" 
event={"ID":"0c692cfe-56e1-4bba-b3e7-bf1d713d343d","Type":"ContainerStarted","Data":"ca10de51d242280a05efcb6c4c0dc11e39bc039d5925576978b736144cb013fb"} Feb 02 10:54:44 crc kubenswrapper[4901]: I0202 10:54:44.903984 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-nt5zf" event={"ID":"0c692cfe-56e1-4bba-b3e7-bf1d713d343d","Type":"ContainerStarted","Data":"a30f629af05ff58d34d5fce9911ebcfda710155a7378d14fbeaf49646c9ddec4"} Feb 02 10:54:44 crc kubenswrapper[4901]: I0202 10:54:44.911999 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"52d82257-56bf-48c9-b8e1-8cdab6278855","Type":"ContainerStarted","Data":"38bfe53b8729efffe2850a99a95eb49d6b92807ec3f673530713062b26a6a52a"} Feb 02 10:54:44 crc kubenswrapper[4901]: I0202 10:54:44.918512 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-6v2gd" event={"ID":"9d5bfda5-826b-4d0d-81a6-87d5589f0c78","Type":"ContainerDied","Data":"a7cafc663f56fa467669b1758494d5e8ed7c6a706f6550b4c150cbe483c9a56e"} Feb 02 10:54:44 crc kubenswrapper[4901]: I0202 10:54:44.918845 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-6v2gd" Feb 02 10:54:44 crc kubenswrapper[4901]: I0202 10:54:44.978553 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-nt5zf" podStartSLOduration=1.978523376 podStartE2EDuration="1.978523376s" podCreationTimestamp="2026-02-02 10:54:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:54:44.929393588 +0000 UTC m=+971.947733684" watchObservedRunningTime="2026-02-02 10:54:44.978523376 +0000 UTC m=+971.996863472" Feb 02 10:54:45 crc kubenswrapper[4901]: I0202 10:54:45.056665 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-6v2gd"] Feb 02 10:54:45 crc kubenswrapper[4901]: I0202 10:54:45.068255 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-6v2gd"] Feb 02 10:54:45 crc kubenswrapper[4901]: I0202 10:54:45.691385 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="984f7ae8-8c3c-41ec-85a4-dd0b31764800" path="/var/lib/kubelet/pods/984f7ae8-8c3c-41ec-85a4-dd0b31764800/volumes" Feb 02 10:54:45 crc kubenswrapper[4901]: I0202 10:54:45.692177 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d5bfda5-826b-4d0d-81a6-87d5589f0c78" path="/var/lib/kubelet/pods/9d5bfda5-826b-4d0d-81a6-87d5589f0c78/volumes" Feb 02 10:54:45 crc kubenswrapper[4901]: I0202 10:54:45.935132 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"93ecc7a4-4c23-488f-8d75-8fee0246afe4","Type":"ContainerStarted","Data":"b0d38d118609714056ca9d27dd89f6fe2273395e6fdd306f15b32f895e1563d0"} Feb 02 10:54:45 crc kubenswrapper[4901]: I0202 10:54:45.944941 4901 generic.go:334] "Generic (PLEG): container finished" podID="5d5168b9-9cea-4a4c-9a44-7fd4779bf302" containerID="009305dbe20d978483def2fc5037aa7fb36ea2bda9933a0734aac44f5905ae82" exitCode=0 Feb 02 10:54:45 crc kubenswrapper[4901]: I0202 10:54:45.945048 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-twwlw" event={"ID":"5d5168b9-9cea-4a4c-9a44-7fd4779bf302","Type":"ContainerDied","Data":"009305dbe20d978483def2fc5037aa7fb36ea2bda9933a0734aac44f5905ae82"} Feb 02 10:54:45 
crc kubenswrapper[4901]: I0202 10:54:45.948791 4901 generic.go:334] "Generic (PLEG): container finished" podID="00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3" containerID="d7fc37ebf0750188e16200d3754e8effa4ae59d98e62b468b328fc178f340bca" exitCode=0 Feb 02 10:54:45 crc kubenswrapper[4901]: I0202 10:54:45.948966 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-zchj8" event={"ID":"00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3","Type":"ContainerDied","Data":"d7fc37ebf0750188e16200d3754e8effa4ae59d98e62b468b328fc178f340bca"} Feb 02 10:54:46 crc kubenswrapper[4901]: I0202 10:54:46.958305 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-zchj8" event={"ID":"00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3","Type":"ContainerStarted","Data":"8c06ab667935b4c350eaca3d0ce491b06404b22305fc49b84a6186e7e8803bcf"} Feb 02 10:54:46 crc kubenswrapper[4901]: I0202 10:54:46.958803 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bc7876d45-zchj8" Feb 02 10:54:46 crc kubenswrapper[4901]: I0202 10:54:46.960372 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"52d82257-56bf-48c9-b8e1-8cdab6278855","Type":"ContainerStarted","Data":"07f720fcb2625bab9b74b0702b135dfe9c2260283e9465cf6fd0bdb70c8bf8a0"} Feb 02 10:54:46 crc kubenswrapper[4901]: I0202 10:54:46.960418 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"52d82257-56bf-48c9-b8e1-8cdab6278855","Type":"ContainerStarted","Data":"dd3a14c2c4f9b23e62d47113ef0228e72c79e524e7e7e6a53f0d9d268958ad35"} Feb 02 10:54:46 crc kubenswrapper[4901]: I0202 10:54:46.960489 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 02 10:54:46 crc kubenswrapper[4901]: I0202 10:54:46.963311 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-twwlw" event={"ID":"5d5168b9-9cea-4a4c-9a44-7fd4779bf302","Type":"ContainerStarted","Data":"0a4e39e8ccab43a0509b387636de5026bfba1b5c43e74f1ac7a3dbb503573c6e"} Feb 02 10:54:46 crc kubenswrapper[4901]: I0202 10:54:46.963418 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-twwlw" Feb 02 10:54:46 crc kubenswrapper[4901]: I0202 10:54:46.983067 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bc7876d45-zchj8" podStartSLOduration=3.499834875 podStartE2EDuration="3.983039371s" podCreationTimestamp="2026-02-02 10:54:43 +0000 UTC" firstStartedPulling="2026-02-02 10:54:44.255848455 +0000 UTC m=+971.274188551" lastFinishedPulling="2026-02-02 10:54:44.739052951 +0000 UTC m=+971.757393047" observedRunningTime="2026-02-02 10:54:46.977442411 +0000 UTC m=+973.995782507" watchObservedRunningTime="2026-02-02 10:54:46.983039371 +0000 UTC m=+974.001379467" Feb 02 10:54:47 crc kubenswrapper[4901]: I0202 10:54:47.003842 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-twwlw" podStartSLOduration=3.560764179 podStartE2EDuration="4.003820951s" podCreationTimestamp="2026-02-02 10:54:43 +0000 UTC" firstStartedPulling="2026-02-02 10:54:44.598786666 +0000 UTC m=+971.617126762" lastFinishedPulling="2026-02-02 10:54:45.041843438 +0000 UTC m=+972.060183534" observedRunningTime="2026-02-02 10:54:46.999170944 +0000 UTC m=+974.017511060" watchObservedRunningTime="2026-02-02 10:54:47.003820951 +0000 UTC m=+974.022161047" Feb 02 10:54:47 
crc kubenswrapper[4901]: I0202 10:54:47.025647 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.424808528 podStartE2EDuration="4.025626595s" podCreationTimestamp="2026-02-02 10:54:43 +0000 UTC" firstStartedPulling="2026-02-02 10:54:44.663976024 +0000 UTC m=+971.682316120" lastFinishedPulling="2026-02-02 10:54:46.264794071 +0000 UTC m=+973.283134187" observedRunningTime="2026-02-02 10:54:47.016441266 +0000 UTC m=+974.034781382" watchObservedRunningTime="2026-02-02 10:54:47.025626595 +0000 UTC m=+974.043966681" Feb 02 10:54:47 crc kubenswrapper[4901]: I0202 10:54:47.534516 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 02 10:54:47 crc kubenswrapper[4901]: I0202 10:54:47.534617 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 02 10:54:47 crc kubenswrapper[4901]: I0202 10:54:47.624363 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 02 10:54:48 crc kubenswrapper[4901]: I0202 10:54:48.073676 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 02 10:54:49 crc kubenswrapper[4901]: I0202 10:54:49.557241 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-zchj8"] Feb 02 10:54:49 crc kubenswrapper[4901]: I0202 10:54:49.557606 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bc7876d45-zchj8" podUID="00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3" containerName="dnsmasq-dns" containerID="cri-o://8c06ab667935b4c350eaca3d0ce491b06404b22305fc49b84a6186e7e8803bcf" gracePeriod=10 Feb 02 10:54:49 crc kubenswrapper[4901]: I0202 10:54:49.600176 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 02 10:54:49 crc kubenswrapper[4901]: I0202 10:54:49.603864 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-vj5hj"] Feb 02 10:54:49 crc kubenswrapper[4901]: I0202 10:54:49.605390 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-vj5hj" Feb 02 10:54:49 crc kubenswrapper[4901]: I0202 10:54:49.617368 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-vj5hj"] Feb 02 10:54:49 crc kubenswrapper[4901]: I0202 10:54:49.641733 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bdrnz" Feb 02 10:54:49 crc kubenswrapper[4901]: I0202 10:54:49.711212 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a32397c5-9ffc-4b59-abac-4376cfb81d4a-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-vj5hj\" (UID: \"a32397c5-9ffc-4b59-abac-4376cfb81d4a\") " pod="openstack/dnsmasq-dns-b8fbc5445-vj5hj" Feb 02 10:54:49 crc kubenswrapper[4901]: I0202 10:54:49.711260 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a32397c5-9ffc-4b59-abac-4376cfb81d4a-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-vj5hj\" (UID: \"a32397c5-9ffc-4b59-abac-4376cfb81d4a\") " pod="openstack/dnsmasq-dns-b8fbc5445-vj5hj" Feb 02 10:54:49 crc kubenswrapper[4901]: I0202 10:54:49.711289 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc5rj\" (UniqueName: \"kubernetes.io/projected/a32397c5-9ffc-4b59-abac-4376cfb81d4a-kube-api-access-bc5rj\") pod \"dnsmasq-dns-b8fbc5445-vj5hj\" (UID: \"a32397c5-9ffc-4b59-abac-4376cfb81d4a\") " pod="openstack/dnsmasq-dns-b8fbc5445-vj5hj" Feb 02 10:54:49 crc kubenswrapper[4901]: I0202 10:54:49.711401 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a32397c5-9ffc-4b59-abac-4376cfb81d4a-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-vj5hj\" (UID: \"a32397c5-9ffc-4b59-abac-4376cfb81d4a\") " pod="openstack/dnsmasq-dns-b8fbc5445-vj5hj" Feb 02 10:54:49 crc kubenswrapper[4901]: I0202 10:54:49.711454 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a32397c5-9ffc-4b59-abac-4376cfb81d4a-config\") pod \"dnsmasq-dns-b8fbc5445-vj5hj\" (UID: \"a32397c5-9ffc-4b59-abac-4376cfb81d4a\") " pod="openstack/dnsmasq-dns-b8fbc5445-vj5hj" Feb 02 10:54:49 crc kubenswrapper[4901]: I0202 10:54:49.750532 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bdrnz" Feb 02 10:54:49 crc kubenswrapper[4901]: I0202 10:54:49.813079 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a32397c5-9ffc-4b59-abac-4376cfb81d4a-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-vj5hj\" (UID: \"a32397c5-9ffc-4b59-abac-4376cfb81d4a\") " pod="openstack/dnsmasq-dns-b8fbc5445-vj5hj" Feb 02 10:54:49 crc kubenswrapper[4901]: I0202 10:54:49.814096 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a32397c5-9ffc-4b59-abac-4376cfb81d4a-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-vj5hj\" (UID: \"a32397c5-9ffc-4b59-abac-4376cfb81d4a\") " pod="openstack/dnsmasq-dns-b8fbc5445-vj5hj" Feb 02 10:54:49 crc kubenswrapper[4901]: I0202 10:54:49.814133 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a32397c5-9ffc-4b59-abac-4376cfb81d4a-config\") pod \"dnsmasq-dns-b8fbc5445-vj5hj\" (UID: \"a32397c5-9ffc-4b59-abac-4376cfb81d4a\") " pod="openstack/dnsmasq-dns-b8fbc5445-vj5hj" Feb 02 10:54:49 crc kubenswrapper[4901]: I0202 10:54:49.814206 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a32397c5-9ffc-4b59-abac-4376cfb81d4a-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-vj5hj\" (UID: \"a32397c5-9ffc-4b59-abac-4376cfb81d4a\") " pod="openstack/dnsmasq-dns-b8fbc5445-vj5hj" Feb 02 10:54:49 crc kubenswrapper[4901]: I0202 10:54:49.814231 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a32397c5-9ffc-4b59-abac-4376cfb81d4a-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-vj5hj\" (UID: \"a32397c5-9ffc-4b59-abac-4376cfb81d4a\") " pod="openstack/dnsmasq-dns-b8fbc5445-vj5hj" Feb 02 10:54:49 crc kubenswrapper[4901]: I0202 10:54:49.814250 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc5rj\" (UniqueName: \"kubernetes.io/projected/a32397c5-9ffc-4b59-abac-4376cfb81d4a-kube-api-access-bc5rj\") pod \"dnsmasq-dns-b8fbc5445-vj5hj\" (UID: \"a32397c5-9ffc-4b59-abac-4376cfb81d4a\") " pod="openstack/dnsmasq-dns-b8fbc5445-vj5hj" Feb 02 10:54:49 crc kubenswrapper[4901]: I0202 10:54:49.814251 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a32397c5-9ffc-4b59-abac-4376cfb81d4a-config\") pod \"dnsmasq-dns-b8fbc5445-vj5hj\" (UID: \"a32397c5-9ffc-4b59-abac-4376cfb81d4a\") " pod="openstack/dnsmasq-dns-b8fbc5445-vj5hj" Feb 02 10:54:49 crc kubenswrapper[4901]: I0202 10:54:49.814877 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a32397c5-9ffc-4b59-abac-4376cfb81d4a-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-vj5hj\" (UID: \"a32397c5-9ffc-4b59-abac-4376cfb81d4a\") " pod="openstack/dnsmasq-dns-b8fbc5445-vj5hj" Feb 02 10:54:49 crc kubenswrapper[4901]: I0202 10:54:49.814935 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a32397c5-9ffc-4b59-abac-4376cfb81d4a-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-vj5hj\" (UID: \"a32397c5-9ffc-4b59-abac-4376cfb81d4a\") " pod="openstack/dnsmasq-dns-b8fbc5445-vj5hj" Feb 02 10:54:49 crc kubenswrapper[4901]: I0202 10:54:49.835476 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc5rj\" (UniqueName: \"kubernetes.io/projected/a32397c5-9ffc-4b59-abac-4376cfb81d4a-kube-api-access-bc5rj\") pod \"dnsmasq-dns-b8fbc5445-vj5hj\" (UID: \"a32397c5-9ffc-4b59-abac-4376cfb81d4a\") " pod="openstack/dnsmasq-dns-b8fbc5445-vj5hj" Feb 02 10:54:49 crc kubenswrapper[4901]: I0202 10:54:49.889372 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bdrnz"] Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.008171 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-vj5hj" Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.012686 4901 generic.go:334] "Generic (PLEG): container finished" podID="00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3" containerID="8c06ab667935b4c350eaca3d0ce491b06404b22305fc49b84a6186e7e8803bcf" exitCode=0 Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.012728 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-zchj8" event={"ID":"00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3","Type":"ContainerDied","Data":"8c06ab667935b4c350eaca3d0ce491b06404b22305fc49b84a6186e7e8803bcf"} Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.141153 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-zchj8" Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.222144 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3-dns-svc\") pod \"00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3\" (UID: \"00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3\") " Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.222582 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3-ovsdbserver-sb\") pod \"00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3\" (UID: \"00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3\") " Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.222603 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4762\" (UniqueName: \"kubernetes.io/projected/00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3-kube-api-access-l4762\") pod \"00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3\" (UID: \"00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3\") " Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.222633 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3-config\") pod \"00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3\" (UID: \"00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3\") " Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.243506 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3-kube-api-access-l4762" (OuterVolumeSpecName: "kube-api-access-l4762") pod "00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3" (UID: "00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3"). InnerVolumeSpecName "kube-api-access-l4762". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.324307 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4762\" (UniqueName: \"kubernetes.io/projected/00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3-kube-api-access-l4762\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.338709 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3-config" (OuterVolumeSpecName: "config") pod "00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3" (UID: "00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.347062 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3" (UID: "00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.354584 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3" (UID: "00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.428639 4901 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.428683 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.428700 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.431195 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-vj5hj"] Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.771153 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Feb 02 10:54:50 crc kubenswrapper[4901]: E0202 10:54:50.771687 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3" containerName="dnsmasq-dns" Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.771705 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3" containerName="dnsmasq-dns" Feb 02 10:54:50 crc kubenswrapper[4901]: E0202 10:54:50.771738 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3" containerName="init" Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.771745 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3" containerName="init" Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.771958 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3" containerName="dnsmasq-dns" Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.777035 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.782375 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.782784 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.782931 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-6zdfk" Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.782969 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.827390 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.836174 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"4b4d5a91-d330-499c-9123-35b58d8c55d5\") " pod="openstack/swift-storage-0" Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.836256 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4b4d5a91-d330-499c-9123-35b58d8c55d5-lock\") pod \"swift-storage-0\" (UID: \"4b4d5a91-d330-499c-9123-35b58d8c55d5\") " pod="openstack/swift-storage-0" Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.836324 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4b4d5a91-d330-499c-9123-35b58d8c55d5-etc-swift\") pod \"swift-storage-0\" (UID: \"4b4d5a91-d330-499c-9123-35b58d8c55d5\") " pod="openstack/swift-storage-0" Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.836353 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4b4d5a91-d330-499c-9123-35b58d8c55d5-cache\") pod \"swift-storage-0\" (UID: \"4b4d5a91-d330-499c-9123-35b58d8c55d5\") " pod="openstack/swift-storage-0" Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.836415 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b4d5a91-d330-499c-9123-35b58d8c55d5-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"4b4d5a91-d330-499c-9123-35b58d8c55d5\") " pod="openstack/swift-storage-0" Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.836463 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl66n\" (UniqueName: \"kubernetes.io/projected/4b4d5a91-d330-499c-9123-35b58d8c55d5-kube-api-access-vl66n\") pod \"swift-storage-0\" (UID: \"4b4d5a91-d330-499c-9123-35b58d8c55d5\") " pod="openstack/swift-storage-0" Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.938038 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b4d5a91-d330-499c-9123-35b58d8c55d5-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"4b4d5a91-d330-499c-9123-35b58d8c55d5\") " pod="openstack/swift-storage-0" Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.938120 4901 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vl66n\" (UniqueName: \"kubernetes.io/projected/4b4d5a91-d330-499c-9123-35b58d8c55d5-kube-api-access-vl66n\") pod \"swift-storage-0\" (UID: \"4b4d5a91-d330-499c-9123-35b58d8c55d5\") " pod="openstack/swift-storage-0" Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.938180 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"4b4d5a91-d330-499c-9123-35b58d8c55d5\") " pod="openstack/swift-storage-0" Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.938203 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4b4d5a91-d330-499c-9123-35b58d8c55d5-lock\") pod \"swift-storage-0\" (UID: \"4b4d5a91-d330-499c-9123-35b58d8c55d5\") " pod="openstack/swift-storage-0" Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.938247 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4b4d5a91-d330-499c-9123-35b58d8c55d5-etc-swift\") pod \"swift-storage-0\" (UID: \"4b4d5a91-d330-499c-9123-35b58d8c55d5\") " pod="openstack/swift-storage-0" Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.938268 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4b4d5a91-d330-499c-9123-35b58d8c55d5-cache\") pod \"swift-storage-0\" (UID: \"4b4d5a91-d330-499c-9123-35b58d8c55d5\") " pod="openstack/swift-storage-0" Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.938862 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4b4d5a91-d330-499c-9123-35b58d8c55d5-cache\") pod \"swift-storage-0\" (UID: \"4b4d5a91-d330-499c-9123-35b58d8c55d5\") " pod="openstack/swift-storage-0" Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.940012 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4b4d5a91-d330-499c-9123-35b58d8c55d5-lock\") pod \"swift-storage-0\" (UID: \"4b4d5a91-d330-499c-9123-35b58d8c55d5\") " pod="openstack/swift-storage-0" Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.940119 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"4b4d5a91-d330-499c-9123-35b58d8c55d5\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/swift-storage-0" Feb 02 10:54:50 crc kubenswrapper[4901]: E0202 10:54:50.940582 4901 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 02 10:54:50 crc kubenswrapper[4901]: E0202 10:54:50.940615 4901 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 02 10:54:50 crc kubenswrapper[4901]: E0202 10:54:50.940687 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b4d5a91-d330-499c-9123-35b58d8c55d5-etc-swift podName:4b4d5a91-d330-499c-9123-35b58d8c55d5 nodeName:}" failed. No retries permitted until 2026-02-02 10:54:51.440662487 +0000 UTC m=+978.459002583 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4b4d5a91-d330-499c-9123-35b58d8c55d5-etc-swift") pod "swift-storage-0" (UID: "4b4d5a91-d330-499c-9123-35b58d8c55d5") : configmap "swift-ring-files" not found
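The record above is the first of five identical etc-swift setup failures: the volume is a projected volume sourcing the swift-ring-files ConfigMap, which does not exist yet, so the kubelet requeues the mount with exponential backoff. The durationBeforeRetry values that follow in this log double each time (500ms, 1s, 2s, 4s, 8s). A minimal Go sketch of that schedule; the constants are assumptions chosen to match these records (the kubelet's real backoff lives in its volume operation executor and caps the delay), not a copy of the kubelet source:

    // Doubling retry schedule matching the durationBeforeRetry values above.
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        delay := 500 * time.Millisecond           // first retry, as logged
        maxDelay := 2*time.Minute + 2*time.Second // assumed cap
        for attempt := 1; attempt <= 8; attempt++ {
            fmt.Printf("attempt %d: durationBeforeRetry %v\n", attempt, delay)
            delay *= 2 // exponential backoff, factor 2
            if delay > maxDelay {
                delay = maxDelay
            }
        }
    }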
Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.966316 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b4d5a91-d330-499c-9123-35b58d8c55d5-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"4b4d5a91-d330-499c-9123-35b58d8c55d5\") " pod="openstack/swift-storage-0" Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.970520 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl66n\" (UniqueName: \"kubernetes.io/projected/4b4d5a91-d330-499c-9123-35b58d8c55d5-kube-api-access-vl66n\") pod \"swift-storage-0\" (UID: \"4b4d5a91-d330-499c-9123-35b58d8c55d5\") " pod="openstack/swift-storage-0" Feb 02 10:54:50 crc kubenswrapper[4901]: I0202 10:54:50.980002 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"4b4d5a91-d330-499c-9123-35b58d8c55d5\") " pod="openstack/swift-storage-0" Feb 02 10:54:51 crc kubenswrapper[4901]: I0202 10:54:51.031040 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-zchj8" event={"ID":"00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3","Type":"ContainerDied","Data":"1cad8a723db83574efbae504f3d0087b5332f958856f59893dbf747df250a744"} Feb 02 10:54:51 crc kubenswrapper[4901]: I0202 10:54:51.031102 4901 scope.go:117] "RemoveContainer" containerID="8c06ab667935b4c350eaca3d0ce491b06404b22305fc49b84a6186e7e8803bcf" Feb 02 10:54:51 crc kubenswrapper[4901]: I0202 10:54:51.031405 4901 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-zchj8" Feb 02 10:54:51 crc kubenswrapper[4901]: I0202 10:54:51.034788 4901 generic.go:334] "Generic (PLEG): container finished" podID="a32397c5-9ffc-4b59-abac-4376cfb81d4a" containerID="cd3cb5a398323c5815b4d0172c885d14a29c70c4af4229133ef5b8f24cb36439" exitCode=0 Feb 02 10:54:51 crc kubenswrapper[4901]: I0202 10:54:51.034840 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-vj5hj" event={"ID":"a32397c5-9ffc-4b59-abac-4376cfb81d4a","Type":"ContainerDied","Data":"cd3cb5a398323c5815b4d0172c885d14a29c70c4af4229133ef5b8f24cb36439"} Feb 02 10:54:51 crc kubenswrapper[4901]: I0202 10:54:51.034860 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-vj5hj" event={"ID":"a32397c5-9ffc-4b59-abac-4376cfb81d4a","Type":"ContainerStarted","Data":"ddcdbae9c5acbd92bd22dc005aa5f62b4b6b8a8c0cac3a7912317f50176b4579"} Feb 02 10:54:51 crc kubenswrapper[4901]: I0202 10:54:51.041689 4901 generic.go:334] "Generic (PLEG): container finished" podID="93ecc7a4-4c23-488f-8d75-8fee0246afe4" containerID="b0d38d118609714056ca9d27dd89f6fe2273395e6fdd306f15b32f895e1563d0" exitCode=0 Feb 02 10:54:51 crc kubenswrapper[4901]: I0202 10:54:51.041757 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"93ecc7a4-4c23-488f-8d75-8fee0246afe4","Type":"ContainerDied","Data":"b0d38d118609714056ca9d27dd89f6fe2273395e6fdd306f15b32f895e1563d0"} Feb 02 10:54:51 crc kubenswrapper[4901]: I0202 10:54:51.041934 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bdrnz" podUID="de467848-25a5-4ed7-b072-e11de1d42561" containerName="registry-server" containerID="cri-o://d98e983887b15fdd1bcaa596956cbc381eb56e53d9dbbbaeffb5fc578ed2f2e8" gracePeriod=2 Feb 02 10:54:51 crc kubenswrapper[4901]: I0202 10:54:51.054845 4901 scope.go:117] "RemoveContainer" containerID="d7fc37ebf0750188e16200d3754e8effa4ae59d98e62b468b328fc178f340bca" Feb 02 10:54:51 crc kubenswrapper[4901]: I0202 10:54:51.268836 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-zchj8"] Feb 02 10:54:51 crc kubenswrapper[4901]: I0202 10:54:51.297634 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-zchj8"] Feb 02 10:54:51 crc kubenswrapper[4901]: I0202 10:54:51.459858 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4b4d5a91-d330-499c-9123-35b58d8c55d5-etc-swift\") pod \"swift-storage-0\" (UID: \"4b4d5a91-d330-499c-9123-35b58d8c55d5\") " pod="openstack/swift-storage-0" Feb 02 10:54:51 crc kubenswrapper[4901]: E0202 10:54:51.460262 4901 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 02 10:54:51 crc kubenswrapper[4901]: E0202 10:54:51.460616 4901 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 02 10:54:51 crc kubenswrapper[4901]: E0202 10:54:51.460698 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b4d5a91-d330-499c-9123-35b58d8c55d5-etc-swift podName:4b4d5a91-d330-499c-9123-35b58d8c55d5 nodeName:}" failed. No retries permitted until 2026-02-02 10:54:52.460674863 +0000 UTC m=+979.479014959 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4b4d5a91-d330-499c-9123-35b58d8c55d5-etc-swift") pod "swift-storage-0" (UID: "4b4d5a91-d330-499c-9123-35b58d8c55d5") : configmap "swift-ring-files" not found Feb 02 10:54:51 crc kubenswrapper[4901]: I0202 10:54:51.566372 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bdrnz" Feb 02 10:54:51 crc kubenswrapper[4901]: I0202 10:54:51.663713 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de467848-25a5-4ed7-b072-e11de1d42561-catalog-content\") pod \"de467848-25a5-4ed7-b072-e11de1d42561\" (UID: \"de467848-25a5-4ed7-b072-e11de1d42561\") " Feb 02 10:54:51 crc kubenswrapper[4901]: I0202 10:54:51.663790 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de467848-25a5-4ed7-b072-e11de1d42561-utilities\") pod \"de467848-25a5-4ed7-b072-e11de1d42561\" (UID: \"de467848-25a5-4ed7-b072-e11de1d42561\") " Feb 02 10:54:51 crc kubenswrapper[4901]: I0202 10:54:51.664020 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9t4h\" (UniqueName: \"kubernetes.io/projected/de467848-25a5-4ed7-b072-e11de1d42561-kube-api-access-h9t4h\") pod \"de467848-25a5-4ed7-b072-e11de1d42561\" (UID: \"de467848-25a5-4ed7-b072-e11de1d42561\") " Feb 02 10:54:51 crc kubenswrapper[4901]: I0202 10:54:51.665642 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de467848-25a5-4ed7-b072-e11de1d42561-utilities" (OuterVolumeSpecName: "utilities") pod "de467848-25a5-4ed7-b072-e11de1d42561" (UID: "de467848-25a5-4ed7-b072-e11de1d42561"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4901]: I0202 10:54:51.668780 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de467848-25a5-4ed7-b072-e11de1d42561-kube-api-access-h9t4h" (OuterVolumeSpecName: "kube-api-access-h9t4h") pod "de467848-25a5-4ed7-b072-e11de1d42561" (UID: "de467848-25a5-4ed7-b072-e11de1d42561"). InnerVolumeSpecName "kube-api-access-h9t4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4901]: I0202 10:54:51.689041 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3" path="/var/lib/kubelet/pods/00bd3d1b-06ef-4c4f-b2cd-22bc427ea4a3/volumes" Feb 02 10:54:51 crc kubenswrapper[4901]: I0202 10:54:51.697210 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de467848-25a5-4ed7-b072-e11de1d42561-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de467848-25a5-4ed7-b072-e11de1d42561" (UID: "de467848-25a5-4ed7-b072-e11de1d42561"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4901]: I0202 10:54:51.765426 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de467848-25a5-4ed7-b072-e11de1d42561-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4901]: I0202 10:54:51.765464 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de467848-25a5-4ed7-b072-e11de1d42561-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4901]: I0202 10:54:51.765479 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9t4h\" (UniqueName: \"kubernetes.io/projected/de467848-25a5-4ed7-b072-e11de1d42561-kube-api-access-h9t4h\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:52 crc kubenswrapper[4901]: I0202 10:54:52.059831 4901 generic.go:334] "Generic (PLEG): container finished" podID="de467848-25a5-4ed7-b072-e11de1d42561" containerID="d98e983887b15fdd1bcaa596956cbc381eb56e53d9dbbbaeffb5fc578ed2f2e8" exitCode=0 Feb 02 10:54:52 crc kubenswrapper[4901]: I0202 10:54:52.059935 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bdrnz" Feb 02 10:54:52 crc kubenswrapper[4901]: I0202 10:54:52.059967 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bdrnz" event={"ID":"de467848-25a5-4ed7-b072-e11de1d42561","Type":"ContainerDied","Data":"d98e983887b15fdd1bcaa596956cbc381eb56e53d9dbbbaeffb5fc578ed2f2e8"} Feb 02 10:54:52 crc kubenswrapper[4901]: I0202 10:54:52.061977 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bdrnz" event={"ID":"de467848-25a5-4ed7-b072-e11de1d42561","Type":"ContainerDied","Data":"6c32f46b7040f93c8c7867be3aec3f55f94dbdaaef75d5fbac1f0d3eab6aa011"} Feb 02 10:54:52 crc kubenswrapper[4901]: I0202 10:54:52.062041 4901 scope.go:117] "RemoveContainer" containerID="d98e983887b15fdd1bcaa596956cbc381eb56e53d9dbbbaeffb5fc578ed2f2e8" Feb 02 10:54:52 crc kubenswrapper[4901]: I0202 10:54:52.072759 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-vj5hj" event={"ID":"a32397c5-9ffc-4b59-abac-4376cfb81d4a","Type":"ContainerStarted","Data":"8e4489d032d6dbc762715b387bdf17e79591484090506cc386c55a253d4d7444"} Feb 02 10:54:52 crc kubenswrapper[4901]: I0202 10:54:52.072910 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-vj5hj" Feb 02 10:54:52 crc kubenswrapper[4901]: I0202 10:54:52.078765 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"93ecc7a4-4c23-488f-8d75-8fee0246afe4","Type":"ContainerStarted","Data":"bdb9cbcaf6116899c9a2f509e79b8eadcb2c5906d0af7a7f084065c3ef1b2755"} Feb 02 10:54:52 crc kubenswrapper[4901]: I0202 10:54:52.105911 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-vj5hj" podStartSLOduration=3.105888278 podStartE2EDuration="3.105888278s" podCreationTimestamp="2026-02-02 10:54:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:54:52.102100013 +0000 UTC m=+979.120440149" watchObservedRunningTime="2026-02-02 10:54:52.105888278 +0000 UTC m=+979.124228374" Feb 02 10:54:52 crc kubenswrapper[4901]: I0202 
10:54:52.124801 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=-9223371988.730001 podStartE2EDuration="48.124773479s" podCreationTimestamp="2026-02-02 10:54:04 +0000 UTC" firstStartedPulling="2026-02-02 10:54:06.886542142 +0000 UTC m=+933.904882238" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:54:52.119490967 +0000 UTC m=+979.137831063" watchObservedRunningTime="2026-02-02 10:54:52.124773479 +0000 UTC m=+979.143113565" Feb 02 10:54:52 crc kubenswrapper[4901]: I0202 10:54:52.155612 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bdrnz"] Feb 02 10:54:52 crc kubenswrapper[4901]: I0202 10:54:52.160929 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bdrnz"] Feb 02 10:54:52 crc kubenswrapper[4901]: I0202 10:54:52.477404 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4b4d5a91-d330-499c-9123-35b58d8c55d5-etc-swift\") pod \"swift-storage-0\" (UID: \"4b4d5a91-d330-499c-9123-35b58d8c55d5\") " pod="openstack/swift-storage-0" Feb 02 10:54:52 crc kubenswrapper[4901]: E0202 10:54:52.477658 4901 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 02 10:54:52 crc kubenswrapper[4901]: E0202 10:54:52.477690 4901 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 02 10:54:52 crc kubenswrapper[4901]: E0202 10:54:52.477761 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b4d5a91-d330-499c-9123-35b58d8c55d5-etc-swift podName:4b4d5a91-d330-499c-9123-35b58d8c55d5 nodeName:}" failed. No retries permitted until 2026-02-02 10:54:54.477739391 +0000 UTC m=+981.496079487 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4b4d5a91-d330-499c-9123-35b58d8c55d5-etc-swift") pod "swift-storage-0" (UID: "4b4d5a91-d330-499c-9123-35b58d8c55d5") : configmap "swift-ring-files" not found
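The pod_startup_latency_tracker record above for openstack/openstack-galera-0 shows podStartSLOduration=-9223371988.730001 even though the pod took only about 48s to start. The cause is visible in the same record: lastFinishedPulling is the zero time (0001-01-01 00:00:00 +0000 UTC), and Go's time.Time.Sub saturates at the minimum time.Duration (math.MinInt64 nanoseconds) when the real difference does not fit, after which the remaining arithmetic wraps around int64. A toy reproduction under the assumption that the tracker subtracts the image-pull window from the end-to-end duration (timestamps copied from the record; the exact formula is an assumption):

    // Reproduces the wrapped-around podStartSLOduration seen above.
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        created := time.Date(2026, 2, 2, 10, 54, 4, 0, time.UTC)
        running := created.Add(48124773479 * time.Nanosecond) // podStartE2EDuration=48.124773479s
        pullStart := time.Date(2026, 2, 2, 10, 54, 6, 886542142, time.UTC)
        var pullEnd time.Time // zero value, printed as 0001-01-01 00:00:00 +0000 UTC

        e2e := running.Sub(created)
        pull := pullEnd.Sub(pullStart) // saturates at the minimum time.Duration
        slo := e2e - pull              // int64 arithmetic wraps around

        fmt.Println(e2e)           // 48.124773479s
        fmt.Println(slo.Seconds()) // ≈ -9.22337198873e+09, as logged
    }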
Feb 02 10:54:52 crc kubenswrapper[4901]: I0202 10:54:52.822253 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5mfk4" Feb 02 10:54:52 crc kubenswrapper[4901]: I0202 10:54:52.878224 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5mfk4" Feb 02 10:54:53 crc kubenswrapper[4901]: I0202 10:54:53.689301 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de467848-25a5-4ed7-b072-e11de1d42561" path="/var/lib/kubelet/pods/de467848-25a5-4ed7-b072-e11de1d42561/volumes" Feb 02 10:54:53 crc kubenswrapper[4901]: I0202 10:54:53.942839 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-twwlw" Feb 02 10:54:54 crc kubenswrapper[4901]: I0202 10:54:54.004449 4901 scope.go:117] "RemoveContainer" containerID="6af36ab9f589e4d8b6f00a9eac87f4e5c30cd53cc3d9a7eeeaf7f8ff91359711" Feb 02 10:54:54 crc kubenswrapper[4901]: I0202 10:54:54.048368 4901 scope.go:117] "RemoveContainer" containerID="afa28b34028940dceb40e7ea298a8cfe9baf2744c49f7c430acfea296305723c" Feb 02 10:54:54 crc kubenswrapper[4901]: I0202 10:54:54.091956 4901 scope.go:117] "RemoveContainer" containerID="d98e983887b15fdd1bcaa596956cbc381eb56e53d9dbbbaeffb5fc578ed2f2e8" Feb 02 10:54:54 crc kubenswrapper[4901]: E0202 10:54:54.092854 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d98e983887b15fdd1bcaa596956cbc381eb56e53d9dbbbaeffb5fc578ed2f2e8\": container with ID starting with d98e983887b15fdd1bcaa596956cbc381eb56e53d9dbbbaeffb5fc578ed2f2e8 not found: ID does not exist" containerID="d98e983887b15fdd1bcaa596956cbc381eb56e53d9dbbbaeffb5fc578ed2f2e8" Feb 02 10:54:54 crc kubenswrapper[4901]: I0202 10:54:54.092885 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d98e983887b15fdd1bcaa596956cbc381eb56e53d9dbbbaeffb5fc578ed2f2e8"} err="failed to get container status \"d98e983887b15fdd1bcaa596956cbc381eb56e53d9dbbbaeffb5fc578ed2f2e8\": rpc error: code = NotFound desc = could not find container \"d98e983887b15fdd1bcaa596956cbc381eb56e53d9dbbbaeffb5fc578ed2f2e8\": container with ID starting with d98e983887b15fdd1bcaa596956cbc381eb56e53d9dbbbaeffb5fc578ed2f2e8 not found: ID does not exist" Feb 02 10:54:54 crc kubenswrapper[4901]: I0202 10:54:54.092905 4901 scope.go:117] "RemoveContainer" containerID="6af36ab9f589e4d8b6f00a9eac87f4e5c30cd53cc3d9a7eeeaf7f8ff91359711" Feb 02 10:54:54 crc kubenswrapper[4901]: E0202 10:54:54.093319 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6af36ab9f589e4d8b6f00a9eac87f4e5c30cd53cc3d9a7eeeaf7f8ff91359711\": container with ID starting with 6af36ab9f589e4d8b6f00a9eac87f4e5c30cd53cc3d9a7eeeaf7f8ff91359711 not found: ID does not exist" containerID="6af36ab9f589e4d8b6f00a9eac87f4e5c30cd53cc3d9a7eeeaf7f8ff91359711" Feb 02 10:54:54 crc kubenswrapper[4901]: I0202 10:54:54.093338 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6af36ab9f589e4d8b6f00a9eac87f4e5c30cd53cc3d9a7eeeaf7f8ff91359711"} err="failed to get
container status \"6af36ab9f589e4d8b6f00a9eac87f4e5c30cd53cc3d9a7eeeaf7f8ff91359711\": rpc error: code = NotFound desc = could not find container \"6af36ab9f589e4d8b6f00a9eac87f4e5c30cd53cc3d9a7eeeaf7f8ff91359711\": container with ID starting with 6af36ab9f589e4d8b6f00a9eac87f4e5c30cd53cc3d9a7eeeaf7f8ff91359711 not found: ID does not exist" Feb 02 10:54:54 crc kubenswrapper[4901]: I0202 10:54:54.093351 4901 scope.go:117] "RemoveContainer" containerID="afa28b34028940dceb40e7ea298a8cfe9baf2744c49f7c430acfea296305723c" Feb 02 10:54:54 crc kubenswrapper[4901]: E0202 10:54:54.093879 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afa28b34028940dceb40e7ea298a8cfe9baf2744c49f7c430acfea296305723c\": container with ID starting with afa28b34028940dceb40e7ea298a8cfe9baf2744c49f7c430acfea296305723c not found: ID does not exist" containerID="afa28b34028940dceb40e7ea298a8cfe9baf2744c49f7c430acfea296305723c" Feb 02 10:54:54 crc kubenswrapper[4901]: I0202 10:54:54.093899 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afa28b34028940dceb40e7ea298a8cfe9baf2744c49f7c430acfea296305723c"} err="failed to get container status \"afa28b34028940dceb40e7ea298a8cfe9baf2744c49f7c430acfea296305723c\": rpc error: code = NotFound desc = could not find container \"afa28b34028940dceb40e7ea298a8cfe9baf2744c49f7c430acfea296305723c\": container with ID starting with afa28b34028940dceb40e7ea298a8cfe9baf2744c49f7c430acfea296305723c not found: ID does not exist" Feb 02 10:54:54 crc kubenswrapper[4901]: I0202 10:54:54.289909 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5mfk4"] Feb 02 10:54:54 crc kubenswrapper[4901]: I0202 10:54:54.290262 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5mfk4" podUID="715a873e-0260-4254-86a6-203a2e08e36d" containerName="registry-server" containerID="cri-o://42b825a2fdb927d280c3ca2c5589711db380c64abea2cb7b58c3cb2a1dfbbe35" gracePeriod=2 Feb 02 10:54:54 crc kubenswrapper[4901]: I0202 10:54:54.516099 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4b4d5a91-d330-499c-9123-35b58d8c55d5-etc-swift\") pod \"swift-storage-0\" (UID: \"4b4d5a91-d330-499c-9123-35b58d8c55d5\") " pod="openstack/swift-storage-0" Feb 02 10:54:54 crc kubenswrapper[4901]: E0202 10:54:54.516284 4901 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 02 10:54:54 crc kubenswrapper[4901]: E0202 10:54:54.516308 4901 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 02 10:54:54 crc kubenswrapper[4901]: E0202 10:54:54.516364 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b4d5a91-d330-499c-9123-35b58d8c55d5-etc-swift podName:4b4d5a91-d330-499c-9123-35b58d8c55d5 nodeName:}" failed. No retries permitted until 2026-02-02 10:54:58.516349438 +0000 UTC m=+985.534689534 (durationBeforeRetry 4s). 
Feb 02 10:54:54 crc kubenswrapper[4901]: I0202 10:54:54.658888 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-tmj6n"] Feb 02 10:54:54 crc kubenswrapper[4901]: E0202 10:54:54.660001 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de467848-25a5-4ed7-b072-e11de1d42561" containerName="extract-content" Feb 02 10:54:54 crc kubenswrapper[4901]: I0202 10:54:54.660027 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="de467848-25a5-4ed7-b072-e11de1d42561" containerName="extract-content" Feb 02 10:54:54 crc kubenswrapper[4901]: E0202 10:54:54.660061 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de467848-25a5-4ed7-b072-e11de1d42561" containerName="extract-utilities" Feb 02 10:54:54 crc kubenswrapper[4901]: I0202 10:54:54.660073 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="de467848-25a5-4ed7-b072-e11de1d42561" containerName="extract-utilities" Feb 02 10:54:54 crc kubenswrapper[4901]: E0202 10:54:54.660099 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de467848-25a5-4ed7-b072-e11de1d42561" containerName="registry-server" Feb 02 10:54:54 crc kubenswrapper[4901]: I0202 10:54:54.660109 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="de467848-25a5-4ed7-b072-e11de1d42561" containerName="registry-server" Feb 02 10:54:54 crc kubenswrapper[4901]: I0202 10:54:54.660330 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="de467848-25a5-4ed7-b072-e11de1d42561" containerName="registry-server" Feb 02 10:54:54 crc kubenswrapper[4901]: I0202 10:54:54.661163 4901 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/swift-ring-rebalance-tmj6n" Feb 02 10:54:54 crc kubenswrapper[4901]: I0202 10:54:54.665013 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 02 10:54:54 crc kubenswrapper[4901]: I0202 10:54:54.665050 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Feb 02 10:54:54 crc kubenswrapper[4901]: I0202 10:54:54.669611 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Feb 02 10:54:54 crc kubenswrapper[4901]: I0202 10:54:54.685475 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-tmj6n"] Feb 02 10:54:54 crc kubenswrapper[4901]: I0202 10:54:54.721395 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/882c42d4-7650-4f0d-8973-ba9ddcbb6800-scripts\") pod \"swift-ring-rebalance-tmj6n\" (UID: \"882c42d4-7650-4f0d-8973-ba9ddcbb6800\") " pod="openstack/swift-ring-rebalance-tmj6n" Feb 02 10:54:54 crc kubenswrapper[4901]: I0202 10:54:54.721794 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/882c42d4-7650-4f0d-8973-ba9ddcbb6800-etc-swift\") pod \"swift-ring-rebalance-tmj6n\" (UID: \"882c42d4-7650-4f0d-8973-ba9ddcbb6800\") " pod="openstack/swift-ring-rebalance-tmj6n" Feb 02 10:54:54 crc kubenswrapper[4901]: I0202 10:54:54.721848 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/882c42d4-7650-4f0d-8973-ba9ddcbb6800-swiftconf\") pod \"swift-ring-rebalance-tmj6n\" (UID: \"882c42d4-7650-4f0d-8973-ba9ddcbb6800\") " pod="openstack/swift-ring-rebalance-tmj6n" Feb 02 10:54:54 crc kubenswrapper[4901]: I0202 10:54:54.721886 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/882c42d4-7650-4f0d-8973-ba9ddcbb6800-combined-ca-bundle\") pod \"swift-ring-rebalance-tmj6n\" (UID: \"882c42d4-7650-4f0d-8973-ba9ddcbb6800\") " pod="openstack/swift-ring-rebalance-tmj6n" Feb 02 10:54:54 crc kubenswrapper[4901]: I0202 10:54:54.721920 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpj7f\" (UniqueName: \"kubernetes.io/projected/882c42d4-7650-4f0d-8973-ba9ddcbb6800-kube-api-access-qpj7f\") pod \"swift-ring-rebalance-tmj6n\" (UID: \"882c42d4-7650-4f0d-8973-ba9ddcbb6800\") " pod="openstack/swift-ring-rebalance-tmj6n" Feb 02 10:54:54 crc kubenswrapper[4901]: I0202 10:54:54.721994 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/882c42d4-7650-4f0d-8973-ba9ddcbb6800-ring-data-devices\") pod \"swift-ring-rebalance-tmj6n\" (UID: \"882c42d4-7650-4f0d-8973-ba9ddcbb6800\") " pod="openstack/swift-ring-rebalance-tmj6n" Feb 02 10:54:54 crc kubenswrapper[4901]: I0202 10:54:54.722023 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/882c42d4-7650-4f0d-8973-ba9ddcbb6800-dispersionconf\") pod \"swift-ring-rebalance-tmj6n\" (UID: \"882c42d4-7650-4f0d-8973-ba9ddcbb6800\") " pod="openstack/swift-ring-rebalance-tmj6n" Feb 02 
10:54:54 crc kubenswrapper[4901]: I0202 10:54:54.823945 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/882c42d4-7650-4f0d-8973-ba9ddcbb6800-swiftconf\") pod \"swift-ring-rebalance-tmj6n\" (UID: \"882c42d4-7650-4f0d-8973-ba9ddcbb6800\") " pod="openstack/swift-ring-rebalance-tmj6n" Feb 02 10:54:54 crc kubenswrapper[4901]: I0202 10:54:54.824035 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/882c42d4-7650-4f0d-8973-ba9ddcbb6800-combined-ca-bundle\") pod \"swift-ring-rebalance-tmj6n\" (UID: \"882c42d4-7650-4f0d-8973-ba9ddcbb6800\") " pod="openstack/swift-ring-rebalance-tmj6n" Feb 02 10:54:54 crc kubenswrapper[4901]: I0202 10:54:54.824073 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpj7f\" (UniqueName: \"kubernetes.io/projected/882c42d4-7650-4f0d-8973-ba9ddcbb6800-kube-api-access-qpj7f\") pod \"swift-ring-rebalance-tmj6n\" (UID: \"882c42d4-7650-4f0d-8973-ba9ddcbb6800\") " pod="openstack/swift-ring-rebalance-tmj6n" Feb 02 10:54:54 crc kubenswrapper[4901]: I0202 10:54:54.824118 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/882c42d4-7650-4f0d-8973-ba9ddcbb6800-ring-data-devices\") pod \"swift-ring-rebalance-tmj6n\" (UID: \"882c42d4-7650-4f0d-8973-ba9ddcbb6800\") " pod="openstack/swift-ring-rebalance-tmj6n" Feb 02 10:54:54 crc kubenswrapper[4901]: I0202 10:54:54.824151 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/882c42d4-7650-4f0d-8973-ba9ddcbb6800-dispersionconf\") pod \"swift-ring-rebalance-tmj6n\" (UID: \"882c42d4-7650-4f0d-8973-ba9ddcbb6800\") " pod="openstack/swift-ring-rebalance-tmj6n" Feb 02 10:54:54 crc kubenswrapper[4901]: I0202 10:54:54.824256 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/882c42d4-7650-4f0d-8973-ba9ddcbb6800-scripts\") pod \"swift-ring-rebalance-tmj6n\" (UID: \"882c42d4-7650-4f0d-8973-ba9ddcbb6800\") " pod="openstack/swift-ring-rebalance-tmj6n" Feb 02 10:54:54 crc kubenswrapper[4901]: I0202 10:54:54.824336 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/882c42d4-7650-4f0d-8973-ba9ddcbb6800-etc-swift\") pod \"swift-ring-rebalance-tmj6n\" (UID: \"882c42d4-7650-4f0d-8973-ba9ddcbb6800\") " pod="openstack/swift-ring-rebalance-tmj6n" Feb 02 10:54:54 crc kubenswrapper[4901]: I0202 10:54:54.824988 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/882c42d4-7650-4f0d-8973-ba9ddcbb6800-etc-swift\") pod \"swift-ring-rebalance-tmj6n\" (UID: \"882c42d4-7650-4f0d-8973-ba9ddcbb6800\") " pod="openstack/swift-ring-rebalance-tmj6n" Feb 02 10:54:54 crc kubenswrapper[4901]: I0202 10:54:54.825960 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/882c42d4-7650-4f0d-8973-ba9ddcbb6800-scripts\") pod \"swift-ring-rebalance-tmj6n\" (UID: \"882c42d4-7650-4f0d-8973-ba9ddcbb6800\") " pod="openstack/swift-ring-rebalance-tmj6n" Feb 02 10:54:54 crc kubenswrapper[4901]: I0202 10:54:54.826203 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/882c42d4-7650-4f0d-8973-ba9ddcbb6800-ring-data-devices\") pod \"swift-ring-rebalance-tmj6n\" (UID: \"882c42d4-7650-4f0d-8973-ba9ddcbb6800\") " pod="openstack/swift-ring-rebalance-tmj6n" Feb 02 10:54:54 crc kubenswrapper[4901]: I0202 10:54:54.831643 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/882c42d4-7650-4f0d-8973-ba9ddcbb6800-swiftconf\") pod \"swift-ring-rebalance-tmj6n\" (UID: \"882c42d4-7650-4f0d-8973-ba9ddcbb6800\") " pod="openstack/swift-ring-rebalance-tmj6n" Feb 02 10:54:54 crc kubenswrapper[4901]: I0202 10:54:54.832490 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/882c42d4-7650-4f0d-8973-ba9ddcbb6800-dispersionconf\") pod \"swift-ring-rebalance-tmj6n\" (UID: \"882c42d4-7650-4f0d-8973-ba9ddcbb6800\") " pod="openstack/swift-ring-rebalance-tmj6n" Feb 02 10:54:54 crc kubenswrapper[4901]: I0202 10:54:54.837755 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/882c42d4-7650-4f0d-8973-ba9ddcbb6800-combined-ca-bundle\") pod \"swift-ring-rebalance-tmj6n\" (UID: \"882c42d4-7650-4f0d-8973-ba9ddcbb6800\") " pod="openstack/swift-ring-rebalance-tmj6n" Feb 02 10:54:54 crc kubenswrapper[4901]: I0202 10:54:54.846207 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpj7f\" (UniqueName: \"kubernetes.io/projected/882c42d4-7650-4f0d-8973-ba9ddcbb6800-kube-api-access-qpj7f\") pod \"swift-ring-rebalance-tmj6n\" (UID: \"882c42d4-7650-4f0d-8973-ba9ddcbb6800\") " pod="openstack/swift-ring-rebalance-tmj6n" Feb 02 10:54:54 crc kubenswrapper[4901]: I0202 10:54:54.978791 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-tmj6n" Feb 02 10:54:55 crc kubenswrapper[4901]: I0202 10:54:55.132628 4901 generic.go:334] "Generic (PLEG): container finished" podID="715a873e-0260-4254-86a6-203a2e08e36d" containerID="42b825a2fdb927d280c3ca2c5589711db380c64abea2cb7b58c3cb2a1dfbbe35" exitCode=0 Feb 02 10:54:55 crc kubenswrapper[4901]: I0202 10:54:55.133024 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5mfk4" event={"ID":"715a873e-0260-4254-86a6-203a2e08e36d","Type":"ContainerDied","Data":"42b825a2fdb927d280c3ca2c5589711db380c64abea2cb7b58c3cb2a1dfbbe35"} Feb 02 10:54:55 crc kubenswrapper[4901]: I0202 10:54:55.514209 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-tmj6n"] Feb 02 10:54:55 crc kubenswrapper[4901]: W0202 10:54:55.520304 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod882c42d4_7650_4f0d_8973_ba9ddcbb6800.slice/crio-9941ff3d68a017669b7b11286f9f5f429e2159e53acc3472bfbdfc38339a4676 WatchSource:0}: Error finding container 9941ff3d68a017669b7b11286f9f5f429e2159e53acc3472bfbdfc38339a4676: Status 404 returned error can't find the container with id 9941ff3d68a017669b7b11286f9f5f429e2159e53acc3472bfbdfc38339a4676 Feb 02 10:54:55 crc kubenswrapper[4901]: I0202 10:54:55.925237 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5mfk4" Feb 02 10:54:55 crc kubenswrapper[4901]: I0202 10:54:55.960440 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/715a873e-0260-4254-86a6-203a2e08e36d-utilities\") pod \"715a873e-0260-4254-86a6-203a2e08e36d\" (UID: \"715a873e-0260-4254-86a6-203a2e08e36d\") " Feb 02 10:54:55 crc kubenswrapper[4901]: I0202 10:54:55.960536 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/715a873e-0260-4254-86a6-203a2e08e36d-catalog-content\") pod \"715a873e-0260-4254-86a6-203a2e08e36d\" (UID: \"715a873e-0260-4254-86a6-203a2e08e36d\") " Feb 02 10:54:55 crc kubenswrapper[4901]: I0202 10:54:55.960623 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-247cv\" (UniqueName: \"kubernetes.io/projected/715a873e-0260-4254-86a6-203a2e08e36d-kube-api-access-247cv\") pod \"715a873e-0260-4254-86a6-203a2e08e36d\" (UID: \"715a873e-0260-4254-86a6-203a2e08e36d\") " Feb 02 10:54:55 crc kubenswrapper[4901]: I0202 10:54:55.961536 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/715a873e-0260-4254-86a6-203a2e08e36d-utilities" (OuterVolumeSpecName: "utilities") pod "715a873e-0260-4254-86a6-203a2e08e36d" (UID: "715a873e-0260-4254-86a6-203a2e08e36d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:54:55 crc kubenswrapper[4901]: I0202 10:54:55.968316 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/715a873e-0260-4254-86a6-203a2e08e36d-kube-api-access-247cv" (OuterVolumeSpecName: "kube-api-access-247cv") pod "715a873e-0260-4254-86a6-203a2e08e36d" (UID: "715a873e-0260-4254-86a6-203a2e08e36d"). InnerVolumeSpecName "kube-api-access-247cv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:56 crc kubenswrapper[4901]: I0202 10:54:56.062806 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-247cv\" (UniqueName: \"kubernetes.io/projected/715a873e-0260-4254-86a6-203a2e08e36d-kube-api-access-247cv\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:56 crc kubenswrapper[4901]: I0202 10:54:56.062839 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/715a873e-0260-4254-86a6-203a2e08e36d-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:56 crc kubenswrapper[4901]: I0202 10:54:56.091498 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/715a873e-0260-4254-86a6-203a2e08e36d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "715a873e-0260-4254-86a6-203a2e08e36d" (UID: "715a873e-0260-4254-86a6-203a2e08e36d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:54:56 crc kubenswrapper[4901]: I0202 10:54:56.145450 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-tmj6n" event={"ID":"882c42d4-7650-4f0d-8973-ba9ddcbb6800","Type":"ContainerStarted","Data":"9941ff3d68a017669b7b11286f9f5f429e2159e53acc3472bfbdfc38339a4676"} Feb 02 10:54:56 crc kubenswrapper[4901]: I0202 10:54:56.148797 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5mfk4" event={"ID":"715a873e-0260-4254-86a6-203a2e08e36d","Type":"ContainerDied","Data":"5a2b7a98bc01d8c201dab8c8f8967d9ecee61c7760bc309116ce45482347cec7"} Feb 02 10:54:56 crc kubenswrapper[4901]: I0202 10:54:56.148830 4901 scope.go:117] "RemoveContainer" containerID="42b825a2fdb927d280c3ca2c5589711db380c64abea2cb7b58c3cb2a1dfbbe35" Feb 02 10:54:56 crc kubenswrapper[4901]: I0202 10:54:56.148924 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5mfk4" Feb 02 10:54:56 crc kubenswrapper[4901]: I0202 10:54:56.161832 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 02 10:54:56 crc kubenswrapper[4901]: I0202 10:54:56.161888 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 02 10:54:56 crc kubenswrapper[4901]: I0202 10:54:56.164507 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/715a873e-0260-4254-86a6-203a2e08e36d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:56 crc kubenswrapper[4901]: I0202 10:54:56.184847 4901 scope.go:117] "RemoveContainer" containerID="44319e661ca4e5ae8790f27189efdb3e72aaf71087648ce53b30d9e8aabd905b" Feb 02 10:54:56 crc kubenswrapper[4901]: I0202 10:54:56.187129 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5mfk4"] Feb 02 10:54:56 crc kubenswrapper[4901]: I0202 10:54:56.194780 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5mfk4"] Feb 02 10:54:56 crc kubenswrapper[4901]: I0202 10:54:56.218776 4901 scope.go:117] "RemoveContainer" containerID="68fd3de3c14e0fa8fb682b22d2d7a2d89c07ff9e40187faa2e1fe0f3525bee11" Feb 02 10:54:56 crc kubenswrapper[4901]: I0202 10:54:56.264915 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-zt89s"] Feb 02 10:54:56 crc kubenswrapper[4901]: E0202 10:54:56.265427 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="715a873e-0260-4254-86a6-203a2e08e36d" containerName="extract-utilities" Feb 02 10:54:56 crc kubenswrapper[4901]: I0202 10:54:56.265443 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="715a873e-0260-4254-86a6-203a2e08e36d" containerName="extract-utilities" Feb 02 10:54:56 crc kubenswrapper[4901]: E0202 10:54:56.265456 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="715a873e-0260-4254-86a6-203a2e08e36d" containerName="registry-server" Feb 02 10:54:56 crc kubenswrapper[4901]: I0202 10:54:56.265463 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="715a873e-0260-4254-86a6-203a2e08e36d" containerName="registry-server" Feb 02 10:54:56 crc kubenswrapper[4901]: E0202 10:54:56.265484 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="715a873e-0260-4254-86a6-203a2e08e36d" containerName="extract-content" Feb 02 10:54:56 
crc kubenswrapper[4901]: I0202 10:54:56.265491 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="715a873e-0260-4254-86a6-203a2e08e36d" containerName="extract-content" Feb 02 10:54:56 crc kubenswrapper[4901]: I0202 10:54:56.265745 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="715a873e-0260-4254-86a6-203a2e08e36d" containerName="registry-server" Feb 02 10:54:56 crc kubenswrapper[4901]: I0202 10:54:56.266419 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-zt89s" Feb 02 10:54:56 crc kubenswrapper[4901]: I0202 10:54:56.269953 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 02 10:54:56 crc kubenswrapper[4901]: I0202 10:54:56.299703 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-zt89s"] Feb 02 10:54:56 crc kubenswrapper[4901]: I0202 10:54:56.369371 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmcln\" (UniqueName: \"kubernetes.io/projected/62cb28c0-75ca-46e8-ad3d-3f282a4d632a-kube-api-access-tmcln\") pod \"root-account-create-update-zt89s\" (UID: \"62cb28c0-75ca-46e8-ad3d-3f282a4d632a\") " pod="openstack/root-account-create-update-zt89s" Feb 02 10:54:56 crc kubenswrapper[4901]: I0202 10:54:56.369975 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62cb28c0-75ca-46e8-ad3d-3f282a4d632a-operator-scripts\") pod \"root-account-create-update-zt89s\" (UID: \"62cb28c0-75ca-46e8-ad3d-3f282a4d632a\") " pod="openstack/root-account-create-update-zt89s" Feb 02 10:54:56 crc kubenswrapper[4901]: I0202 10:54:56.471732 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62cb28c0-75ca-46e8-ad3d-3f282a4d632a-operator-scripts\") pod \"root-account-create-update-zt89s\" (UID: \"62cb28c0-75ca-46e8-ad3d-3f282a4d632a\") " pod="openstack/root-account-create-update-zt89s" Feb 02 10:54:56 crc kubenswrapper[4901]: I0202 10:54:56.472212 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmcln\" (UniqueName: \"kubernetes.io/projected/62cb28c0-75ca-46e8-ad3d-3f282a4d632a-kube-api-access-tmcln\") pod \"root-account-create-update-zt89s\" (UID: \"62cb28c0-75ca-46e8-ad3d-3f282a4d632a\") " pod="openstack/root-account-create-update-zt89s" Feb 02 10:54:56 crc kubenswrapper[4901]: I0202 10:54:56.472532 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62cb28c0-75ca-46e8-ad3d-3f282a4d632a-operator-scripts\") pod \"root-account-create-update-zt89s\" (UID: \"62cb28c0-75ca-46e8-ad3d-3f282a4d632a\") " pod="openstack/root-account-create-update-zt89s" Feb 02 10:54:56 crc kubenswrapper[4901]: I0202 10:54:56.495276 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmcln\" (UniqueName: \"kubernetes.io/projected/62cb28c0-75ca-46e8-ad3d-3f282a4d632a-kube-api-access-tmcln\") pod \"root-account-create-update-zt89s\" (UID: \"62cb28c0-75ca-46e8-ad3d-3f282a4d632a\") " pod="openstack/root-account-create-update-zt89s" Feb 02 10:54:56 crc kubenswrapper[4901]: I0202 10:54:56.624492 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-zt89s" Feb 02 10:54:57 crc kubenswrapper[4901]: I0202 10:54:57.114106 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-zt89s"] Feb 02 10:54:57 crc kubenswrapper[4901]: I0202 10:54:57.706146 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="715a873e-0260-4254-86a6-203a2e08e36d" path="/var/lib/kubelet/pods/715a873e-0260-4254-86a6-203a2e08e36d/volumes" Feb 02 10:54:58 crc kubenswrapper[4901]: I0202 10:54:58.174935 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zt89s" event={"ID":"62cb28c0-75ca-46e8-ad3d-3f282a4d632a","Type":"ContainerStarted","Data":"75864deb48852f41d5b9735292d4ba2b3e34564e42a8c35945078f13e04c885d"} Feb 02 10:54:58 crc kubenswrapper[4901]: I0202 10:54:58.542021 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 02 10:54:58 crc kubenswrapper[4901]: I0202 10:54:58.618145 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4b4d5a91-d330-499c-9123-35b58d8c55d5-etc-swift\") pod \"swift-storage-0\" (UID: \"4b4d5a91-d330-499c-9123-35b58d8c55d5\") " pod="openstack/swift-storage-0" Feb 02 10:54:58 crc kubenswrapper[4901]: E0202 10:54:58.618800 4901 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 02 10:54:58 crc kubenswrapper[4901]: E0202 10:54:58.618829 4901 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 02 10:54:58 crc kubenswrapper[4901]: E0202 10:54:58.618888 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b4d5a91-d330-499c-9123-35b58d8c55d5-etc-swift podName:4b4d5a91-d330-499c-9123-35b58d8c55d5 nodeName:}" failed. No retries permitted until 2026-02-02 10:55:06.618868876 +0000 UTC m=+993.637208982 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4b4d5a91-d330-499c-9123-35b58d8c55d5-etc-swift") pod "swift-storage-0" (UID: "4b4d5a91-d330-499c-9123-35b58d8c55d5") : configmap "swift-ring-files" not found Feb 02 10:54:58 crc kubenswrapper[4901]: I0202 10:54:58.624368 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 02 10:55:00 crc kubenswrapper[4901]: I0202 10:55:00.010787 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-vj5hj" Feb 02 10:55:00 crc kubenswrapper[4901]: I0202 10:55:00.076192 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-twwlw"] Feb 02 10:55:00 crc kubenswrapper[4901]: I0202 10:55:00.076536 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-twwlw" podUID="5d5168b9-9cea-4a4c-9a44-7fd4779bf302" containerName="dnsmasq-dns" containerID="cri-o://0a4e39e8ccab43a0509b387636de5026bfba1b5c43e74f1ac7a3dbb503573c6e" gracePeriod=10 Feb 02 10:55:00 crc kubenswrapper[4901]: I0202 10:55:00.197067 4901 generic.go:334] "Generic (PLEG): container finished" podID="62cb28c0-75ca-46e8-ad3d-3f282a4d632a" containerID="28a838a1a7c1be9d154c5fe909e3e315eec15bada39c52b38f07fc6bf2145944" exitCode=0 Feb 02 10:55:00 crc kubenswrapper[4901]: I0202 10:55:00.197116 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zt89s" event={"ID":"62cb28c0-75ca-46e8-ad3d-3f282a4d632a","Type":"ContainerDied","Data":"28a838a1a7c1be9d154c5fe909e3e315eec15bada39c52b38f07fc6bf2145944"} Feb 02 10:55:00 crc kubenswrapper[4901]: I0202 10:55:00.198674 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-tmj6n" event={"ID":"882c42d4-7650-4f0d-8973-ba9ddcbb6800","Type":"ContainerStarted","Data":"76d1fee44c176ee03708ec12bea996a20fee6321cf644aa693b54ba0f3089b4e"} Feb 02 10:55:00 crc kubenswrapper[4901]: I0202 10:55:00.275506 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-tmj6n" podStartSLOduration=2.6671272889999997 podStartE2EDuration="6.275487977s" podCreationTimestamp="2026-02-02 10:54:54 +0000 UTC" firstStartedPulling="2026-02-02 10:54:55.522414832 +0000 UTC m=+982.540754948" lastFinishedPulling="2026-02-02 10:54:59.13077552 +0000 UTC m=+986.149115636" observedRunningTime="2026-02-02 10:55:00.271134769 +0000 UTC m=+987.289474885" watchObservedRunningTime="2026-02-02 10:55:00.275487977 +0000 UTC m=+987.293828073" Feb 02 10:55:00 crc kubenswrapper[4901]: I0202 10:55:00.630252 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-twwlw"
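The etc-swift failure above also shows kubelet's retry policy for volume operations: nestedpendingoperations keeps a per-volume exponential backoff, so the "No retries permitted until ..." window doubles after each consecutive failure; it is 8s here, and when the same mount fails again at 10:55:06 further down it has grown to 16s. A minimal sketch of that schedule, assuming the upstream defaults of a 500ms initial delay, a factor of 2, and a cap of roughly 2m2s (the helper names are illustrative, not kubelet source):

```go
package main

import (
	"fmt"
	"time"
)

// Doubling retry schedule for a failing volume operation. The constants
// are assumptions modeled on kubelet's goroutinemap/exponentialbackoff
// defaults (500ms initial, factor 2, cap of 2m2s), not quoted source.
const (
	initialDelay = 500 * time.Millisecond
	maxDelay     = 2*time.Minute + 2*time.Second
)

func nextDelay(current time.Duration) time.Duration {
	if current == 0 {
		return initialDelay // first failure
	}
	if doubled := current * 2; doubled < maxDelay {
		return doubled
	}
	return maxDelay
}

func main() {
	// Replay the windows a repeatedly failing MountVolume.SetUp would see;
	// the 8s and 16s steps match the two etc-swift entries in this log.
	var d time.Duration
	for attempt := 1; attempt <= 10; attempt++ {
		d = nextDelay(d)
		fmt.Printf("failure %2d: no retries permitted for %v\n", attempt, d)
	}
}
```

The deadline is not a blocking sleep: the reconciler simply skips the volume on each pass until the window expires, which is why the identical configmap "swift-ring-files" not found error keeps reappearing until the ring files exist, presumably once the swift-ring-rebalance job seen starting above publishes them.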
Feb 02 10:55:00 crc kubenswrapper[4901]: I0202 10:55:00.657836 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d5168b9-9cea-4a4c-9a44-7fd4779bf302-dns-svc\") pod \"5d5168b9-9cea-4a4c-9a44-7fd4779bf302\" (UID: \"5d5168b9-9cea-4a4c-9a44-7fd4779bf302\") " Feb 02 10:55:00 crc kubenswrapper[4901]: I0202 10:55:00.657973 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d5168b9-9cea-4a4c-9a44-7fd4779bf302-config\") pod \"5d5168b9-9cea-4a4c-9a44-7fd4779bf302\" (UID: \"5d5168b9-9cea-4a4c-9a44-7fd4779bf302\") " Feb 02 10:55:00 crc kubenswrapper[4901]: I0202 10:55:00.658006 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5d5168b9-9cea-4a4c-9a44-7fd4779bf302-ovsdbserver-sb\") pod \"5d5168b9-9cea-4a4c-9a44-7fd4779bf302\" (UID: \"5d5168b9-9cea-4a4c-9a44-7fd4779bf302\") " Feb 02 10:55:00 crc kubenswrapper[4901]: I0202 10:55:00.658050 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7xrz\" (UniqueName: \"kubernetes.io/projected/5d5168b9-9cea-4a4c-9a44-7fd4779bf302-kube-api-access-k7xrz\") pod \"5d5168b9-9cea-4a4c-9a44-7fd4779bf302\" (UID: \"5d5168b9-9cea-4a4c-9a44-7fd4779bf302\") " Feb 02 10:55:00 crc kubenswrapper[4901]: I0202 10:55:00.658118 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5d5168b9-9cea-4a4c-9a44-7fd4779bf302-ovsdbserver-nb\") pod \"5d5168b9-9cea-4a4c-9a44-7fd4779bf302\" (UID: \"5d5168b9-9cea-4a4c-9a44-7fd4779bf302\") " Feb 02 10:55:00 crc kubenswrapper[4901]: I0202 10:55:00.684958 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d5168b9-9cea-4a4c-9a44-7fd4779bf302-kube-api-access-k7xrz" (OuterVolumeSpecName: "kube-api-access-k7xrz") pod "5d5168b9-9cea-4a4c-9a44-7fd4779bf302" (UID: "5d5168b9-9cea-4a4c-9a44-7fd4779bf302"). InnerVolumeSpecName "kube-api-access-k7xrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:55:00 crc kubenswrapper[4901]: I0202 10:55:00.719054 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d5168b9-9cea-4a4c-9a44-7fd4779bf302-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5d5168b9-9cea-4a4c-9a44-7fd4779bf302" (UID: "5d5168b9-9cea-4a4c-9a44-7fd4779bf302"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:55:00 crc kubenswrapper[4901]: I0202 10:55:00.723512 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d5168b9-9cea-4a4c-9a44-7fd4779bf302-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5d5168b9-9cea-4a4c-9a44-7fd4779bf302" (UID: "5d5168b9-9cea-4a4c-9a44-7fd4779bf302"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:55:00 crc kubenswrapper[4901]: I0202 10:55:00.724704 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d5168b9-9cea-4a4c-9a44-7fd4779bf302-config" (OuterVolumeSpecName: "config") pod "5d5168b9-9cea-4a4c-9a44-7fd4779bf302" (UID: "5d5168b9-9cea-4a4c-9a44-7fd4779bf302"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:55:00 crc kubenswrapper[4901]: I0202 10:55:00.738225 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d5168b9-9cea-4a4c-9a44-7fd4779bf302-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5d5168b9-9cea-4a4c-9a44-7fd4779bf302" (UID: "5d5168b9-9cea-4a4c-9a44-7fd4779bf302"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:55:00 crc kubenswrapper[4901]: I0202 10:55:00.760119 4901 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d5168b9-9cea-4a4c-9a44-7fd4779bf302-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:00 crc kubenswrapper[4901]: I0202 10:55:00.760164 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d5168b9-9cea-4a4c-9a44-7fd4779bf302-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:00 crc kubenswrapper[4901]: I0202 10:55:00.760173 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5d5168b9-9cea-4a4c-9a44-7fd4779bf302-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:00 crc kubenswrapper[4901]: I0202 10:55:00.760184 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7xrz\" (UniqueName: \"kubernetes.io/projected/5d5168b9-9cea-4a4c-9a44-7fd4779bf302-kube-api-access-k7xrz\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:00 crc kubenswrapper[4901]: I0202 10:55:00.760194 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5d5168b9-9cea-4a4c-9a44-7fd4779bf302-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:01 crc kubenswrapper[4901]: I0202 10:55:01.212575 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-twwlw" Feb 02 10:55:01 crc kubenswrapper[4901]: I0202 10:55:01.212635 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-twwlw" event={"ID":"5d5168b9-9cea-4a4c-9a44-7fd4779bf302","Type":"ContainerDied","Data":"0a4e39e8ccab43a0509b387636de5026bfba1b5c43e74f1ac7a3dbb503573c6e"} Feb 02 10:55:01 crc kubenswrapper[4901]: I0202 10:55:01.212719 4901 scope.go:117] "RemoveContainer" containerID="0a4e39e8ccab43a0509b387636de5026bfba1b5c43e74f1ac7a3dbb503573c6e" Feb 02 10:55:01 crc kubenswrapper[4901]: I0202 10:55:01.212641 4901 generic.go:334] "Generic (PLEG): container finished" podID="5d5168b9-9cea-4a4c-9a44-7fd4779bf302" containerID="0a4e39e8ccab43a0509b387636de5026bfba1b5c43e74f1ac7a3dbb503573c6e" exitCode=0 Feb 02 10:55:01 crc kubenswrapper[4901]: I0202 10:55:01.213036 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-twwlw" event={"ID":"5d5168b9-9cea-4a4c-9a44-7fd4779bf302","Type":"ContainerDied","Data":"b2fcb43428d3c789d2f9a8189a0d1e41b067a3783779a075451b2e161cfe6431"} Feb 02 10:55:01 crc kubenswrapper[4901]: I0202 10:55:01.256807 4901 scope.go:117] "RemoveContainer" containerID="009305dbe20d978483def2fc5037aa7fb36ea2bda9933a0734aac44f5905ae82" Feb 02 10:55:01 crc kubenswrapper[4901]: I0202 10:55:01.262331 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-twwlw"] Feb 02 10:55:01 crc kubenswrapper[4901]: I0202 10:55:01.269026 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-twwlw"] Feb 02 10:55:01 crc kubenswrapper[4901]: I0202 10:55:01.286779 4901 scope.go:117] "RemoveContainer" containerID="0a4e39e8ccab43a0509b387636de5026bfba1b5c43e74f1ac7a3dbb503573c6e" Feb 02 10:55:01 crc kubenswrapper[4901]: E0202 10:55:01.287659 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a4e39e8ccab43a0509b387636de5026bfba1b5c43e74f1ac7a3dbb503573c6e\": container with ID starting with 0a4e39e8ccab43a0509b387636de5026bfba1b5c43e74f1ac7a3dbb503573c6e not found: ID does not exist" containerID="0a4e39e8ccab43a0509b387636de5026bfba1b5c43e74f1ac7a3dbb503573c6e" Feb 02 10:55:01 crc kubenswrapper[4901]: I0202 10:55:01.287696 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a4e39e8ccab43a0509b387636de5026bfba1b5c43e74f1ac7a3dbb503573c6e"} err="failed to get container status \"0a4e39e8ccab43a0509b387636de5026bfba1b5c43e74f1ac7a3dbb503573c6e\": rpc error: code = NotFound desc = could not find container \"0a4e39e8ccab43a0509b387636de5026bfba1b5c43e74f1ac7a3dbb503573c6e\": container with ID starting with 0a4e39e8ccab43a0509b387636de5026bfba1b5c43e74f1ac7a3dbb503573c6e not found: ID does not exist" Feb 02 10:55:01 crc kubenswrapper[4901]: I0202 10:55:01.287720 4901 scope.go:117] "RemoveContainer" containerID="009305dbe20d978483def2fc5037aa7fb36ea2bda9933a0734aac44f5905ae82" Feb 02 10:55:01 crc kubenswrapper[4901]: E0202 10:55:01.288041 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"009305dbe20d978483def2fc5037aa7fb36ea2bda9933a0734aac44f5905ae82\": container with ID starting with 009305dbe20d978483def2fc5037aa7fb36ea2bda9933a0734aac44f5905ae82 not found: ID does not exist" containerID="009305dbe20d978483def2fc5037aa7fb36ea2bda9933a0734aac44f5905ae82" Feb 02 10:55:01 crc kubenswrapper[4901]: I0202 10:55:01.288101 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"009305dbe20d978483def2fc5037aa7fb36ea2bda9933a0734aac44f5905ae82"} err="failed to get container status \"009305dbe20d978483def2fc5037aa7fb36ea2bda9933a0734aac44f5905ae82\": rpc error: code = NotFound desc = could not find container \"009305dbe20d978483def2fc5037aa7fb36ea2bda9933a0734aac44f5905ae82\": container with ID starting with 009305dbe20d978483def2fc5037aa7fb36ea2bda9933a0734aac44f5905ae82 not found: ID does not exist"
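The RemoveContainer / NotFound exchange above is the kubelet's container cleanup behaving idempotently: it asks the runtime for the container's status before deleting, CRI-O answers gRPC NotFound because the dnsmasq containers vanished together with their sandbox, and the kubelet logs the error and moves on rather than failing the teardown. A sketch of that pattern, assuming the k8s.io/cri-api v1 client; the helper below is illustrative, not kubelet code:

```go
// Package criutil sketches an idempotent container delete: a NotFound
// answer from the runtime means the container is already gone, which is
// the desired end state, so it is not treated as a failure.
package criutil

import (
	"context"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

// RemoveIfPresent is a hypothetical helper mirroring the log's sequence:
// query the container status first, then remove it.
func RemoveIfPresent(ctx context.Context, rt runtimeapi.RuntimeServiceClient, id string) error {
	if _, err := rt.ContainerStatus(ctx, &runtimeapi.ContainerStatusRequest{ContainerId: id}); err != nil {
		if status.Code(err) == codes.NotFound {
			return nil // already gone: nothing to do, not an error
		}
		return err
	}
	_, err := rt.RemoveContainer(ctx, &runtimeapi.RemoveContainerRequest{ContainerId: id})
	return err
}
```

Treating "absent" as "already done" is what lets the SyncLoop DELETE/REMOVE sequence above finish cleanly even though the containers had already been removed by the time the second RemoveContainer pass ran.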
Feb 02 10:55:01 crc kubenswrapper[4901]: I0202 10:55:01.595069 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-zt89s" Feb 02 10:55:01 crc kubenswrapper[4901]: I0202 10:55:01.684052 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmcln\" (UniqueName: \"kubernetes.io/projected/62cb28c0-75ca-46e8-ad3d-3f282a4d632a-kube-api-access-tmcln\") pod \"62cb28c0-75ca-46e8-ad3d-3f282a4d632a\" (UID: \"62cb28c0-75ca-46e8-ad3d-3f282a4d632a\") " Feb 02 10:55:01 crc kubenswrapper[4901]: I0202 10:55:01.684297 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62cb28c0-75ca-46e8-ad3d-3f282a4d632a-operator-scripts\") pod \"62cb28c0-75ca-46e8-ad3d-3f282a4d632a\" (UID: \"62cb28c0-75ca-46e8-ad3d-3f282a4d632a\") " Feb 02 10:55:01 crc kubenswrapper[4901]: I0202 10:55:01.686078 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62cb28c0-75ca-46e8-ad3d-3f282a4d632a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "62cb28c0-75ca-46e8-ad3d-3f282a4d632a" (UID: "62cb28c0-75ca-46e8-ad3d-3f282a4d632a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:55:01 crc kubenswrapper[4901]: I0202 10:55:01.690524 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d5168b9-9cea-4a4c-9a44-7fd4779bf302" path="/var/lib/kubelet/pods/5d5168b9-9cea-4a4c-9a44-7fd4779bf302/volumes" Feb 02 10:55:01 crc kubenswrapper[4901]: I0202 10:55:01.691982 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62cb28c0-75ca-46e8-ad3d-3f282a4d632a-kube-api-access-tmcln" (OuterVolumeSpecName: "kube-api-access-tmcln") pod "62cb28c0-75ca-46e8-ad3d-3f282a4d632a" (UID: "62cb28c0-75ca-46e8-ad3d-3f282a4d632a"). InnerVolumeSpecName "kube-api-access-tmcln". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:55:01 crc kubenswrapper[4901]: I0202 10:55:01.787590 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmcln\" (UniqueName: \"kubernetes.io/projected/62cb28c0-75ca-46e8-ad3d-3f282a4d632a-kube-api-access-tmcln\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:01 crc kubenswrapper[4901]: I0202 10:55:01.787628 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62cb28c0-75ca-46e8-ad3d-3f282a4d632a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:02 crc kubenswrapper[4901]: I0202 10:55:02.223712 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zt89s" event={"ID":"62cb28c0-75ca-46e8-ad3d-3f282a4d632a","Type":"ContainerDied","Data":"75864deb48852f41d5b9735292d4ba2b3e34564e42a8c35945078f13e04c885d"} Feb 02 10:55:02 crc kubenswrapper[4901]: I0202 10:55:02.224077 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75864deb48852f41d5b9735292d4ba2b3e34564e42a8c35945078f13e04c885d" Feb 02 10:55:02 crc kubenswrapper[4901]: I0202 10:55:02.224155 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-zt89s" Feb 02 10:55:03 crc kubenswrapper[4901]: I0202 10:55:03.165855 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-6xc42"] Feb 02 10:55:03 crc kubenswrapper[4901]: E0202 10:55:03.166259 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d5168b9-9cea-4a4c-9a44-7fd4779bf302" containerName="init" Feb 02 10:55:03 crc kubenswrapper[4901]: I0202 10:55:03.166276 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d5168b9-9cea-4a4c-9a44-7fd4779bf302" containerName="init" Feb 02 10:55:03 crc kubenswrapper[4901]: E0202 10:55:03.166316 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d5168b9-9cea-4a4c-9a44-7fd4779bf302" containerName="dnsmasq-dns" Feb 02 10:55:03 crc kubenswrapper[4901]: I0202 10:55:03.166328 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d5168b9-9cea-4a4c-9a44-7fd4779bf302" containerName="dnsmasq-dns" Feb 02 10:55:03 crc kubenswrapper[4901]: E0202 10:55:03.166348 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62cb28c0-75ca-46e8-ad3d-3f282a4d632a" containerName="mariadb-account-create-update" Feb 02 10:55:03 crc kubenswrapper[4901]: I0202 10:55:03.166357 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="62cb28c0-75ca-46e8-ad3d-3f282a4d632a" containerName="mariadb-account-create-update" Feb 02 10:55:03 crc kubenswrapper[4901]: I0202 10:55:03.166583 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="62cb28c0-75ca-46e8-ad3d-3f282a4d632a" containerName="mariadb-account-create-update" Feb 02 10:55:03 crc kubenswrapper[4901]: I0202 10:55:03.166616 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d5168b9-9cea-4a4c-9a44-7fd4779bf302" containerName="dnsmasq-dns" Feb 02 10:55:03 crc kubenswrapper[4901]: I0202 10:55:03.167318 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-6xc42" Feb 02 10:55:03 crc kubenswrapper[4901]: I0202 10:55:03.181411 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-6xc42"] Feb 02 10:55:03 crc kubenswrapper[4901]: I0202 10:55:03.215349 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flg9z\" (UniqueName: \"kubernetes.io/projected/76a8fa42-4cac-4605-969b-7f5a2e55d7ad-kube-api-access-flg9z\") pod \"glance-db-create-6xc42\" (UID: \"76a8fa42-4cac-4605-969b-7f5a2e55d7ad\") " pod="openstack/glance-db-create-6xc42" Feb 02 10:55:03 crc kubenswrapper[4901]: I0202 10:55:03.215471 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76a8fa42-4cac-4605-969b-7f5a2e55d7ad-operator-scripts\") pod \"glance-db-create-6xc42\" (UID: \"76a8fa42-4cac-4605-969b-7f5a2e55d7ad\") " pod="openstack/glance-db-create-6xc42" Feb 02 10:55:03 crc kubenswrapper[4901]: I0202 10:55:03.317784 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flg9z\" (UniqueName: \"kubernetes.io/projected/76a8fa42-4cac-4605-969b-7f5a2e55d7ad-kube-api-access-flg9z\") pod \"glance-db-create-6xc42\" (UID: \"76a8fa42-4cac-4605-969b-7f5a2e55d7ad\") " pod="openstack/glance-db-create-6xc42" Feb 02 10:55:03 crc kubenswrapper[4901]: I0202 10:55:03.317880 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76a8fa42-4cac-4605-969b-7f5a2e55d7ad-operator-scripts\") pod \"glance-db-create-6xc42\" (UID: \"76a8fa42-4cac-4605-969b-7f5a2e55d7ad\") " pod="openstack/glance-db-create-6xc42" Feb 02 10:55:03 crc kubenswrapper[4901]: I0202 10:55:03.320645 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76a8fa42-4cac-4605-969b-7f5a2e55d7ad-operator-scripts\") pod \"glance-db-create-6xc42\" (UID: \"76a8fa42-4cac-4605-969b-7f5a2e55d7ad\") " pod="openstack/glance-db-create-6xc42" Feb 02 10:55:03 crc kubenswrapper[4901]: I0202 10:55:03.340688 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flg9z\" (UniqueName: \"kubernetes.io/projected/76a8fa42-4cac-4605-969b-7f5a2e55d7ad-kube-api-access-flg9z\") pod \"glance-db-create-6xc42\" (UID: \"76a8fa42-4cac-4605-969b-7f5a2e55d7ad\") " pod="openstack/glance-db-create-6xc42" Feb 02 10:55:03 crc kubenswrapper[4901]: I0202 10:55:03.378851 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-2ac1-account-create-update-q6b8n"] Feb 02 10:55:03 crc kubenswrapper[4901]: I0202 10:55:03.379918 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-2ac1-account-create-update-q6b8n" Feb 02 10:55:03 crc kubenswrapper[4901]: I0202 10:55:03.382013 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 02 10:55:03 crc kubenswrapper[4901]: I0202 10:55:03.418861 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48j8t\" (UniqueName: \"kubernetes.io/projected/61290422-4438-485f-beaa-27fbcfdb1ea2-kube-api-access-48j8t\") pod \"glance-2ac1-account-create-update-q6b8n\" (UID: \"61290422-4438-485f-beaa-27fbcfdb1ea2\") " pod="openstack/glance-2ac1-account-create-update-q6b8n" Feb 02 10:55:03 crc kubenswrapper[4901]: I0202 10:55:03.418935 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61290422-4438-485f-beaa-27fbcfdb1ea2-operator-scripts\") pod \"glance-2ac1-account-create-update-q6b8n\" (UID: \"61290422-4438-485f-beaa-27fbcfdb1ea2\") " pod="openstack/glance-2ac1-account-create-update-q6b8n" Feb 02 10:55:03 crc kubenswrapper[4901]: I0202 10:55:03.432321 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-2ac1-account-create-update-q6b8n"] Feb 02 10:55:03 crc kubenswrapper[4901]: I0202 10:55:03.487010 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-6xc42" Feb 02 10:55:03 crc kubenswrapper[4901]: I0202 10:55:03.520403 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61290422-4438-485f-beaa-27fbcfdb1ea2-operator-scripts\") pod \"glance-2ac1-account-create-update-q6b8n\" (UID: \"61290422-4438-485f-beaa-27fbcfdb1ea2\") " pod="openstack/glance-2ac1-account-create-update-q6b8n" Feb 02 10:55:03 crc kubenswrapper[4901]: I0202 10:55:03.520547 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48j8t\" (UniqueName: \"kubernetes.io/projected/61290422-4438-485f-beaa-27fbcfdb1ea2-kube-api-access-48j8t\") pod \"glance-2ac1-account-create-update-q6b8n\" (UID: \"61290422-4438-485f-beaa-27fbcfdb1ea2\") " pod="openstack/glance-2ac1-account-create-update-q6b8n" Feb 02 10:55:03 crc kubenswrapper[4901]: I0202 10:55:03.521240 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61290422-4438-485f-beaa-27fbcfdb1ea2-operator-scripts\") pod \"glance-2ac1-account-create-update-q6b8n\" (UID: \"61290422-4438-485f-beaa-27fbcfdb1ea2\") " pod="openstack/glance-2ac1-account-create-update-q6b8n" Feb 02 10:55:03 crc kubenswrapper[4901]: I0202 10:55:03.546493 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48j8t\" (UniqueName: \"kubernetes.io/projected/61290422-4438-485f-beaa-27fbcfdb1ea2-kube-api-access-48j8t\") pod \"glance-2ac1-account-create-update-q6b8n\" (UID: \"61290422-4438-485f-beaa-27fbcfdb1ea2\") " pod="openstack/glance-2ac1-account-create-update-q6b8n" Feb 02 10:55:03 crc kubenswrapper[4901]: I0202 10:55:03.696082 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-2ac1-account-create-update-q6b8n" Feb 02 10:55:03 crc kubenswrapper[4901]: I0202 10:55:03.962769 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-6xc42"] Feb 02 10:55:04 crc kubenswrapper[4901]: I0202 10:55:04.036841 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 02 10:55:04 crc kubenswrapper[4901]: I0202 10:55:04.181640 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-2ac1-account-create-update-q6b8n"] Feb 02 10:55:04 crc kubenswrapper[4901]: W0202 10:55:04.184983 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61290422_4438_485f_beaa_27fbcfdb1ea2.slice/crio-f52a825bc75a750f223bad3a3fbb5eb2c65e42d4d843f4aa6706bdf524593779 WatchSource:0}: Error finding container f52a825bc75a750f223bad3a3fbb5eb2c65e42d4d843f4aa6706bdf524593779: Status 404 returned error can't find the container with id f52a825bc75a750f223bad3a3fbb5eb2c65e42d4d843f4aa6706bdf524593779 Feb 02 10:55:04 crc kubenswrapper[4901]: I0202 10:55:04.275712 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-6xc42" event={"ID":"76a8fa42-4cac-4605-969b-7f5a2e55d7ad","Type":"ContainerStarted","Data":"f2555d25f9b9a661efeae92296056ddbdbb2475b7a261e776ed1ec9f9ce58f73"} Feb 02 10:55:04 crc kubenswrapper[4901]: I0202 10:55:04.275781 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-6xc42" event={"ID":"76a8fa42-4cac-4605-969b-7f5a2e55d7ad","Type":"ContainerStarted","Data":"cdc8140bea0ec28cb38704fbd83600c679e58daca2a3667889ac98cfdf0629b7"} Feb 02 10:55:04 crc kubenswrapper[4901]: I0202 10:55:04.280616 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2ac1-account-create-update-q6b8n" event={"ID":"61290422-4438-485f-beaa-27fbcfdb1ea2","Type":"ContainerStarted","Data":"f52a825bc75a750f223bad3a3fbb5eb2c65e42d4d843f4aa6706bdf524593779"} Feb 02 10:55:04 crc kubenswrapper[4901]: I0202 10:55:04.295532 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-6xc42" podStartSLOduration=1.295513764 podStartE2EDuration="1.295513764s" podCreationTimestamp="2026-02-02 10:55:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:55:04.292206961 +0000 UTC m=+991.310547057" watchObservedRunningTime="2026-02-02 10:55:04.295513764 +0000 UTC m=+991.313853860" Feb 02 10:55:04 crc kubenswrapper[4901]: I0202 10:55:04.765335 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-zt89s"] Feb 02 10:55:04 crc kubenswrapper[4901]: I0202 10:55:04.771740 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-zt89s"] Feb 02 10:55:04 crc kubenswrapper[4901]: I0202 10:55:04.858967 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-q5qqw"] Feb 02 10:55:04 crc kubenswrapper[4901]: I0202 10:55:04.860641 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-q5qqw" Feb 02 10:55:04 crc kubenswrapper[4901]: I0202 10:55:04.863536 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 02 10:55:04 crc kubenswrapper[4901]: I0202 10:55:04.866518 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-q5qqw"] Feb 02 10:55:04 crc kubenswrapper[4901]: I0202 10:55:04.868998 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cc410cd-32c8-45d4-8d51-a82004573724-operator-scripts\") pod \"root-account-create-update-q5qqw\" (UID: \"4cc410cd-32c8-45d4-8d51-a82004573724\") " pod="openstack/root-account-create-update-q5qqw" Feb 02 10:55:04 crc kubenswrapper[4901]: I0202 10:55:04.869048 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7mg4\" (UniqueName: \"kubernetes.io/projected/4cc410cd-32c8-45d4-8d51-a82004573724-kube-api-access-j7mg4\") pod \"root-account-create-update-q5qqw\" (UID: \"4cc410cd-32c8-45d4-8d51-a82004573724\") " pod="openstack/root-account-create-update-q5qqw" Feb 02 10:55:04 crc kubenswrapper[4901]: I0202 10:55:04.971271 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cc410cd-32c8-45d4-8d51-a82004573724-operator-scripts\") pod \"root-account-create-update-q5qqw\" (UID: \"4cc410cd-32c8-45d4-8d51-a82004573724\") " pod="openstack/root-account-create-update-q5qqw" Feb 02 10:55:04 crc kubenswrapper[4901]: I0202 10:55:04.971337 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7mg4\" (UniqueName: \"kubernetes.io/projected/4cc410cd-32c8-45d4-8d51-a82004573724-kube-api-access-j7mg4\") pod \"root-account-create-update-q5qqw\" (UID: \"4cc410cd-32c8-45d4-8d51-a82004573724\") " pod="openstack/root-account-create-update-q5qqw" Feb 02 10:55:04 crc kubenswrapper[4901]: I0202 10:55:04.972201 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cc410cd-32c8-45d4-8d51-a82004573724-operator-scripts\") pod \"root-account-create-update-q5qqw\" (UID: \"4cc410cd-32c8-45d4-8d51-a82004573724\") " pod="openstack/root-account-create-update-q5qqw" Feb 02 10:55:04 crc kubenswrapper[4901]: I0202 10:55:04.993001 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7mg4\" (UniqueName: \"kubernetes.io/projected/4cc410cd-32c8-45d4-8d51-a82004573724-kube-api-access-j7mg4\") pod \"root-account-create-update-q5qqw\" (UID: \"4cc410cd-32c8-45d4-8d51-a82004573724\") " pod="openstack/root-account-create-update-q5qqw" Feb 02 10:55:05 crc kubenswrapper[4901]: I0202 10:55:05.194603 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-q5qqw" Feb 02 10:55:05 crc kubenswrapper[4901]: I0202 10:55:05.319091 4901 generic.go:334] "Generic (PLEG): container finished" podID="76a8fa42-4cac-4605-969b-7f5a2e55d7ad" containerID="f2555d25f9b9a661efeae92296056ddbdbb2475b7a261e776ed1ec9f9ce58f73" exitCode=0 Feb 02 10:55:05 crc kubenswrapper[4901]: I0202 10:55:05.319185 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-6xc42" event={"ID":"76a8fa42-4cac-4605-969b-7f5a2e55d7ad","Type":"ContainerDied","Data":"f2555d25f9b9a661efeae92296056ddbdbb2475b7a261e776ed1ec9f9ce58f73"} Feb 02 10:55:05 crc kubenswrapper[4901]: I0202 10:55:05.321472 4901 generic.go:334] "Generic (PLEG): container finished" podID="61290422-4438-485f-beaa-27fbcfdb1ea2" containerID="b6efe5eb1d61bb0403e2b600ead2829766e21d0d782223ed4a922a3964a5dc36" exitCode=0 Feb 02 10:55:05 crc kubenswrapper[4901]: I0202 10:55:05.321533 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2ac1-account-create-update-q6b8n" event={"ID":"61290422-4438-485f-beaa-27fbcfdb1ea2","Type":"ContainerDied","Data":"b6efe5eb1d61bb0403e2b600ead2829766e21d0d782223ed4a922a3964a5dc36"} Feb 02 10:55:05 crc kubenswrapper[4901]: I0202 10:55:05.327286 4901 generic.go:334] "Generic (PLEG): container finished" podID="24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba" containerID="a9e601667ecf3eb63fc82c38dd3226c81472105dfa7af588e3ddfba7431490ab" exitCode=0 Feb 02 10:55:05 crc kubenswrapper[4901]: I0202 10:55:05.327324 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba","Type":"ContainerDied","Data":"a9e601667ecf3eb63fc82c38dd3226c81472105dfa7af588e3ddfba7431490ab"} Feb 02 10:55:05 crc kubenswrapper[4901]: I0202 10:55:05.674109 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-q5qqw"] Feb 02 10:55:05 crc kubenswrapper[4901]: I0202 10:55:05.702005 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62cb28c0-75ca-46e8-ad3d-3f282a4d632a" path="/var/lib/kubelet/pods/62cb28c0-75ca-46e8-ad3d-3f282a4d632a/volumes" Feb 02 10:55:06 crc kubenswrapper[4901]: I0202 10:55:06.335295 4901 generic.go:334] "Generic (PLEG): container finished" podID="4cc410cd-32c8-45d4-8d51-a82004573724" containerID="eed61737ceb58cd57445bc8e10c8269d9bfc0454083e2f168c847199f7ed3f04" exitCode=0 Feb 02 10:55:06 crc kubenswrapper[4901]: I0202 10:55:06.336453 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-q5qqw" event={"ID":"4cc410cd-32c8-45d4-8d51-a82004573724","Type":"ContainerDied","Data":"eed61737ceb58cd57445bc8e10c8269d9bfc0454083e2f168c847199f7ed3f04"} Feb 02 10:55:06 crc kubenswrapper[4901]: I0202 10:55:06.336540 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-q5qqw" event={"ID":"4cc410cd-32c8-45d4-8d51-a82004573724","Type":"ContainerStarted","Data":"c3e2720485b767b85247f90384e70df3d4dc74d86b080e8290b78d62990ebd2c"} Feb 02 10:55:06 crc kubenswrapper[4901]: I0202 10:55:06.339173 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba","Type":"ContainerStarted","Data":"2a467b9aafc4d75aab9dae67fe89328c83b39e187a996d2aba454a542a47ac2d"} Feb 02 10:55:06 crc kubenswrapper[4901]: I0202 10:55:06.339709 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 
02 10:55:06 crc kubenswrapper[4901]: I0202 10:55:06.418541 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.71642112 podStartE2EDuration="1m3.418519091s" podCreationTimestamp="2026-02-02 10:54:03 +0000 UTC" firstStartedPulling="2026-02-02 10:54:05.268436404 +0000 UTC m=+932.286776500" lastFinishedPulling="2026-02-02 10:54:30.970534345 +0000 UTC m=+957.988874471" observedRunningTime="2026-02-02 10:55:06.410684475 +0000 UTC m=+993.429024571" watchObservedRunningTime="2026-02-02 10:55:06.418519091 +0000 UTC m=+993.436859197" Feb 02 10:55:06 crc kubenswrapper[4901]: I0202 10:55:06.701384 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4b4d5a91-d330-499c-9123-35b58d8c55d5-etc-swift\") pod \"swift-storage-0\" (UID: \"4b4d5a91-d330-499c-9123-35b58d8c55d5\") " pod="openstack/swift-storage-0" Feb 02 10:55:06 crc kubenswrapper[4901]: E0202 10:55:06.701598 4901 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 02 10:55:06 crc kubenswrapper[4901]: E0202 10:55:06.701620 4901 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 02 10:55:06 crc kubenswrapper[4901]: E0202 10:55:06.701681 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b4d5a91-d330-499c-9123-35b58d8c55d5-etc-swift podName:4b4d5a91-d330-499c-9123-35b58d8c55d5 nodeName:}" failed. No retries permitted until 2026-02-02 10:55:22.701662847 +0000 UTC m=+1009.720002943 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4b4d5a91-d330-499c-9123-35b58d8c55d5-etc-swift") pod "swift-storage-0" (UID: "4b4d5a91-d330-499c-9123-35b58d8c55d5") : configmap "swift-ring-files" not found Feb 02 10:55:06 crc kubenswrapper[4901]: I0202 10:55:06.818128 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-6xc42" Feb 02 10:55:06 crc kubenswrapper[4901]: I0202 10:55:06.831103 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-2ac1-account-create-update-q6b8n"
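The "Observed pod startup duration" entry above records two figures: podStartE2EDuration is pod creation to observed running, while podStartSLOduration excludes the image-pull window, and the logged timestamps reproduce both numbers exactly. A throwaway check, plain arithmetic on the entry's own values rather than kubelet code:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Timestamps copied verbatim from the rabbitmq-server-0 entry above.
	parse := func(s string) time.Time {
		t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
		if err != nil {
			panic(err)
		}
		return t
	}
	created := parse("2026-02-02 10:54:03 +0000 UTC")
	firstPull := parse("2026-02-02 10:54:05.268436404 +0000 UTC")
	lastPull := parse("2026-02-02 10:54:30.970534345 +0000 UTC")
	observed := parse("2026-02-02 10:55:06.418519091 +0000 UTC")

	e2e := observed.Sub(created)         // podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // pull window excluded from the SLO figure
	fmt.Println("E2E:", e2e)             // 1m3.418519091s
	fmt.Println("SLO:", slo)             // 37.71642115s, matching the log up to rounding
}
```

63.418519091s minus the 25.702097941s pull window gives 37.716421150s, agreeing with the logged podStartSLOduration=37.71642112 up to float rounding; the swift-ring-rebalance-tmj6n entry earlier obeys the same identity (6.275487977s - 3.608360688s = 2.667127289s).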
Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.005589 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flg9z\" (UniqueName: \"kubernetes.io/projected/76a8fa42-4cac-4605-969b-7f5a2e55d7ad-kube-api-access-flg9z\") pod \"76a8fa42-4cac-4605-969b-7f5a2e55d7ad\" (UID: \"76a8fa42-4cac-4605-969b-7f5a2e55d7ad\") " Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.005643 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48j8t\" (UniqueName: \"kubernetes.io/projected/61290422-4438-485f-beaa-27fbcfdb1ea2-kube-api-access-48j8t\") pod \"61290422-4438-485f-beaa-27fbcfdb1ea2\" (UID: \"61290422-4438-485f-beaa-27fbcfdb1ea2\") " Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.005683 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76a8fa42-4cac-4605-969b-7f5a2e55d7ad-operator-scripts\") pod \"76a8fa42-4cac-4605-969b-7f5a2e55d7ad\" (UID: \"76a8fa42-4cac-4605-969b-7f5a2e55d7ad\") " Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.005704 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61290422-4438-485f-beaa-27fbcfdb1ea2-operator-scripts\") pod \"61290422-4438-485f-beaa-27fbcfdb1ea2\" (UID: \"61290422-4438-485f-beaa-27fbcfdb1ea2\") " Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.006270 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76a8fa42-4cac-4605-969b-7f5a2e55d7ad-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "76a8fa42-4cac-4605-969b-7f5a2e55d7ad" (UID: "76a8fa42-4cac-4605-969b-7f5a2e55d7ad"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.006370 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61290422-4438-485f-beaa-27fbcfdb1ea2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "61290422-4438-485f-beaa-27fbcfdb1ea2" (UID: "61290422-4438-485f-beaa-27fbcfdb1ea2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.011152 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76a8fa42-4cac-4605-969b-7f5a2e55d7ad-kube-api-access-flg9z" (OuterVolumeSpecName: "kube-api-access-flg9z") pod "76a8fa42-4cac-4605-969b-7f5a2e55d7ad" (UID: "76a8fa42-4cac-4605-969b-7f5a2e55d7ad"). InnerVolumeSpecName "kube-api-access-flg9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.011244 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61290422-4438-485f-beaa-27fbcfdb1ea2-kube-api-access-48j8t" (OuterVolumeSpecName: "kube-api-access-48j8t") pod "61290422-4438-485f-beaa-27fbcfdb1ea2" (UID: "61290422-4438-485f-beaa-27fbcfdb1ea2"). InnerVolumeSpecName "kube-api-access-48j8t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.107990 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flg9z\" (UniqueName: \"kubernetes.io/projected/76a8fa42-4cac-4605-969b-7f5a2e55d7ad-kube-api-access-flg9z\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.108031 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48j8t\" (UniqueName: \"kubernetes.io/projected/61290422-4438-485f-beaa-27fbcfdb1ea2-kube-api-access-48j8t\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.108040 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76a8fa42-4cac-4605-969b-7f5a2e55d7ad-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.108053 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61290422-4438-485f-beaa-27fbcfdb1ea2-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.349650 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-6xc42" event={"ID":"76a8fa42-4cac-4605-969b-7f5a2e55d7ad","Type":"ContainerDied","Data":"cdc8140bea0ec28cb38704fbd83600c679e58daca2a3667889ac98cfdf0629b7"} Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.349698 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cdc8140bea0ec28cb38704fbd83600c679e58daca2a3667889ac98cfdf0629b7" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.349679 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-6xc42" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.351865 4901 generic.go:334] "Generic (PLEG): container finished" podID="882c42d4-7650-4f0d-8973-ba9ddcbb6800" containerID="76d1fee44c176ee03708ec12bea996a20fee6321cf644aa693b54ba0f3089b4e" exitCode=0 Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.351942 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-tmj6n" event={"ID":"882c42d4-7650-4f0d-8973-ba9ddcbb6800","Type":"ContainerDied","Data":"76d1fee44c176ee03708ec12bea996a20fee6321cf644aa693b54ba0f3089b4e"} Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.354133 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2ac1-account-create-update-q6b8n" event={"ID":"61290422-4438-485f-beaa-27fbcfdb1ea2","Type":"ContainerDied","Data":"f52a825bc75a750f223bad3a3fbb5eb2c65e42d4d843f4aa6706bdf524593779"} Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.354165 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f52a825bc75a750f223bad3a3fbb5eb2c65e42d4d843f4aa6706bdf524593779" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.354314 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-2ac1-account-create-update-q6b8n" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.394382 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-z99sf"] Feb 02 10:55:07 crc kubenswrapper[4901]: E0202 10:55:07.394843 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61290422-4438-485f-beaa-27fbcfdb1ea2" containerName="mariadb-account-create-update" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.394866 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="61290422-4438-485f-beaa-27fbcfdb1ea2" containerName="mariadb-account-create-update" Feb 02 10:55:07 crc kubenswrapper[4901]: E0202 10:55:07.394885 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76a8fa42-4cac-4605-969b-7f5a2e55d7ad" containerName="mariadb-database-create" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.394899 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="76a8fa42-4cac-4605-969b-7f5a2e55d7ad" containerName="mariadb-database-create" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.395134 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="61290422-4438-485f-beaa-27fbcfdb1ea2" containerName="mariadb-account-create-update" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.395158 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="76a8fa42-4cac-4605-969b-7f5a2e55d7ad" containerName="mariadb-database-create" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.395990 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-z99sf" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.404111 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-z99sf"] Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.491854 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-9c55-account-create-update-x27xj"] Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.493339 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-9c55-account-create-update-x27xj" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.495252 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.520039 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g82r\" (UniqueName: \"kubernetes.io/projected/5c5064f5-9947-45f6-9f83-97dcdbfbc466-kube-api-access-4g82r\") pod \"keystone-db-create-z99sf\" (UID: \"5c5064f5-9947-45f6-9f83-97dcdbfbc466\") " pod="openstack/keystone-db-create-z99sf" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.520116 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c5064f5-9947-45f6-9f83-97dcdbfbc466-operator-scripts\") pod \"keystone-db-create-z99sf\" (UID: \"5c5064f5-9947-45f6-9f83-97dcdbfbc466\") " pod="openstack/keystone-db-create-z99sf" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.521843 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-9c55-account-create-update-x27xj"] Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.621625 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g82r\" (UniqueName: \"kubernetes.io/projected/5c5064f5-9947-45f6-9f83-97dcdbfbc466-kube-api-access-4g82r\") pod \"keystone-db-create-z99sf\" (UID: \"5c5064f5-9947-45f6-9f83-97dcdbfbc466\") " pod="openstack/keystone-db-create-z99sf" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.621716 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c5064f5-9947-45f6-9f83-97dcdbfbc466-operator-scripts\") pod \"keystone-db-create-z99sf\" (UID: \"5c5064f5-9947-45f6-9f83-97dcdbfbc466\") " pod="openstack/keystone-db-create-z99sf" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.621759 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnq7h\" (UniqueName: \"kubernetes.io/projected/6105ae6c-7921-41a9-ad59-2b43c6ab77ed-kube-api-access-rnq7h\") pod \"keystone-9c55-account-create-update-x27xj\" (UID: \"6105ae6c-7921-41a9-ad59-2b43c6ab77ed\") " pod="openstack/keystone-9c55-account-create-update-x27xj" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.621821 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6105ae6c-7921-41a9-ad59-2b43c6ab77ed-operator-scripts\") pod \"keystone-9c55-account-create-update-x27xj\" (UID: \"6105ae6c-7921-41a9-ad59-2b43c6ab77ed\") " pod="openstack/keystone-9c55-account-create-update-x27xj" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.622693 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c5064f5-9947-45f6-9f83-97dcdbfbc466-operator-scripts\") pod \"keystone-db-create-z99sf\" (UID: \"5c5064f5-9947-45f6-9f83-97dcdbfbc466\") " pod="openstack/keystone-db-create-z99sf" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.641519 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g82r\" (UniqueName: \"kubernetes.io/projected/5c5064f5-9947-45f6-9f83-97dcdbfbc466-kube-api-access-4g82r\") pod 
\"keystone-db-create-z99sf\" (UID: \"5c5064f5-9947-45f6-9f83-97dcdbfbc466\") " pod="openstack/keystone-db-create-z99sf" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.696717 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-zvnzt"] Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.698057 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-zvnzt" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.723797 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnq7h\" (UniqueName: \"kubernetes.io/projected/6105ae6c-7921-41a9-ad59-2b43c6ab77ed-kube-api-access-rnq7h\") pod \"keystone-9c55-account-create-update-x27xj\" (UID: \"6105ae6c-7921-41a9-ad59-2b43c6ab77ed\") " pod="openstack/keystone-9c55-account-create-update-x27xj" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.723893 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6105ae6c-7921-41a9-ad59-2b43c6ab77ed-operator-scripts\") pod \"keystone-9c55-account-create-update-x27xj\" (UID: \"6105ae6c-7921-41a9-ad59-2b43c6ab77ed\") " pod="openstack/keystone-9c55-account-create-update-x27xj" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.725205 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6105ae6c-7921-41a9-ad59-2b43c6ab77ed-operator-scripts\") pod \"keystone-9c55-account-create-update-x27xj\" (UID: \"6105ae6c-7921-41a9-ad59-2b43c6ab77ed\") " pod="openstack/keystone-9c55-account-create-update-x27xj" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.726796 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-zvnzt"] Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.734068 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-z99sf" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.743914 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-q5qqw" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.762015 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-b701-account-create-update-4l6hg"] Feb 02 10:55:07 crc kubenswrapper[4901]: E0202 10:55:07.771587 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cc410cd-32c8-45d4-8d51-a82004573724" containerName="mariadb-account-create-update" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.771608 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cc410cd-32c8-45d4-8d51-a82004573724" containerName="mariadb-account-create-update" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.771809 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cc410cd-32c8-45d4-8d51-a82004573724" containerName="mariadb-account-create-update" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.766395 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnq7h\" (UniqueName: \"kubernetes.io/projected/6105ae6c-7921-41a9-ad59-2b43c6ab77ed-kube-api-access-rnq7h\") pod \"keystone-9c55-account-create-update-x27xj\" (UID: \"6105ae6c-7921-41a9-ad59-2b43c6ab77ed\") " pod="openstack/keystone-9c55-account-create-update-x27xj" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.772369 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b701-account-create-update-4l6hg" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.775869 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.825913 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cc410cd-32c8-45d4-8d51-a82004573724-operator-scripts\") pod \"4cc410cd-32c8-45d4-8d51-a82004573724\" (UID: \"4cc410cd-32c8-45d4-8d51-a82004573724\") " Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.826213 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7mg4\" (UniqueName: \"kubernetes.io/projected/4cc410cd-32c8-45d4-8d51-a82004573724-kube-api-access-j7mg4\") pod \"4cc410cd-32c8-45d4-8d51-a82004573724\" (UID: \"4cc410cd-32c8-45d4-8d51-a82004573724\") " Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.826595 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dcbca9e-ef0b-443e-9594-95dfee5ab743-operator-scripts\") pod \"placement-db-create-zvnzt\" (UID: \"2dcbca9e-ef0b-443e-9594-95dfee5ab743\") " pod="openstack/placement-db-create-zvnzt" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.826689 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsmhr\" (UniqueName: \"kubernetes.io/projected/2dcbca9e-ef0b-443e-9594-95dfee5ab743-kube-api-access-xsmhr\") pod \"placement-db-create-zvnzt\" (UID: \"2dcbca9e-ef0b-443e-9594-95dfee5ab743\") " pod="openstack/placement-db-create-zvnzt" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.827672 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cc410cd-32c8-45d4-8d51-a82004573724-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4cc410cd-32c8-45d4-8d51-a82004573724" (UID: 
"4cc410cd-32c8-45d4-8d51-a82004573724"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.829871 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9c55-account-create-update-x27xj" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.840749 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.840838 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cc410cd-32c8-45d4-8d51-a82004573724-kube-api-access-j7mg4" (OuterVolumeSpecName: "kube-api-access-j7mg4") pod "4cc410cd-32c8-45d4-8d51-a82004573724" (UID: "4cc410cd-32c8-45d4-8d51-a82004573724"). InnerVolumeSpecName "kube-api-access-j7mg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.840941 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.851066 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b701-account-create-update-4l6hg"] Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.928469 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l8nd\" (UniqueName: \"kubernetes.io/projected/d896dd47-8297-41aa-bd3e-9a42a936b474-kube-api-access-9l8nd\") pod \"placement-b701-account-create-update-4l6hg\" (UID: \"d896dd47-8297-41aa-bd3e-9a42a936b474\") " pod="openstack/placement-b701-account-create-update-4l6hg" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.928556 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d896dd47-8297-41aa-bd3e-9a42a936b474-operator-scripts\") pod \"placement-b701-account-create-update-4l6hg\" (UID: \"d896dd47-8297-41aa-bd3e-9a42a936b474\") " pod="openstack/placement-b701-account-create-update-4l6hg" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.928624 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dcbca9e-ef0b-443e-9594-95dfee5ab743-operator-scripts\") pod \"placement-db-create-zvnzt\" (UID: \"2dcbca9e-ef0b-443e-9594-95dfee5ab743\") " pod="openstack/placement-db-create-zvnzt" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.928665 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsmhr\" (UniqueName: \"kubernetes.io/projected/2dcbca9e-ef0b-443e-9594-95dfee5ab743-kube-api-access-xsmhr\") pod \"placement-db-create-zvnzt\" (UID: \"2dcbca9e-ef0b-443e-9594-95dfee5ab743\") " pod="openstack/placement-db-create-zvnzt" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.928706 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/4cc410cd-32c8-45d4-8d51-a82004573724-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.928716 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7mg4\" (UniqueName: \"kubernetes.io/projected/4cc410cd-32c8-45d4-8d51-a82004573724-kube-api-access-j7mg4\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.929289 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dcbca9e-ef0b-443e-9594-95dfee5ab743-operator-scripts\") pod \"placement-db-create-zvnzt\" (UID: \"2dcbca9e-ef0b-443e-9594-95dfee5ab743\") " pod="openstack/placement-db-create-zvnzt" Feb 02 10:55:07 crc kubenswrapper[4901]: I0202 10:55:07.951252 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsmhr\" (UniqueName: \"kubernetes.io/projected/2dcbca9e-ef0b-443e-9594-95dfee5ab743-kube-api-access-xsmhr\") pod \"placement-db-create-zvnzt\" (UID: \"2dcbca9e-ef0b-443e-9594-95dfee5ab743\") " pod="openstack/placement-db-create-zvnzt" Feb 02 10:55:08 crc kubenswrapper[4901]: I0202 10:55:08.030777 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9l8nd\" (UniqueName: \"kubernetes.io/projected/d896dd47-8297-41aa-bd3e-9a42a936b474-kube-api-access-9l8nd\") pod \"placement-b701-account-create-update-4l6hg\" (UID: \"d896dd47-8297-41aa-bd3e-9a42a936b474\") " pod="openstack/placement-b701-account-create-update-4l6hg" Feb 02 10:55:08 crc kubenswrapper[4901]: I0202 10:55:08.030868 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d896dd47-8297-41aa-bd3e-9a42a936b474-operator-scripts\") pod \"placement-b701-account-create-update-4l6hg\" (UID: \"d896dd47-8297-41aa-bd3e-9a42a936b474\") " pod="openstack/placement-b701-account-create-update-4l6hg" Feb 02 10:55:08 crc kubenswrapper[4901]: I0202 10:55:08.031809 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d896dd47-8297-41aa-bd3e-9a42a936b474-operator-scripts\") pod \"placement-b701-account-create-update-4l6hg\" (UID: \"d896dd47-8297-41aa-bd3e-9a42a936b474\") " pod="openstack/placement-b701-account-create-update-4l6hg" Feb 02 10:55:08 crc kubenswrapper[4901]: I0202 10:55:08.047940 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l8nd\" (UniqueName: \"kubernetes.io/projected/d896dd47-8297-41aa-bd3e-9a42a936b474-kube-api-access-9l8nd\") pod \"placement-b701-account-create-update-4l6hg\" (UID: \"d896dd47-8297-41aa-bd3e-9a42a936b474\") " pod="openstack/placement-b701-account-create-update-4l6hg" Feb 02 10:55:08 crc kubenswrapper[4901]: I0202 10:55:08.061576 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-zvnzt" Feb 02 10:55:08 crc kubenswrapper[4901]: I0202 10:55:08.185956 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-b701-account-create-update-4l6hg" Feb 02 10:55:08 crc kubenswrapper[4901]: I0202 10:55:08.250816 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-z99sf"] Feb 02 10:55:08 crc kubenswrapper[4901]: W0202 10:55:08.259996 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c5064f5_9947_45f6_9f83_97dcdbfbc466.slice/crio-8c90889ae3c409a73523c5335104a1f8299a942b1a0b1a62bf9157603295cc4e WatchSource:0}: Error finding container 8c90889ae3c409a73523c5335104a1f8299a942b1a0b1a62bf9157603295cc4e: Status 404 returned error can't find the container with id 8c90889ae3c409a73523c5335104a1f8299a942b1a0b1a62bf9157603295cc4e Feb 02 10:55:08 crc kubenswrapper[4901]: I0202 10:55:08.287648 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-zvnzt"] Feb 02 10:55:08 crc kubenswrapper[4901]: I0202 10:55:08.365000 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-z99sf" event={"ID":"5c5064f5-9947-45f6-9f83-97dcdbfbc466","Type":"ContainerStarted","Data":"8c90889ae3c409a73523c5335104a1f8299a942b1a0b1a62bf9157603295cc4e"} Feb 02 10:55:08 crc kubenswrapper[4901]: I0202 10:55:08.366040 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-zvnzt" event={"ID":"2dcbca9e-ef0b-443e-9594-95dfee5ab743","Type":"ContainerStarted","Data":"3965732b73b7110ba97d312d6f1ee02b402fc0f75e7c57fbac79f6bc72a2e555"} Feb 02 10:55:08 crc kubenswrapper[4901]: I0202 10:55:08.367436 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-q5qqw" Feb 02 10:55:08 crc kubenswrapper[4901]: I0202 10:55:08.367478 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-q5qqw" event={"ID":"4cc410cd-32c8-45d4-8d51-a82004573724","Type":"ContainerDied","Data":"c3e2720485b767b85247f90384e70df3d4dc74d86b080e8290b78d62990ebd2c"} Feb 02 10:55:08 crc kubenswrapper[4901]: I0202 10:55:08.367500 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3e2720485b767b85247f90384e70df3d4dc74d86b080e8290b78d62990ebd2c" Feb 02 10:55:08 crc kubenswrapper[4901]: I0202 10:55:08.371738 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-9c55-account-create-update-x27xj"] Feb 02 10:55:08 crc kubenswrapper[4901]: I0202 10:55:08.379129 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-rtv7m" podUID="bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7" containerName="ovn-controller" probeResult="failure" output=< Feb 02 10:55:08 crc kubenswrapper[4901]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 02 10:55:08 crc kubenswrapper[4901]: > Feb 02 10:55:08 crc kubenswrapper[4901]: I0202 10:55:08.523911 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b701-account-create-update-4l6hg"] Feb 02 10:55:08 crc kubenswrapper[4901]: W0202 10:55:08.539892 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd896dd47_8297_41aa_bd3e_9a42a936b474.slice/crio-d451fe5fc2044d6c78d7af6074d1e2b1bd963511054b82519a7118d7b0b31ab5 WatchSource:0}: Error finding container d451fe5fc2044d6c78d7af6074d1e2b1bd963511054b82519a7118d7b0b31ab5: Status 404 returned error can't find the 
container with id d451fe5fc2044d6c78d7af6074d1e2b1bd963511054b82519a7118d7b0b31ab5 Feb 02 10:55:08 crc kubenswrapper[4901]: I0202 10:55:08.612734 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-l8vxk"] Feb 02 10:55:08 crc kubenswrapper[4901]: I0202 10:55:08.613818 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-l8vxk" Feb 02 10:55:08 crc kubenswrapper[4901]: I0202 10:55:08.616506 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 02 10:55:08 crc kubenswrapper[4901]: I0202 10:55:08.616729 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-dqjmw" Feb 02 10:55:08 crc kubenswrapper[4901]: I0202 10:55:08.624582 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-l8vxk"] Feb 02 10:55:08 crc kubenswrapper[4901]: I0202 10:55:08.745956 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcz8g\" (UniqueName: \"kubernetes.io/projected/06b582a5-a4bf-4c36-974a-0cf96389bb90-kube-api-access-tcz8g\") pod \"glance-db-sync-l8vxk\" (UID: \"06b582a5-a4bf-4c36-974a-0cf96389bb90\") " pod="openstack/glance-db-sync-l8vxk" Feb 02 10:55:08 crc kubenswrapper[4901]: I0202 10:55:08.746092 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b582a5-a4bf-4c36-974a-0cf96389bb90-combined-ca-bundle\") pod \"glance-db-sync-l8vxk\" (UID: \"06b582a5-a4bf-4c36-974a-0cf96389bb90\") " pod="openstack/glance-db-sync-l8vxk" Feb 02 10:55:08 crc kubenswrapper[4901]: I0202 10:55:08.746115 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/06b582a5-a4bf-4c36-974a-0cf96389bb90-db-sync-config-data\") pod \"glance-db-sync-l8vxk\" (UID: \"06b582a5-a4bf-4c36-974a-0cf96389bb90\") " pod="openstack/glance-db-sync-l8vxk" Feb 02 10:55:08 crc kubenswrapper[4901]: I0202 10:55:08.746177 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06b582a5-a4bf-4c36-974a-0cf96389bb90-config-data\") pod \"glance-db-sync-l8vxk\" (UID: \"06b582a5-a4bf-4c36-974a-0cf96389bb90\") " pod="openstack/glance-db-sync-l8vxk" Feb 02 10:55:08 crc kubenswrapper[4901]: I0202 10:55:08.847978 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b582a5-a4bf-4c36-974a-0cf96389bb90-combined-ca-bundle\") pod \"glance-db-sync-l8vxk\" (UID: \"06b582a5-a4bf-4c36-974a-0cf96389bb90\") " pod="openstack/glance-db-sync-l8vxk" Feb 02 10:55:08 crc kubenswrapper[4901]: I0202 10:55:08.848368 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/06b582a5-a4bf-4c36-974a-0cf96389bb90-db-sync-config-data\") pod \"glance-db-sync-l8vxk\" (UID: \"06b582a5-a4bf-4c36-974a-0cf96389bb90\") " pod="openstack/glance-db-sync-l8vxk" Feb 02 10:55:08 crc kubenswrapper[4901]: I0202 10:55:08.848427 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06b582a5-a4bf-4c36-974a-0cf96389bb90-config-data\") pod \"glance-db-sync-l8vxk\" (UID: 
\"06b582a5-a4bf-4c36-974a-0cf96389bb90\") " pod="openstack/glance-db-sync-l8vxk" Feb 02 10:55:08 crc kubenswrapper[4901]: I0202 10:55:08.848547 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcz8g\" (UniqueName: \"kubernetes.io/projected/06b582a5-a4bf-4c36-974a-0cf96389bb90-kube-api-access-tcz8g\") pod \"glance-db-sync-l8vxk\" (UID: \"06b582a5-a4bf-4c36-974a-0cf96389bb90\") " pod="openstack/glance-db-sync-l8vxk" Feb 02 10:55:08 crc kubenswrapper[4901]: I0202 10:55:08.854333 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/06b582a5-a4bf-4c36-974a-0cf96389bb90-db-sync-config-data\") pod \"glance-db-sync-l8vxk\" (UID: \"06b582a5-a4bf-4c36-974a-0cf96389bb90\") " pod="openstack/glance-db-sync-l8vxk" Feb 02 10:55:08 crc kubenswrapper[4901]: I0202 10:55:08.854995 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b582a5-a4bf-4c36-974a-0cf96389bb90-combined-ca-bundle\") pod \"glance-db-sync-l8vxk\" (UID: \"06b582a5-a4bf-4c36-974a-0cf96389bb90\") " pod="openstack/glance-db-sync-l8vxk" Feb 02 10:55:08 crc kubenswrapper[4901]: I0202 10:55:08.855939 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06b582a5-a4bf-4c36-974a-0cf96389bb90-config-data\") pod \"glance-db-sync-l8vxk\" (UID: \"06b582a5-a4bf-4c36-974a-0cf96389bb90\") " pod="openstack/glance-db-sync-l8vxk" Feb 02 10:55:08 crc kubenswrapper[4901]: I0202 10:55:08.882878 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcz8g\" (UniqueName: \"kubernetes.io/projected/06b582a5-a4bf-4c36-974a-0cf96389bb90-kube-api-access-tcz8g\") pod \"glance-db-sync-l8vxk\" (UID: \"06b582a5-a4bf-4c36-974a-0cf96389bb90\") " pod="openstack/glance-db-sync-l8vxk" Feb 02 10:55:08 crc kubenswrapper[4901]: I0202 10:55:08.940621 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-l8vxk" Feb 02 10:55:09 crc kubenswrapper[4901]: I0202 10:55:09.117706 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-tmj6n" Feb 02 10:55:09 crc kubenswrapper[4901]: I0202 10:55:09.255503 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/882c42d4-7650-4f0d-8973-ba9ddcbb6800-swiftconf\") pod \"882c42d4-7650-4f0d-8973-ba9ddcbb6800\" (UID: \"882c42d4-7650-4f0d-8973-ba9ddcbb6800\") " Feb 02 10:55:09 crc kubenswrapper[4901]: I0202 10:55:09.255977 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/882c42d4-7650-4f0d-8973-ba9ddcbb6800-ring-data-devices\") pod \"882c42d4-7650-4f0d-8973-ba9ddcbb6800\" (UID: \"882c42d4-7650-4f0d-8973-ba9ddcbb6800\") " Feb 02 10:55:09 crc kubenswrapper[4901]: I0202 10:55:09.256074 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/882c42d4-7650-4f0d-8973-ba9ddcbb6800-combined-ca-bundle\") pod \"882c42d4-7650-4f0d-8973-ba9ddcbb6800\" (UID: \"882c42d4-7650-4f0d-8973-ba9ddcbb6800\") " Feb 02 10:55:09 crc kubenswrapper[4901]: I0202 10:55:09.256110 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/882c42d4-7650-4f0d-8973-ba9ddcbb6800-dispersionconf\") pod \"882c42d4-7650-4f0d-8973-ba9ddcbb6800\" (UID: \"882c42d4-7650-4f0d-8973-ba9ddcbb6800\") " Feb 02 10:55:09 crc kubenswrapper[4901]: I0202 10:55:09.256154 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/882c42d4-7650-4f0d-8973-ba9ddcbb6800-etc-swift\") pod \"882c42d4-7650-4f0d-8973-ba9ddcbb6800\" (UID: \"882c42d4-7650-4f0d-8973-ba9ddcbb6800\") " Feb 02 10:55:09 crc kubenswrapper[4901]: I0202 10:55:09.256296 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpj7f\" (UniqueName: \"kubernetes.io/projected/882c42d4-7650-4f0d-8973-ba9ddcbb6800-kube-api-access-qpj7f\") pod \"882c42d4-7650-4f0d-8973-ba9ddcbb6800\" (UID: \"882c42d4-7650-4f0d-8973-ba9ddcbb6800\") " Feb 02 10:55:09 crc kubenswrapper[4901]: I0202 10:55:09.256326 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/882c42d4-7650-4f0d-8973-ba9ddcbb6800-scripts\") pod \"882c42d4-7650-4f0d-8973-ba9ddcbb6800\" (UID: \"882c42d4-7650-4f0d-8973-ba9ddcbb6800\") " Feb 02 10:55:09 crc kubenswrapper[4901]: I0202 10:55:09.258230 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/882c42d4-7650-4f0d-8973-ba9ddcbb6800-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "882c42d4-7650-4f0d-8973-ba9ddcbb6800" (UID: "882c42d4-7650-4f0d-8973-ba9ddcbb6800"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:55:09 crc kubenswrapper[4901]: I0202 10:55:09.258508 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/882c42d4-7650-4f0d-8973-ba9ddcbb6800-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "882c42d4-7650-4f0d-8973-ba9ddcbb6800" (UID: "882c42d4-7650-4f0d-8973-ba9ddcbb6800"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:55:09 crc kubenswrapper[4901]: I0202 10:55:09.268227 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/882c42d4-7650-4f0d-8973-ba9ddcbb6800-kube-api-access-qpj7f" (OuterVolumeSpecName: "kube-api-access-qpj7f") pod "882c42d4-7650-4f0d-8973-ba9ddcbb6800" (UID: "882c42d4-7650-4f0d-8973-ba9ddcbb6800"). InnerVolumeSpecName "kube-api-access-qpj7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:55:09 crc kubenswrapper[4901]: I0202 10:55:09.270123 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/882c42d4-7650-4f0d-8973-ba9ddcbb6800-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "882c42d4-7650-4f0d-8973-ba9ddcbb6800" (UID: "882c42d4-7650-4f0d-8973-ba9ddcbb6800"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:09 crc kubenswrapper[4901]: I0202 10:55:09.284741 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/882c42d4-7650-4f0d-8973-ba9ddcbb6800-scripts" (OuterVolumeSpecName: "scripts") pod "882c42d4-7650-4f0d-8973-ba9ddcbb6800" (UID: "882c42d4-7650-4f0d-8973-ba9ddcbb6800"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:55:09 crc kubenswrapper[4901]: I0202 10:55:09.289591 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/882c42d4-7650-4f0d-8973-ba9ddcbb6800-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "882c42d4-7650-4f0d-8973-ba9ddcbb6800" (UID: "882c42d4-7650-4f0d-8973-ba9ddcbb6800"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:09 crc kubenswrapper[4901]: I0202 10:55:09.296088 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/882c42d4-7650-4f0d-8973-ba9ddcbb6800-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "882c42d4-7650-4f0d-8973-ba9ddcbb6800" (UID: "882c42d4-7650-4f0d-8973-ba9ddcbb6800"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:09 crc kubenswrapper[4901]: I0202 10:55:09.358185 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/882c42d4-7650-4f0d-8973-ba9ddcbb6800-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:09 crc kubenswrapper[4901]: I0202 10:55:09.358219 4901 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/882c42d4-7650-4f0d-8973-ba9ddcbb6800-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:09 crc kubenswrapper[4901]: I0202 10:55:09.358230 4901 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/882c42d4-7650-4f0d-8973-ba9ddcbb6800-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:09 crc kubenswrapper[4901]: I0202 10:55:09.358239 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpj7f\" (UniqueName: \"kubernetes.io/projected/882c42d4-7650-4f0d-8973-ba9ddcbb6800-kube-api-access-qpj7f\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:09 crc kubenswrapper[4901]: I0202 10:55:09.358249 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/882c42d4-7650-4f0d-8973-ba9ddcbb6800-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:09 crc kubenswrapper[4901]: I0202 10:55:09.358257 4901 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/882c42d4-7650-4f0d-8973-ba9ddcbb6800-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:09 crc kubenswrapper[4901]: I0202 10:55:09.358266 4901 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/882c42d4-7650-4f0d-8973-ba9ddcbb6800-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:09 crc kubenswrapper[4901]: I0202 10:55:09.376938 4901 generic.go:334] "Generic (PLEG): container finished" podID="6105ae6c-7921-41a9-ad59-2b43c6ab77ed" containerID="87a8b12378b6a38b8b5b8a1a70f98ca49c41fda10dbfa2f7ece5f88449ffe61f" exitCode=0 Feb 02 10:55:09 crc kubenswrapper[4901]: I0202 10:55:09.376996 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9c55-account-create-update-x27xj" event={"ID":"6105ae6c-7921-41a9-ad59-2b43c6ab77ed","Type":"ContainerDied","Data":"87a8b12378b6a38b8b5b8a1a70f98ca49c41fda10dbfa2f7ece5f88449ffe61f"} Feb 02 10:55:09 crc kubenswrapper[4901]: I0202 10:55:09.377026 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9c55-account-create-update-x27xj" event={"ID":"6105ae6c-7921-41a9-ad59-2b43c6ab77ed","Type":"ContainerStarted","Data":"9cea704f0d992db052106f63050871af3faf4f0f4b182833a40c12d7a7236057"} Feb 02 10:55:09 crc kubenswrapper[4901]: I0202 10:55:09.379473 4901 generic.go:334] "Generic (PLEG): container finished" podID="5c5064f5-9947-45f6-9f83-97dcdbfbc466" containerID="785dccd879c42a6854d2f1984c9a3eab6a684189e4071cb276e88237ca017987" exitCode=0 Feb 02 10:55:09 crc kubenswrapper[4901]: I0202 10:55:09.379592 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-z99sf" event={"ID":"5c5064f5-9947-45f6-9f83-97dcdbfbc466","Type":"ContainerDied","Data":"785dccd879c42a6854d2f1984c9a3eab6a684189e4071cb276e88237ca017987"} Feb 02 10:55:09 crc kubenswrapper[4901]: I0202 10:55:09.381493 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-tmj6n" 
event={"ID":"882c42d4-7650-4f0d-8973-ba9ddcbb6800","Type":"ContainerDied","Data":"9941ff3d68a017669b7b11286f9f5f429e2159e53acc3472bfbdfc38339a4676"} Feb 02 10:55:09 crc kubenswrapper[4901]: I0202 10:55:09.381530 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9941ff3d68a017669b7b11286f9f5f429e2159e53acc3472bfbdfc38339a4676" Feb 02 10:55:09 crc kubenswrapper[4901]: I0202 10:55:09.381529 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-tmj6n" Feb 02 10:55:09 crc kubenswrapper[4901]: I0202 10:55:09.382852 4901 generic.go:334] "Generic (PLEG): container finished" podID="2dcbca9e-ef0b-443e-9594-95dfee5ab743" containerID="d630bd8f46cda54a28d67df0152b438f7b7d25ea08f5907505935c6fc4db98cd" exitCode=0 Feb 02 10:55:09 crc kubenswrapper[4901]: I0202 10:55:09.382917 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-zvnzt" event={"ID":"2dcbca9e-ef0b-443e-9594-95dfee5ab743","Type":"ContainerDied","Data":"d630bd8f46cda54a28d67df0152b438f7b7d25ea08f5907505935c6fc4db98cd"} Feb 02 10:55:09 crc kubenswrapper[4901]: I0202 10:55:09.384639 4901 generic.go:334] "Generic (PLEG): container finished" podID="d896dd47-8297-41aa-bd3e-9a42a936b474" containerID="aaf7bc852cfecd9b709247a307a638e69d2d1bcd79f1c39979e4323aed10be70" exitCode=0 Feb 02 10:55:09 crc kubenswrapper[4901]: I0202 10:55:09.384669 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b701-account-create-update-4l6hg" event={"ID":"d896dd47-8297-41aa-bd3e-9a42a936b474","Type":"ContainerDied","Data":"aaf7bc852cfecd9b709247a307a638e69d2d1bcd79f1c39979e4323aed10be70"} Feb 02 10:55:09 crc kubenswrapper[4901]: I0202 10:55:09.384707 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b701-account-create-update-4l6hg" event={"ID":"d896dd47-8297-41aa-bd3e-9a42a936b474","Type":"ContainerStarted","Data":"d451fe5fc2044d6c78d7af6074d1e2b1bd963511054b82519a7118d7b0b31ab5"} Feb 02 10:55:09 crc kubenswrapper[4901]: I0202 10:55:09.511359 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-l8vxk"] Feb 02 10:55:09 crc kubenswrapper[4901]: W0202 10:55:09.525499 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06b582a5_a4bf_4c36_974a_0cf96389bb90.slice/crio-cb4b8c7a68f52bb2b7baee55c2ff1c39f01492e5a1becb5422dbcfc28f17e668 WatchSource:0}: Error finding container cb4b8c7a68f52bb2b7baee55c2ff1c39f01492e5a1becb5422dbcfc28f17e668: Status 404 returned error can't find the container with id cb4b8c7a68f52bb2b7baee55c2ff1c39f01492e5a1becb5422dbcfc28f17e668 Feb 02 10:55:10 crc kubenswrapper[4901]: I0202 10:55:10.394009 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-l8vxk" event={"ID":"06b582a5-a4bf-4c36-974a-0cf96389bb90","Type":"ContainerStarted","Data":"cb4b8c7a68f52bb2b7baee55c2ff1c39f01492e5a1becb5422dbcfc28f17e668"} Feb 02 10:55:10 crc kubenswrapper[4901]: I0202 10:55:10.395352 4901 generic.go:334] "Generic (PLEG): container finished" podID="942c6932-383e-432a-b927-ff9ec4ac81cb" containerID="db41cd112888fcb37424d3a85cf77cf5ae370c5f0dce98ed34bcb98fcdcecad2" exitCode=0 Feb 02 10:55:10 crc kubenswrapper[4901]: I0202 10:55:10.395505 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"942c6932-383e-432a-b927-ff9ec4ac81cb","Type":"ContainerDied","Data":"db41cd112888fcb37424d3a85cf77cf5ae370c5f0dce98ed34bcb98fcdcecad2"} Feb 02 10:55:10 crc kubenswrapper[4901]: I0202 10:55:10.930657 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9c55-account-create-update-x27xj" Feb 02 10:55:10 crc kubenswrapper[4901]: I0202 10:55:10.987807 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6105ae6c-7921-41a9-ad59-2b43c6ab77ed-operator-scripts\") pod \"6105ae6c-7921-41a9-ad59-2b43c6ab77ed\" (UID: \"6105ae6c-7921-41a9-ad59-2b43c6ab77ed\") " Feb 02 10:55:10 crc kubenswrapper[4901]: I0202 10:55:10.988081 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnq7h\" (UniqueName: \"kubernetes.io/projected/6105ae6c-7921-41a9-ad59-2b43c6ab77ed-kube-api-access-rnq7h\") pod \"6105ae6c-7921-41a9-ad59-2b43c6ab77ed\" (UID: \"6105ae6c-7921-41a9-ad59-2b43c6ab77ed\") " Feb 02 10:55:10 crc kubenswrapper[4901]: I0202 10:55:10.989135 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6105ae6c-7921-41a9-ad59-2b43c6ab77ed-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6105ae6c-7921-41a9-ad59-2b43c6ab77ed" (UID: "6105ae6c-7921-41a9-ad59-2b43c6ab77ed"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:55:10 crc kubenswrapper[4901]: I0202 10:55:10.998984 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6105ae6c-7921-41a9-ad59-2b43c6ab77ed-kube-api-access-rnq7h" (OuterVolumeSpecName: "kube-api-access-rnq7h") pod "6105ae6c-7921-41a9-ad59-2b43c6ab77ed" (UID: "6105ae6c-7921-41a9-ad59-2b43c6ab77ed"). InnerVolumeSpecName "kube-api-access-rnq7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:55:11 crc kubenswrapper[4901]: I0202 10:55:11.045193 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-z99sf" Feb 02 10:55:11 crc kubenswrapper[4901]: I0202 10:55:11.057236 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-zvnzt" Feb 02 10:55:11 crc kubenswrapper[4901]: I0202 10:55:11.061524 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-b701-account-create-update-4l6hg" Feb 02 10:55:11 crc kubenswrapper[4901]: I0202 10:55:11.090116 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g82r\" (UniqueName: \"kubernetes.io/projected/5c5064f5-9947-45f6-9f83-97dcdbfbc466-kube-api-access-4g82r\") pod \"5c5064f5-9947-45f6-9f83-97dcdbfbc466\" (UID: \"5c5064f5-9947-45f6-9f83-97dcdbfbc466\") " Feb 02 10:55:11 crc kubenswrapper[4901]: I0202 10:55:11.090239 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c5064f5-9947-45f6-9f83-97dcdbfbc466-operator-scripts\") pod \"5c5064f5-9947-45f6-9f83-97dcdbfbc466\" (UID: \"5c5064f5-9947-45f6-9f83-97dcdbfbc466\") " Feb 02 10:55:11 crc kubenswrapper[4901]: I0202 10:55:11.090634 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnq7h\" (UniqueName: \"kubernetes.io/projected/6105ae6c-7921-41a9-ad59-2b43c6ab77ed-kube-api-access-rnq7h\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:11 crc kubenswrapper[4901]: I0202 10:55:11.090653 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6105ae6c-7921-41a9-ad59-2b43c6ab77ed-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:11 crc kubenswrapper[4901]: I0202 10:55:11.095583 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c5064f5-9947-45f6-9f83-97dcdbfbc466-kube-api-access-4g82r" (OuterVolumeSpecName: "kube-api-access-4g82r") pod "5c5064f5-9947-45f6-9f83-97dcdbfbc466" (UID: "5c5064f5-9947-45f6-9f83-97dcdbfbc466"). InnerVolumeSpecName "kube-api-access-4g82r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:55:11 crc kubenswrapper[4901]: I0202 10:55:11.096589 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c5064f5-9947-45f6-9f83-97dcdbfbc466-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5c5064f5-9947-45f6-9f83-97dcdbfbc466" (UID: "5c5064f5-9947-45f6-9f83-97dcdbfbc466"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:55:11 crc kubenswrapper[4901]: I0202 10:55:11.191731 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsmhr\" (UniqueName: \"kubernetes.io/projected/2dcbca9e-ef0b-443e-9594-95dfee5ab743-kube-api-access-xsmhr\") pod \"2dcbca9e-ef0b-443e-9594-95dfee5ab743\" (UID: \"2dcbca9e-ef0b-443e-9594-95dfee5ab743\") " Feb 02 10:55:11 crc kubenswrapper[4901]: I0202 10:55:11.191869 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9l8nd\" (UniqueName: \"kubernetes.io/projected/d896dd47-8297-41aa-bd3e-9a42a936b474-kube-api-access-9l8nd\") pod \"d896dd47-8297-41aa-bd3e-9a42a936b474\" (UID: \"d896dd47-8297-41aa-bd3e-9a42a936b474\") " Feb 02 10:55:11 crc kubenswrapper[4901]: I0202 10:55:11.192470 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d896dd47-8297-41aa-bd3e-9a42a936b474-operator-scripts\") pod \"d896dd47-8297-41aa-bd3e-9a42a936b474\" (UID: \"d896dd47-8297-41aa-bd3e-9a42a936b474\") " Feb 02 10:55:11 crc kubenswrapper[4901]: I0202 10:55:11.192502 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dcbca9e-ef0b-443e-9594-95dfee5ab743-operator-scripts\") pod \"2dcbca9e-ef0b-443e-9594-95dfee5ab743\" (UID: \"2dcbca9e-ef0b-443e-9594-95dfee5ab743\") " Feb 02 10:55:11 crc kubenswrapper[4901]: I0202 10:55:11.192944 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d896dd47-8297-41aa-bd3e-9a42a936b474-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d896dd47-8297-41aa-bd3e-9a42a936b474" (UID: "d896dd47-8297-41aa-bd3e-9a42a936b474"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:55:11 crc kubenswrapper[4901]: I0202 10:55:11.193069 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dcbca9e-ef0b-443e-9594-95dfee5ab743-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2dcbca9e-ef0b-443e-9594-95dfee5ab743" (UID: "2dcbca9e-ef0b-443e-9594-95dfee5ab743"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:55:11 crc kubenswrapper[4901]: I0202 10:55:11.193394 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c5064f5-9947-45f6-9f83-97dcdbfbc466-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:11 crc kubenswrapper[4901]: I0202 10:55:11.193413 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d896dd47-8297-41aa-bd3e-9a42a936b474-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:11 crc kubenswrapper[4901]: I0202 10:55:11.193421 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dcbca9e-ef0b-443e-9594-95dfee5ab743-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:11 crc kubenswrapper[4901]: I0202 10:55:11.193430 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4g82r\" (UniqueName: \"kubernetes.io/projected/5c5064f5-9947-45f6-9f83-97dcdbfbc466-kube-api-access-4g82r\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:11 crc kubenswrapper[4901]: I0202 10:55:11.195313 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d896dd47-8297-41aa-bd3e-9a42a936b474-kube-api-access-9l8nd" (OuterVolumeSpecName: "kube-api-access-9l8nd") pod "d896dd47-8297-41aa-bd3e-9a42a936b474" (UID: "d896dd47-8297-41aa-bd3e-9a42a936b474"). InnerVolumeSpecName "kube-api-access-9l8nd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:55:11 crc kubenswrapper[4901]: I0202 10:55:11.195925 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dcbca9e-ef0b-443e-9594-95dfee5ab743-kube-api-access-xsmhr" (OuterVolumeSpecName: "kube-api-access-xsmhr") pod "2dcbca9e-ef0b-443e-9594-95dfee5ab743" (UID: "2dcbca9e-ef0b-443e-9594-95dfee5ab743"). InnerVolumeSpecName "kube-api-access-xsmhr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:55:11 crc kubenswrapper[4901]: I0202 10:55:11.295377 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsmhr\" (UniqueName: \"kubernetes.io/projected/2dcbca9e-ef0b-443e-9594-95dfee5ab743-kube-api-access-xsmhr\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:11 crc kubenswrapper[4901]: I0202 10:55:11.295421 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9l8nd\" (UniqueName: \"kubernetes.io/projected/d896dd47-8297-41aa-bd3e-9a42a936b474-kube-api-access-9l8nd\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:11 crc kubenswrapper[4901]: I0202 10:55:11.306433 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-q5qqw"] Feb 02 10:55:11 crc kubenswrapper[4901]: I0202 10:55:11.312865 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-q5qqw"] Feb 02 10:55:11 crc kubenswrapper[4901]: I0202 10:55:11.406870 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"942c6932-383e-432a-b927-ff9ec4ac81cb","Type":"ContainerStarted","Data":"dd84a1510ee1d58f8154205305c524591576ab844656c2e03acf69b375038a4f"} Feb 02 10:55:11 crc kubenswrapper[4901]: I0202 10:55:11.407204 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:55:11 crc kubenswrapper[4901]: I0202 10:55:11.420993 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9c55-account-create-update-x27xj" Feb 02 10:55:11 crc kubenswrapper[4901]: I0202 10:55:11.422993 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9c55-account-create-update-x27xj" event={"ID":"6105ae6c-7921-41a9-ad59-2b43c6ab77ed","Type":"ContainerDied","Data":"9cea704f0d992db052106f63050871af3faf4f0f4b182833a40c12d7a7236057"} Feb 02 10:55:11 crc kubenswrapper[4901]: I0202 10:55:11.423090 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cea704f0d992db052106f63050871af3faf4f0f4b182833a40c12d7a7236057" Feb 02 10:55:11 crc kubenswrapper[4901]: I0202 10:55:11.430188 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=-9223371968.42463 podStartE2EDuration="1m8.430146889s" podCreationTimestamp="2026-02-02 10:54:03 +0000 UTC" firstStartedPulling="2026-02-02 10:54:05.782149572 +0000 UTC m=+932.800489668" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:55:11.426063087 +0000 UTC m=+998.444403183" watchObservedRunningTime="2026-02-02 10:55:11.430146889 +0000 UTC m=+998.448486985" Feb 02 10:55:11 crc kubenswrapper[4901]: I0202 10:55:11.433691 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-z99sf" Feb 02 10:55:11 crc kubenswrapper[4901]: I0202 10:55:11.433731 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-z99sf" event={"ID":"5c5064f5-9947-45f6-9f83-97dcdbfbc466","Type":"ContainerDied","Data":"8c90889ae3c409a73523c5335104a1f8299a942b1a0b1a62bf9157603295cc4e"} Feb 02 10:55:11 crc kubenswrapper[4901]: I0202 10:55:11.433777 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c90889ae3c409a73523c5335104a1f8299a942b1a0b1a62bf9157603295cc4e" Feb 02 10:55:11 crc kubenswrapper[4901]: I0202 10:55:11.437742 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-zvnzt" event={"ID":"2dcbca9e-ef0b-443e-9594-95dfee5ab743","Type":"ContainerDied","Data":"3965732b73b7110ba97d312d6f1ee02b402fc0f75e7c57fbac79f6bc72a2e555"} Feb 02 10:55:11 crc kubenswrapper[4901]: I0202 10:55:11.437793 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3965732b73b7110ba97d312d6f1ee02b402fc0f75e7c57fbac79f6bc72a2e555" Feb 02 10:55:11 crc kubenswrapper[4901]: I0202 10:55:11.437787 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-zvnzt" Feb 02 10:55:11 crc kubenswrapper[4901]: I0202 10:55:11.448326 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b701-account-create-update-4l6hg" event={"ID":"d896dd47-8297-41aa-bd3e-9a42a936b474","Type":"ContainerDied","Data":"d451fe5fc2044d6c78d7af6074d1e2b1bd963511054b82519a7118d7b0b31ab5"} Feb 02 10:55:11 crc kubenswrapper[4901]: I0202 10:55:11.448379 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d451fe5fc2044d6c78d7af6074d1e2b1bd963511054b82519a7118d7b0b31ab5" Feb 02 10:55:11 crc kubenswrapper[4901]: I0202 10:55:11.448441 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-b701-account-create-update-4l6hg" Feb 02 10:55:11 crc kubenswrapper[4901]: I0202 10:55:11.685320 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cc410cd-32c8-45d4-8d51-a82004573724" path="/var/lib/kubelet/pods/4cc410cd-32c8-45d4-8d51-a82004573724/volumes" Feb 02 10:55:13 crc kubenswrapper[4901]: I0202 10:55:13.335555 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-rtv7m" podUID="bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7" containerName="ovn-controller" probeResult="failure" output=< Feb 02 10:55:13 crc kubenswrapper[4901]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 02 10:55:13 crc kubenswrapper[4901]: > Feb 02 10:55:13 crc kubenswrapper[4901]: I0202 10:55:13.518768 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-ckvwh" Feb 02 10:55:13 crc kubenswrapper[4901]: I0202 10:55:13.519994 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-ckvwh" Feb 02 10:55:13 crc kubenswrapper[4901]: I0202 10:55:13.898079 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-rtv7m-config-v9tqh"] Feb 02 10:55:13 crc kubenswrapper[4901]: E0202 10:55:13.898659 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c5064f5-9947-45f6-9f83-97dcdbfbc466" containerName="mariadb-database-create" Feb 02 10:55:13 crc kubenswrapper[4901]: I0202 10:55:13.898674 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c5064f5-9947-45f6-9f83-97dcdbfbc466" containerName="mariadb-database-create" Feb 02 10:55:13 crc kubenswrapper[4901]: E0202 10:55:13.898702 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6105ae6c-7921-41a9-ad59-2b43c6ab77ed" containerName="mariadb-account-create-update" Feb 02 10:55:13 crc kubenswrapper[4901]: I0202 10:55:13.898713 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="6105ae6c-7921-41a9-ad59-2b43c6ab77ed" containerName="mariadb-account-create-update" Feb 02 10:55:13 crc kubenswrapper[4901]: E0202 10:55:13.898736 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dcbca9e-ef0b-443e-9594-95dfee5ab743" containerName="mariadb-database-create" Feb 02 10:55:13 crc kubenswrapper[4901]: I0202 10:55:13.898744 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dcbca9e-ef0b-443e-9594-95dfee5ab743" containerName="mariadb-database-create" Feb 02 10:55:13 crc kubenswrapper[4901]: E0202 10:55:13.898755 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d896dd47-8297-41aa-bd3e-9a42a936b474" containerName="mariadb-account-create-update" Feb 02 10:55:13 crc kubenswrapper[4901]: I0202 10:55:13.898762 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="d896dd47-8297-41aa-bd3e-9a42a936b474" containerName="mariadb-account-create-update" Feb 02 10:55:13 crc kubenswrapper[4901]: E0202 10:55:13.898780 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="882c42d4-7650-4f0d-8973-ba9ddcbb6800" containerName="swift-ring-rebalance" Feb 02 10:55:13 crc kubenswrapper[4901]: I0202 10:55:13.898789 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="882c42d4-7650-4f0d-8973-ba9ddcbb6800" containerName="swift-ring-rebalance" Feb 02 10:55:13 crc kubenswrapper[4901]: I0202 10:55:13.898954 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="6105ae6c-7921-41a9-ad59-2b43c6ab77ed" 
containerName="mariadb-account-create-update" Feb 02 10:55:13 crc kubenswrapper[4901]: I0202 10:55:13.898965 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c5064f5-9947-45f6-9f83-97dcdbfbc466" containerName="mariadb-database-create" Feb 02 10:55:13 crc kubenswrapper[4901]: I0202 10:55:13.898973 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dcbca9e-ef0b-443e-9594-95dfee5ab743" containerName="mariadb-database-create" Feb 02 10:55:13 crc kubenswrapper[4901]: I0202 10:55:13.898983 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="d896dd47-8297-41aa-bd3e-9a42a936b474" containerName="mariadb-account-create-update" Feb 02 10:55:13 crc kubenswrapper[4901]: I0202 10:55:13.898991 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="882c42d4-7650-4f0d-8973-ba9ddcbb6800" containerName="swift-ring-rebalance" Feb 02 10:55:13 crc kubenswrapper[4901]: I0202 10:55:13.899605 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rtv7m-config-v9tqh" Feb 02 10:55:13 crc kubenswrapper[4901]: I0202 10:55:13.906027 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 02 10:55:13 crc kubenswrapper[4901]: I0202 10:55:13.924900 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rtv7m-config-v9tqh"] Feb 02 10:55:13 crc kubenswrapper[4901]: I0202 10:55:13.952271 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/77f82838-0340-4b40-9dd7-6d9a4ffaeaf0-additional-scripts\") pod \"ovn-controller-rtv7m-config-v9tqh\" (UID: \"77f82838-0340-4b40-9dd7-6d9a4ffaeaf0\") " pod="openstack/ovn-controller-rtv7m-config-v9tqh" Feb 02 10:55:13 crc kubenswrapper[4901]: I0202 10:55:13.952341 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/77f82838-0340-4b40-9dd7-6d9a4ffaeaf0-var-log-ovn\") pod \"ovn-controller-rtv7m-config-v9tqh\" (UID: \"77f82838-0340-4b40-9dd7-6d9a4ffaeaf0\") " pod="openstack/ovn-controller-rtv7m-config-v9tqh" Feb 02 10:55:13 crc kubenswrapper[4901]: I0202 10:55:13.952624 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/77f82838-0340-4b40-9dd7-6d9a4ffaeaf0-scripts\") pod \"ovn-controller-rtv7m-config-v9tqh\" (UID: \"77f82838-0340-4b40-9dd7-6d9a4ffaeaf0\") " pod="openstack/ovn-controller-rtv7m-config-v9tqh" Feb 02 10:55:13 crc kubenswrapper[4901]: I0202 10:55:13.952854 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbtd9\" (UniqueName: \"kubernetes.io/projected/77f82838-0340-4b40-9dd7-6d9a4ffaeaf0-kube-api-access-mbtd9\") pod \"ovn-controller-rtv7m-config-v9tqh\" (UID: \"77f82838-0340-4b40-9dd7-6d9a4ffaeaf0\") " pod="openstack/ovn-controller-rtv7m-config-v9tqh" Feb 02 10:55:13 crc kubenswrapper[4901]: I0202 10:55:13.952954 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/77f82838-0340-4b40-9dd7-6d9a4ffaeaf0-var-run-ovn\") pod \"ovn-controller-rtv7m-config-v9tqh\" (UID: \"77f82838-0340-4b40-9dd7-6d9a4ffaeaf0\") " pod="openstack/ovn-controller-rtv7m-config-v9tqh" Feb 02 10:55:13 crc kubenswrapper[4901]: 
I0202 10:55:13.953089 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/77f82838-0340-4b40-9dd7-6d9a4ffaeaf0-var-run\") pod \"ovn-controller-rtv7m-config-v9tqh\" (UID: \"77f82838-0340-4b40-9dd7-6d9a4ffaeaf0\") " pod="openstack/ovn-controller-rtv7m-config-v9tqh" Feb 02 10:55:14 crc kubenswrapper[4901]: I0202 10:55:14.055500 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/77f82838-0340-4b40-9dd7-6d9a4ffaeaf0-var-log-ovn\") pod \"ovn-controller-rtv7m-config-v9tqh\" (UID: \"77f82838-0340-4b40-9dd7-6d9a4ffaeaf0\") " pod="openstack/ovn-controller-rtv7m-config-v9tqh" Feb 02 10:55:14 crc kubenswrapper[4901]: I0202 10:55:14.055582 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/77f82838-0340-4b40-9dd7-6d9a4ffaeaf0-scripts\") pod \"ovn-controller-rtv7m-config-v9tqh\" (UID: \"77f82838-0340-4b40-9dd7-6d9a4ffaeaf0\") " pod="openstack/ovn-controller-rtv7m-config-v9tqh" Feb 02 10:55:14 crc kubenswrapper[4901]: I0202 10:55:14.055651 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbtd9\" (UniqueName: \"kubernetes.io/projected/77f82838-0340-4b40-9dd7-6d9a4ffaeaf0-kube-api-access-mbtd9\") pod \"ovn-controller-rtv7m-config-v9tqh\" (UID: \"77f82838-0340-4b40-9dd7-6d9a4ffaeaf0\") " pod="openstack/ovn-controller-rtv7m-config-v9tqh" Feb 02 10:55:14 crc kubenswrapper[4901]: I0202 10:55:14.055714 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/77f82838-0340-4b40-9dd7-6d9a4ffaeaf0-var-run-ovn\") pod \"ovn-controller-rtv7m-config-v9tqh\" (UID: \"77f82838-0340-4b40-9dd7-6d9a4ffaeaf0\") " pod="openstack/ovn-controller-rtv7m-config-v9tqh" Feb 02 10:55:14 crc kubenswrapper[4901]: I0202 10:55:14.055759 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/77f82838-0340-4b40-9dd7-6d9a4ffaeaf0-var-run\") pod \"ovn-controller-rtv7m-config-v9tqh\" (UID: \"77f82838-0340-4b40-9dd7-6d9a4ffaeaf0\") " pod="openstack/ovn-controller-rtv7m-config-v9tqh" Feb 02 10:55:14 crc kubenswrapper[4901]: I0202 10:55:14.055815 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/77f82838-0340-4b40-9dd7-6d9a4ffaeaf0-additional-scripts\") pod \"ovn-controller-rtv7m-config-v9tqh\" (UID: \"77f82838-0340-4b40-9dd7-6d9a4ffaeaf0\") " pod="openstack/ovn-controller-rtv7m-config-v9tqh" Feb 02 10:55:14 crc kubenswrapper[4901]: I0202 10:55:14.056030 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/77f82838-0340-4b40-9dd7-6d9a4ffaeaf0-var-run-ovn\") pod \"ovn-controller-rtv7m-config-v9tqh\" (UID: \"77f82838-0340-4b40-9dd7-6d9a4ffaeaf0\") " pod="openstack/ovn-controller-rtv7m-config-v9tqh" Feb 02 10:55:14 crc kubenswrapper[4901]: I0202 10:55:14.056070 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/77f82838-0340-4b40-9dd7-6d9a4ffaeaf0-var-run\") pod \"ovn-controller-rtv7m-config-v9tqh\" (UID: \"77f82838-0340-4b40-9dd7-6d9a4ffaeaf0\") " pod="openstack/ovn-controller-rtv7m-config-v9tqh" Feb 02 10:55:14 crc kubenswrapper[4901]: I0202 
10:55:14.056142 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/77f82838-0340-4b40-9dd7-6d9a4ffaeaf0-var-log-ovn\") pod \"ovn-controller-rtv7m-config-v9tqh\" (UID: \"77f82838-0340-4b40-9dd7-6d9a4ffaeaf0\") " pod="openstack/ovn-controller-rtv7m-config-v9tqh" Feb 02 10:55:14 crc kubenswrapper[4901]: I0202 10:55:14.056737 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/77f82838-0340-4b40-9dd7-6d9a4ffaeaf0-additional-scripts\") pod \"ovn-controller-rtv7m-config-v9tqh\" (UID: \"77f82838-0340-4b40-9dd7-6d9a4ffaeaf0\") " pod="openstack/ovn-controller-rtv7m-config-v9tqh" Feb 02 10:55:14 crc kubenswrapper[4901]: I0202 10:55:14.058235 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/77f82838-0340-4b40-9dd7-6d9a4ffaeaf0-scripts\") pod \"ovn-controller-rtv7m-config-v9tqh\" (UID: \"77f82838-0340-4b40-9dd7-6d9a4ffaeaf0\") " pod="openstack/ovn-controller-rtv7m-config-v9tqh" Feb 02 10:55:14 crc kubenswrapper[4901]: I0202 10:55:14.080219 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbtd9\" (UniqueName: \"kubernetes.io/projected/77f82838-0340-4b40-9dd7-6d9a4ffaeaf0-kube-api-access-mbtd9\") pod \"ovn-controller-rtv7m-config-v9tqh\" (UID: \"77f82838-0340-4b40-9dd7-6d9a4ffaeaf0\") " pod="openstack/ovn-controller-rtv7m-config-v9tqh" Feb 02 10:55:14 crc kubenswrapper[4901]: I0202 10:55:14.223062 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rtv7m-config-v9tqh" Feb 02 10:55:14 crc kubenswrapper[4901]: I0202 10:55:14.809956 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rtv7m-config-v9tqh"] Feb 02 10:55:15 crc kubenswrapper[4901]: I0202 10:55:15.498323 4901 generic.go:334] "Generic (PLEG): container finished" podID="77f82838-0340-4b40-9dd7-6d9a4ffaeaf0" containerID="138826e28de0fe50677b0c7ee04b318274f4a421844ac049bd374b80c829a1ae" exitCode=0 Feb 02 10:55:15 crc kubenswrapper[4901]: I0202 10:55:15.498973 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rtv7m-config-v9tqh" event={"ID":"77f82838-0340-4b40-9dd7-6d9a4ffaeaf0","Type":"ContainerDied","Data":"138826e28de0fe50677b0c7ee04b318274f4a421844ac049bd374b80c829a1ae"} Feb 02 10:55:15 crc kubenswrapper[4901]: I0202 10:55:15.499007 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rtv7m-config-v9tqh" event={"ID":"77f82838-0340-4b40-9dd7-6d9a4ffaeaf0","Type":"ContainerStarted","Data":"1924e7d00281e4f448b813287d3ef606c73f46c1687b03fdace2b9b600be30f5"} Feb 02 10:55:16 crc kubenswrapper[4901]: I0202 10:55:16.329678 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-lrthz"] Feb 02 10:55:16 crc kubenswrapper[4901]: I0202 10:55:16.330935 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-lrthz" Feb 02 10:55:16 crc kubenswrapper[4901]: I0202 10:55:16.339823 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 02 10:55:16 crc kubenswrapper[4901]: I0202 10:55:16.348334 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-lrthz"] Feb 02 10:55:16 crc kubenswrapper[4901]: I0202 10:55:16.406349 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw6vl\" (UniqueName: \"kubernetes.io/projected/8b9aee1d-518e-46d9-8d21-54ea5018453d-kube-api-access-mw6vl\") pod \"root-account-create-update-lrthz\" (UID: \"8b9aee1d-518e-46d9-8d21-54ea5018453d\") " pod="openstack/root-account-create-update-lrthz" Feb 02 10:55:16 crc kubenswrapper[4901]: I0202 10:55:16.406531 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b9aee1d-518e-46d9-8d21-54ea5018453d-operator-scripts\") pod \"root-account-create-update-lrthz\" (UID: \"8b9aee1d-518e-46d9-8d21-54ea5018453d\") " pod="openstack/root-account-create-update-lrthz" Feb 02 10:55:16 crc kubenswrapper[4901]: I0202 10:55:16.507536 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b9aee1d-518e-46d9-8d21-54ea5018453d-operator-scripts\") pod \"root-account-create-update-lrthz\" (UID: \"8b9aee1d-518e-46d9-8d21-54ea5018453d\") " pod="openstack/root-account-create-update-lrthz" Feb 02 10:55:16 crc kubenswrapper[4901]: I0202 10:55:16.507701 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw6vl\" (UniqueName: \"kubernetes.io/projected/8b9aee1d-518e-46d9-8d21-54ea5018453d-kube-api-access-mw6vl\") pod \"root-account-create-update-lrthz\" (UID: \"8b9aee1d-518e-46d9-8d21-54ea5018453d\") " pod="openstack/root-account-create-update-lrthz" Feb 02 10:55:16 crc kubenswrapper[4901]: I0202 10:55:16.508451 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b9aee1d-518e-46d9-8d21-54ea5018453d-operator-scripts\") pod \"root-account-create-update-lrthz\" (UID: \"8b9aee1d-518e-46d9-8d21-54ea5018453d\") " pod="openstack/root-account-create-update-lrthz" Feb 02 10:55:16 crc kubenswrapper[4901]: I0202 10:55:16.531607 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw6vl\" (UniqueName: \"kubernetes.io/projected/8b9aee1d-518e-46d9-8d21-54ea5018453d-kube-api-access-mw6vl\") pod \"root-account-create-update-lrthz\" (UID: \"8b9aee1d-518e-46d9-8d21-54ea5018453d\") " pod="openstack/root-account-create-update-lrthz" Feb 02 10:55:16 crc kubenswrapper[4901]: I0202 10:55:16.657416 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-lrthz"
Feb 02 10:55:18 crc kubenswrapper[4901]: I0202 10:55:18.324766 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-rtv7m"
Feb 02 10:55:22 crc kubenswrapper[4901]: I0202 10:55:22.760588 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4b4d5a91-d330-499c-9123-35b58d8c55d5-etc-swift\") pod \"swift-storage-0\" (UID: \"4b4d5a91-d330-499c-9123-35b58d8c55d5\") " pod="openstack/swift-storage-0"
Feb 02 10:55:22 crc kubenswrapper[4901]: I0202 10:55:22.767178 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4b4d5a91-d330-499c-9123-35b58d8c55d5-etc-swift\") pod \"swift-storage-0\" (UID: \"4b4d5a91-d330-499c-9123-35b58d8c55d5\") " pod="openstack/swift-storage-0"
Feb 02 10:55:22 crc kubenswrapper[4901]: I0202 10:55:22.904896 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Feb 02 10:55:23 crc kubenswrapper[4901]: I0202 10:55:23.595925 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rtv7m-config-v9tqh" event={"ID":"77f82838-0340-4b40-9dd7-6d9a4ffaeaf0","Type":"ContainerDied","Data":"1924e7d00281e4f448b813287d3ef606c73f46c1687b03fdace2b9b600be30f5"}
Feb 02 10:55:23 crc kubenswrapper[4901]: I0202 10:55:23.595982 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1924e7d00281e4f448b813287d3ef606c73f46c1687b03fdace2b9b600be30f5"
Feb 02 10:55:23 crc kubenswrapper[4901]: I0202 10:55:23.601025 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rtv7m-config-v9tqh"
Feb 02 10:55:23 crc kubenswrapper[4901]: I0202 10:55:23.673533 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbtd9\" (UniqueName: \"kubernetes.io/projected/77f82838-0340-4b40-9dd7-6d9a4ffaeaf0-kube-api-access-mbtd9\") pod \"77f82838-0340-4b40-9dd7-6d9a4ffaeaf0\" (UID: \"77f82838-0340-4b40-9dd7-6d9a4ffaeaf0\") "
Feb 02 10:55:23 crc kubenswrapper[4901]: I0202 10:55:23.674102 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/77f82838-0340-4b40-9dd7-6d9a4ffaeaf0-var-run\") pod \"77f82838-0340-4b40-9dd7-6d9a4ffaeaf0\" (UID: \"77f82838-0340-4b40-9dd7-6d9a4ffaeaf0\") "
Feb 02 10:55:23 crc kubenswrapper[4901]: I0202 10:55:23.674144 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/77f82838-0340-4b40-9dd7-6d9a4ffaeaf0-scripts\") pod \"77f82838-0340-4b40-9dd7-6d9a4ffaeaf0\" (UID: \"77f82838-0340-4b40-9dd7-6d9a4ffaeaf0\") "
Feb 02 10:55:23 crc kubenswrapper[4901]: I0202 10:55:23.674258 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/77f82838-0340-4b40-9dd7-6d9a4ffaeaf0-additional-scripts\") pod \"77f82838-0340-4b40-9dd7-6d9a4ffaeaf0\" (UID: \"77f82838-0340-4b40-9dd7-6d9a4ffaeaf0\") "
Feb 02 10:55:23 crc kubenswrapper[4901]: I0202 10:55:23.674322 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/77f82838-0340-4b40-9dd7-6d9a4ffaeaf0-var-log-ovn\") pod \"77f82838-0340-4b40-9dd7-6d9a4ffaeaf0\" (UID: \"77f82838-0340-4b40-9dd7-6d9a4ffaeaf0\") "
Feb 02 10:55:23 crc kubenswrapper[4901]: I0202 10:55:23.674376 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/77f82838-0340-4b40-9dd7-6d9a4ffaeaf0-var-run-ovn\") pod \"77f82838-0340-4b40-9dd7-6d9a4ffaeaf0\" (UID: \"77f82838-0340-4b40-9dd7-6d9a4ffaeaf0\") "
Feb 02 10:55:23 crc kubenswrapper[4901]: I0202 10:55:23.674982 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/77f82838-0340-4b40-9dd7-6d9a4ffaeaf0-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "77f82838-0340-4b40-9dd7-6d9a4ffaeaf0" (UID: "77f82838-0340-4b40-9dd7-6d9a4ffaeaf0"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 02 10:55:23 crc kubenswrapper[4901]: I0202 10:55:23.675030 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/77f82838-0340-4b40-9dd7-6d9a4ffaeaf0-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "77f82838-0340-4b40-9dd7-6d9a4ffaeaf0" (UID: "77f82838-0340-4b40-9dd7-6d9a4ffaeaf0"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 02 10:55:23 crc kubenswrapper[4901]: I0202 10:55:23.675059 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/77f82838-0340-4b40-9dd7-6d9a4ffaeaf0-var-run" (OuterVolumeSpecName: "var-run") pod "77f82838-0340-4b40-9dd7-6d9a4ffaeaf0" (UID: "77f82838-0340-4b40-9dd7-6d9a4ffaeaf0"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 02 10:55:23 crc kubenswrapper[4901]: I0202 10:55:23.675759 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77f82838-0340-4b40-9dd7-6d9a4ffaeaf0-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "77f82838-0340-4b40-9dd7-6d9a4ffaeaf0" (UID: "77f82838-0340-4b40-9dd7-6d9a4ffaeaf0"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:55:23 crc kubenswrapper[4901]: I0202 10:55:23.676502 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77f82838-0340-4b40-9dd7-6d9a4ffaeaf0-scripts" (OuterVolumeSpecName: "scripts") pod "77f82838-0340-4b40-9dd7-6d9a4ffaeaf0" (UID: "77f82838-0340-4b40-9dd7-6d9a4ffaeaf0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:55:23 crc kubenswrapper[4901]: I0202 10:55:23.680954 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77f82838-0340-4b40-9dd7-6d9a4ffaeaf0-kube-api-access-mbtd9" (OuterVolumeSpecName: "kube-api-access-mbtd9") pod "77f82838-0340-4b40-9dd7-6d9a4ffaeaf0" (UID: "77f82838-0340-4b40-9dd7-6d9a4ffaeaf0"). InnerVolumeSpecName "kube-api-access-mbtd9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:55:23 crc kubenswrapper[4901]: I0202 10:55:23.776545 4901 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/77f82838-0340-4b40-9dd7-6d9a4ffaeaf0-var-run\") on node \"crc\" DevicePath \"\""
Feb 02 10:55:23 crc kubenswrapper[4901]: I0202 10:55:23.776593 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/77f82838-0340-4b40-9dd7-6d9a4ffaeaf0-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 10:55:23 crc kubenswrapper[4901]: I0202 10:55:23.776602 4901 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/77f82838-0340-4b40-9dd7-6d9a4ffaeaf0-additional-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 10:55:23 crc kubenswrapper[4901]: I0202 10:55:23.776614 4901 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/77f82838-0340-4b40-9dd7-6d9a4ffaeaf0-var-log-ovn\") on node \"crc\" DevicePath \"\""
Feb 02 10:55:23 crc kubenswrapper[4901]: I0202 10:55:23.776627 4901 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/77f82838-0340-4b40-9dd7-6d9a4ffaeaf0-var-run-ovn\") on node \"crc\" DevicePath \"\""
Feb 02 10:55:23 crc kubenswrapper[4901]: I0202 10:55:23.776640 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbtd9\" (UniqueName: \"kubernetes.io/projected/77f82838-0340-4b40-9dd7-6d9a4ffaeaf0-kube-api-access-mbtd9\") on node \"crc\" DevicePath \"\""
Feb 02 10:55:24 crc kubenswrapper[4901]: I0202 10:55:24.011581 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-lrthz"]
Feb 02 10:55:24 crc kubenswrapper[4901]: W0202 10:55:24.031728 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b9aee1d_518e_46d9_8d21_54ea5018453d.slice/crio-c6373294f60a6b3d46f805b20ef080403c8c45e6e6c9d962effc1f55fd170caf WatchSource:0}: Error finding container c6373294f60a6b3d46f805b20ef080403c8c45e6e6c9d962effc1f55fd170caf: Status 404 returned error can't find the container with id c6373294f60a6b3d46f805b20ef080403c8c45e6e6c9d962effc1f55fd170caf
Feb 02 10:55:24 crc kubenswrapper[4901]: I0202 10:55:24.172812 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Feb 02 10:55:24 crc kubenswrapper[4901]: I0202 10:55:24.592783 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Feb 02 10:55:24 crc kubenswrapper[4901]: I0202 10:55:24.613185 4901 generic.go:334] "Generic (PLEG): container finished" podID="8b9aee1d-518e-46d9-8d21-54ea5018453d" containerID="63ca4ba074a3da8f7885a0b965673af5555913f8f8b59264072a0be11c5cdc09" exitCode=0
Feb 02 10:55:24 crc kubenswrapper[4901]: I0202 10:55:24.613413 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-lrthz" event={"ID":"8b9aee1d-518e-46d9-8d21-54ea5018453d","Type":"ContainerDied","Data":"63ca4ba074a3da8f7885a0b965673af5555913f8f8b59264072a0be11c5cdc09"}
Feb 02 10:55:24 crc kubenswrapper[4901]: I0202 10:55:24.613457 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-lrthz" event={"ID":"8b9aee1d-518e-46d9-8d21-54ea5018453d","Type":"ContainerStarted","Data":"c6373294f60a6b3d46f805b20ef080403c8c45e6e6c9d962effc1f55fd170caf"}
Feb 02 10:55:24 crc kubenswrapper[4901]: I0202 10:55:24.614917 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4b4d5a91-d330-499c-9123-35b58d8c55d5","Type":"ContainerStarted","Data":"f3542a104be6473e25d9bba4e77e1a4ff8233a8cd671dae8fd2e50c8c326c16a"}
Feb 02 10:55:24 crc kubenswrapper[4901]: I0202 10:55:24.616903 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rtv7m-config-v9tqh"
Feb 02 10:55:24 crc kubenswrapper[4901]: I0202 10:55:24.617089 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-l8vxk" event={"ID":"06b582a5-a4bf-4c36-974a-0cf96389bb90","Type":"ContainerStarted","Data":"91687dee59c8e5182e1d950ff7fab2790b778116b2f52d461608ad1576841234"}
Feb 02 10:55:24 crc kubenswrapper[4901]: I0202 10:55:24.740078 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-l8vxk" podStartSLOduration=2.666634028 podStartE2EDuration="16.740055173s" podCreationTimestamp="2026-02-02 10:55:08 +0000 UTC" firstStartedPulling="2026-02-02 10:55:09.527712875 +0000 UTC m=+996.546052961" lastFinishedPulling="2026-02-02 10:55:23.60113401 +0000 UTC m=+1010.619474106" observedRunningTime="2026-02-02 10:55:24.73468785 +0000 UTC m=+1011.753027956" watchObservedRunningTime="2026-02-02 10:55:24.740055173 +0000 UTC m=+1011.758395269"
Feb 02 10:55:24 crc kubenswrapper[4901]: I0202 10:55:24.768911 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-rtv7m-config-v9tqh"]
Feb 02 10:55:24 crc kubenswrapper[4901]: I0202 10:55:24.775537 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-rtv7m-config-v9tqh"]
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.010129 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.343004 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-q7rkr"]
Feb 02 10:55:25 crc kubenswrapper[4901]: E0202 10:55:25.343432 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77f82838-0340-4b40-9dd7-6d9a4ffaeaf0" containerName="ovn-config"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.343453 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="77f82838-0340-4b40-9dd7-6d9a4ffaeaf0" containerName="ovn-config"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.343914 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="77f82838-0340-4b40-9dd7-6d9a4ffaeaf0" containerName="ovn-config"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.344548 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-q7rkr"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.367205 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-q7rkr"]
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.413157 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73fb6dad-c6af-44e7-98a9-fe46e2ea41e5-operator-scripts\") pod \"heat-db-create-q7rkr\" (UID: \"73fb6dad-c6af-44e7-98a9-fe46e2ea41e5\") " pod="openstack/heat-db-create-q7rkr"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.413415 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr2mr\" (UniqueName: \"kubernetes.io/projected/73fb6dad-c6af-44e7-98a9-fe46e2ea41e5-kube-api-access-kr2mr\") pod \"heat-db-create-q7rkr\" (UID: \"73fb6dad-c6af-44e7-98a9-fe46e2ea41e5\") " pod="openstack/heat-db-create-q7rkr"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.454520 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-9326-account-create-update-fxmwt"]
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.455690 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9326-account-create-update-fxmwt"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.460058 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.516370 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-9326-account-create-update-fxmwt"]
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.521671 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55c3f4b9-7f8e-4a8a-afb2-ffc84ece9220-operator-scripts\") pod \"cinder-9326-account-create-update-fxmwt\" (UID: \"55c3f4b9-7f8e-4a8a-afb2-ffc84ece9220\") " pod="openstack/cinder-9326-account-create-update-fxmwt"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.521765 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kr2mr\" (UniqueName: \"kubernetes.io/projected/73fb6dad-c6af-44e7-98a9-fe46e2ea41e5-kube-api-access-kr2mr\") pod \"heat-db-create-q7rkr\" (UID: \"73fb6dad-c6af-44e7-98a9-fe46e2ea41e5\") " pod="openstack/heat-db-create-q7rkr"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.521809 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rtmt\" (UniqueName: \"kubernetes.io/projected/55c3f4b9-7f8e-4a8a-afb2-ffc84ece9220-kube-api-access-7rtmt\") pod \"cinder-9326-account-create-update-fxmwt\" (UID: \"55c3f4b9-7f8e-4a8a-afb2-ffc84ece9220\") " pod="openstack/cinder-9326-account-create-update-fxmwt"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.521878 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73fb6dad-c6af-44e7-98a9-fe46e2ea41e5-operator-scripts\") pod \"heat-db-create-q7rkr\" (UID: \"73fb6dad-c6af-44e7-98a9-fe46e2ea41e5\") " pod="openstack/heat-db-create-q7rkr"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.522697 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73fb6dad-c6af-44e7-98a9-fe46e2ea41e5-operator-scripts\") pod \"heat-db-create-q7rkr\" (UID: \"73fb6dad-c6af-44e7-98a9-fe46e2ea41e5\") " pod="openstack/heat-db-create-q7rkr"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.555811 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-8ca5-account-create-update-h5xln"]
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.557001 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-8ca5-account-create-update-h5xln"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.571190 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.571333 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-66nlg"]
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.572494 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-66nlg"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.578605 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-xr6pb"]
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.579463 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr2mr\" (UniqueName: \"kubernetes.io/projected/73fb6dad-c6af-44e7-98a9-fe46e2ea41e5-kube-api-access-kr2mr\") pod \"heat-db-create-q7rkr\" (UID: \"73fb6dad-c6af-44e7-98a9-fe46e2ea41e5\") " pod="openstack/heat-db-create-q7rkr"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.581952 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-xr6pb"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.589051 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-8ca5-account-create-update-h5xln"]
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.605739 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.605955 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.605969 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.606104 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nx7b8"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.624691 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-xr6pb"]
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.625888 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55c3f4b9-7f8e-4a8a-afb2-ffc84ece9220-operator-scripts\") pod \"cinder-9326-account-create-update-fxmwt\" (UID: \"55c3f4b9-7f8e-4a8a-afb2-ffc84ece9220\") " pod="openstack/cinder-9326-account-create-update-fxmwt"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.625980 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rtmt\" (UniqueName: \"kubernetes.io/projected/55c3f4b9-7f8e-4a8a-afb2-ffc84ece9220-kube-api-access-7rtmt\") pod \"cinder-9326-account-create-update-fxmwt\" (UID: \"55c3f4b9-7f8e-4a8a-afb2-ffc84ece9220\") " pod="openstack/cinder-9326-account-create-update-fxmwt"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.626059 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98ffd424-d5ce-4321-adc1-638508596191-operator-scripts\") pod \"heat-8ca5-account-create-update-h5xln\" (UID: \"98ffd424-d5ce-4321-adc1-638508596191\") " pod="openstack/heat-8ca5-account-create-update-h5xln"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.626144 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf4xq\" (UniqueName: \"kubernetes.io/projected/98ffd424-d5ce-4321-adc1-638508596191-kube-api-access-tf4xq\") pod \"heat-8ca5-account-create-update-h5xln\" (UID: \"98ffd424-d5ce-4321-adc1-638508596191\") " pod="openstack/heat-8ca5-account-create-update-h5xln"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.627140 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55c3f4b9-7f8e-4a8a-afb2-ffc84ece9220-operator-scripts\") pod \"cinder-9326-account-create-update-fxmwt\" (UID: \"55c3f4b9-7f8e-4a8a-afb2-ffc84ece9220\") " pod="openstack/cinder-9326-account-create-update-fxmwt"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.657845 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-8tpf5"]
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.659483 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-8tpf5"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.667524 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-66nlg"]
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.671195 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-q7rkr"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.690401 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rtmt\" (UniqueName: \"kubernetes.io/projected/55c3f4b9-7f8e-4a8a-afb2-ffc84ece9220-kube-api-access-7rtmt\") pod \"cinder-9326-account-create-update-fxmwt\" (UID: \"55c3f4b9-7f8e-4a8a-afb2-ffc84ece9220\") " pod="openstack/cinder-9326-account-create-update-fxmwt"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.704278 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77f82838-0340-4b40-9dd7-6d9a4ffaeaf0" path="/var/lib/kubelet/pods/77f82838-0340-4b40-9dd7-6d9a4ffaeaf0/volumes"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.705254 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-12b8-account-create-update-znw9c"]
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.707012 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-12b8-account-create-update-znw9c"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.709265 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.709598 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-8tpf5"]
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.726453 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-12b8-account-create-update-znw9c"]
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.727468 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cdf53bca-c9cf-4a55-a7b1-fc983eef2c74-operator-scripts\") pod \"cinder-db-create-66nlg\" (UID: \"cdf53bca-c9cf-4a55-a7b1-fc983eef2c74\") " pod="openstack/cinder-db-create-66nlg"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.727525 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a766edf3-78cd-4939-a63b-8079e261b386-config-data\") pod \"keystone-db-sync-xr6pb\" (UID: \"a766edf3-78cd-4939-a63b-8079e261b386\") " pod="openstack/keystone-db-sync-xr6pb"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.727588 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a766edf3-78cd-4939-a63b-8079e261b386-combined-ca-bundle\") pod \"keystone-db-sync-xr6pb\" (UID: \"a766edf3-78cd-4939-a63b-8079e261b386\") " pod="openstack/keystone-db-sync-xr6pb"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.727622 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tf4xq\" (UniqueName: \"kubernetes.io/projected/98ffd424-d5ce-4321-adc1-638508596191-kube-api-access-tf4xq\") pod \"heat-8ca5-account-create-update-h5xln\" (UID: \"98ffd424-d5ce-4321-adc1-638508596191\") " pod="openstack/heat-8ca5-account-create-update-h5xln"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.727853 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72b6486b-a65c-468f-a6fa-4fc229447300-operator-scripts\") pod \"barbican-db-create-8tpf5\" (UID: \"72b6486b-a65c-468f-a6fa-4fc229447300\") " pod="openstack/barbican-db-create-8tpf5"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.727947 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9tk2\" (UniqueName: \"kubernetes.io/projected/72b6486b-a65c-468f-a6fa-4fc229447300-kube-api-access-f9tk2\") pod \"barbican-db-create-8tpf5\" (UID: \"72b6486b-a65c-468f-a6fa-4fc229447300\") " pod="openstack/barbican-db-create-8tpf5"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.727990 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crk4w\" (UniqueName: \"kubernetes.io/projected/a766edf3-78cd-4939-a63b-8079e261b386-kube-api-access-crk4w\") pod \"keystone-db-sync-xr6pb\" (UID: \"a766edf3-78cd-4939-a63b-8079e261b386\") " pod="openstack/keystone-db-sync-xr6pb"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.728127 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98ffd424-d5ce-4321-adc1-638508596191-operator-scripts\") pod \"heat-8ca5-account-create-update-h5xln\" (UID: \"98ffd424-d5ce-4321-adc1-638508596191\") " pod="openstack/heat-8ca5-account-create-update-h5xln"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.728163 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlmkk\" (UniqueName: \"kubernetes.io/projected/cdf53bca-c9cf-4a55-a7b1-fc983eef2c74-kube-api-access-nlmkk\") pod \"cinder-db-create-66nlg\" (UID: \"cdf53bca-c9cf-4a55-a7b1-fc983eef2c74\") " pod="openstack/cinder-db-create-66nlg"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.734701 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98ffd424-d5ce-4321-adc1-638508596191-operator-scripts\") pod \"heat-8ca5-account-create-update-h5xln\" (UID: \"98ffd424-d5ce-4321-adc1-638508596191\") " pod="openstack/heat-8ca5-account-create-update-h5xln"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.774430 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf4xq\" (UniqueName: \"kubernetes.io/projected/98ffd424-d5ce-4321-adc1-638508596191-kube-api-access-tf4xq\") pod \"heat-8ca5-account-create-update-h5xln\" (UID: \"98ffd424-d5ce-4321-adc1-638508596191\") " pod="openstack/heat-8ca5-account-create-update-h5xln"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.818622 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9326-account-create-update-fxmwt"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.831258 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72b6486b-a65c-468f-a6fa-4fc229447300-operator-scripts\") pod \"barbican-db-create-8tpf5\" (UID: \"72b6486b-a65c-468f-a6fa-4fc229447300\") " pod="openstack/barbican-db-create-8tpf5"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.836062 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9tk2\" (UniqueName: \"kubernetes.io/projected/72b6486b-a65c-468f-a6fa-4fc229447300-kube-api-access-f9tk2\") pod \"barbican-db-create-8tpf5\" (UID: \"72b6486b-a65c-468f-a6fa-4fc229447300\") " pod="openstack/barbican-db-create-8tpf5"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.836104 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crk4w\" (UniqueName: \"kubernetes.io/projected/a766edf3-78cd-4939-a63b-8079e261b386-kube-api-access-crk4w\") pod \"keystone-db-sync-xr6pb\" (UID: \"a766edf3-78cd-4939-a63b-8079e261b386\") " pod="openstack/keystone-db-sync-xr6pb"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.836173 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfm4n\" (UniqueName: \"kubernetes.io/projected/14a824c3-d0b9-482d-8f69-0431b9b46f85-kube-api-access-qfm4n\") pod \"barbican-12b8-account-create-update-znw9c\" (UID: \"14a824c3-d0b9-482d-8f69-0431b9b46f85\") " pod="openstack/barbican-12b8-account-create-update-znw9c"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.836272 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlmkk\" (UniqueName: \"kubernetes.io/projected/cdf53bca-c9cf-4a55-a7b1-fc983eef2c74-kube-api-access-nlmkk\") pod \"cinder-db-create-66nlg\" (UID: \"cdf53bca-c9cf-4a55-a7b1-fc983eef2c74\") " pod="openstack/cinder-db-create-66nlg"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.836324 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cdf53bca-c9cf-4a55-a7b1-fc983eef2c74-operator-scripts\") pod \"cinder-db-create-66nlg\" (UID: \"cdf53bca-c9cf-4a55-a7b1-fc983eef2c74\") " pod="openstack/cinder-db-create-66nlg"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.836374 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a766edf3-78cd-4939-a63b-8079e261b386-config-data\") pod \"keystone-db-sync-xr6pb\" (UID: \"a766edf3-78cd-4939-a63b-8079e261b386\") " pod="openstack/keystone-db-sync-xr6pb"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.836428 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a766edf3-78cd-4939-a63b-8079e261b386-combined-ca-bundle\") pod \"keystone-db-sync-xr6pb\" (UID: \"a766edf3-78cd-4939-a63b-8079e261b386\") " pod="openstack/keystone-db-sync-xr6pb"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.836457 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14a824c3-d0b9-482d-8f69-0431b9b46f85-operator-scripts\") pod \"barbican-12b8-account-create-update-znw9c\" (UID: \"14a824c3-d0b9-482d-8f69-0431b9b46f85\") " pod="openstack/barbican-12b8-account-create-update-znw9c"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.832289 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72b6486b-a65c-468f-a6fa-4fc229447300-operator-scripts\") pod \"barbican-db-create-8tpf5\" (UID: \"72b6486b-a65c-468f-a6fa-4fc229447300\") " pod="openstack/barbican-db-create-8tpf5"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.839379 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cdf53bca-c9cf-4a55-a7b1-fc983eef2c74-operator-scripts\") pod \"cinder-db-create-66nlg\" (UID: \"cdf53bca-c9cf-4a55-a7b1-fc983eef2c74\") " pod="openstack/cinder-db-create-66nlg"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.843755 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-srtxb"]
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.844892 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-srtxb"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.850443 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a766edf3-78cd-4939-a63b-8079e261b386-config-data\") pod \"keystone-db-sync-xr6pb\" (UID: \"a766edf3-78cd-4939-a63b-8079e261b386\") " pod="openstack/keystone-db-sync-xr6pb"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.851064 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a766edf3-78cd-4939-a63b-8079e261b386-combined-ca-bundle\") pod \"keystone-db-sync-xr6pb\" (UID: \"a766edf3-78cd-4939-a63b-8079e261b386\") " pod="openstack/keystone-db-sync-xr6pb"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.853764 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-srtxb"]
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.862269 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlmkk\" (UniqueName: \"kubernetes.io/projected/cdf53bca-c9cf-4a55-a7b1-fc983eef2c74-kube-api-access-nlmkk\") pod \"cinder-db-create-66nlg\" (UID: \"cdf53bca-c9cf-4a55-a7b1-fc983eef2c74\") " pod="openstack/cinder-db-create-66nlg"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.863137 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crk4w\" (UniqueName: \"kubernetes.io/projected/a766edf3-78cd-4939-a63b-8079e261b386-kube-api-access-crk4w\") pod \"keystone-db-sync-xr6pb\" (UID: \"a766edf3-78cd-4939-a63b-8079e261b386\") " pod="openstack/keystone-db-sync-xr6pb"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.873843 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9tk2\" (UniqueName: \"kubernetes.io/projected/72b6486b-a65c-468f-a6fa-4fc229447300-kube-api-access-f9tk2\") pod \"barbican-db-create-8tpf5\" (UID: \"72b6486b-a65c-468f-a6fa-4fc229447300\") " pod="openstack/barbican-db-create-8tpf5"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.941213 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14a824c3-d0b9-482d-8f69-0431b9b46f85-operator-scripts\") pod \"barbican-12b8-account-create-update-znw9c\" (UID: \"14a824c3-d0b9-482d-8f69-0431b9b46f85\") " pod="openstack/barbican-12b8-account-create-update-znw9c"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.941277 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgx9q\" (UniqueName: \"kubernetes.io/projected/28588193-ae95-4ea4-a449-614bf3beebc2-kube-api-access-hgx9q\") pod \"neutron-db-create-srtxb\" (UID: \"28588193-ae95-4ea4-a449-614bf3beebc2\") " pod="openstack/neutron-db-create-srtxb"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.941351 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28588193-ae95-4ea4-a449-614bf3beebc2-operator-scripts\") pod \"neutron-db-create-srtxb\" (UID: \"28588193-ae95-4ea4-a449-614bf3beebc2\") " pod="openstack/neutron-db-create-srtxb"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.941450 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfm4n\" (UniqueName: \"kubernetes.io/projected/14a824c3-d0b9-482d-8f69-0431b9b46f85-kube-api-access-qfm4n\") pod \"barbican-12b8-account-create-update-znw9c\" (UID: \"14a824c3-d0b9-482d-8f69-0431b9b46f85\") " pod="openstack/barbican-12b8-account-create-update-znw9c"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.943343 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14a824c3-d0b9-482d-8f69-0431b9b46f85-operator-scripts\") pod \"barbican-12b8-account-create-update-znw9c\" (UID: \"14a824c3-d0b9-482d-8f69-0431b9b46f85\") " pod="openstack/barbican-12b8-account-create-update-znw9c"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.959417 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-8ca5-account-create-update-h5xln"
Feb 02 10:55:25 crc kubenswrapper[4901]: I0202 10:55:25.965315 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfm4n\" (UniqueName: \"kubernetes.io/projected/14a824c3-d0b9-482d-8f69-0431b9b46f85-kube-api-access-qfm4n\") pod \"barbican-12b8-account-create-update-znw9c\" (UID: \"14a824c3-d0b9-482d-8f69-0431b9b46f85\") " pod="openstack/barbican-12b8-account-create-update-znw9c"
Feb 02 10:55:26 crc kubenswrapper[4901]: I0202 10:55:26.020761 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-66nlg"
Feb 02 10:55:26 crc kubenswrapper[4901]: I0202 10:55:26.022261 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6ad2-account-create-update-r98jn"]
Feb 02 10:55:26 crc kubenswrapper[4901]: I0202 10:55:26.023656 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6ad2-account-create-update-r98jn"
Feb 02 10:55:26 crc kubenswrapper[4901]: I0202 10:55:26.029214 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Feb 02 10:55:26 crc kubenswrapper[4901]: I0202 10:55:26.031485 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6ad2-account-create-update-r98jn"]
Feb 02 10:55:26 crc kubenswrapper[4901]: I0202 10:55:26.045708 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgx9q\" (UniqueName: \"kubernetes.io/projected/28588193-ae95-4ea4-a449-614bf3beebc2-kube-api-access-hgx9q\") pod \"neutron-db-create-srtxb\" (UID: \"28588193-ae95-4ea4-a449-614bf3beebc2\") " pod="openstack/neutron-db-create-srtxb"
Feb 02 10:55:26 crc kubenswrapper[4901]: I0202 10:55:26.045786 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28588193-ae95-4ea4-a449-614bf3beebc2-operator-scripts\") pod \"neutron-db-create-srtxb\" (UID: \"28588193-ae95-4ea4-a449-614bf3beebc2\") " pod="openstack/neutron-db-create-srtxb"
Feb 02 10:55:26 crc kubenswrapper[4901]: I0202 10:55:26.046503 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28588193-ae95-4ea4-a449-614bf3beebc2-operator-scripts\") pod \"neutron-db-create-srtxb\" (UID: \"28588193-ae95-4ea4-a449-614bf3beebc2\") " pod="openstack/neutron-db-create-srtxb"
Feb 02 10:55:26 crc kubenswrapper[4901]: I0202 10:55:26.068295 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgx9q\" (UniqueName: \"kubernetes.io/projected/28588193-ae95-4ea4-a449-614bf3beebc2-kube-api-access-hgx9q\") pod \"neutron-db-create-srtxb\" (UID: \"28588193-ae95-4ea4-a449-614bf3beebc2\") " pod="openstack/neutron-db-create-srtxb"
Feb 02 10:55:26 crc kubenswrapper[4901]: I0202 10:55:26.141553 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-lrthz"
Feb 02 10:55:26 crc kubenswrapper[4901]: I0202 10:55:26.142890 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-xr6pb"
Feb 02 10:55:26 crc kubenswrapper[4901]: I0202 10:55:26.148333 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwm9l\" (UniqueName: \"kubernetes.io/projected/8879f11a-677c-485b-91dd-082f86fd8d5a-kube-api-access-fwm9l\") pod \"neutron-6ad2-account-create-update-r98jn\" (UID: \"8879f11a-677c-485b-91dd-082f86fd8d5a\") " pod="openstack/neutron-6ad2-account-create-update-r98jn"
Feb 02 10:55:26 crc kubenswrapper[4901]: I0202 10:55:26.148460 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8879f11a-677c-485b-91dd-082f86fd8d5a-operator-scripts\") pod \"neutron-6ad2-account-create-update-r98jn\" (UID: \"8879f11a-677c-485b-91dd-082f86fd8d5a\") " pod="openstack/neutron-6ad2-account-create-update-r98jn"
Feb 02 10:55:26 crc kubenswrapper[4901]: I0202 10:55:26.173105 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-8tpf5"
Feb 02 10:55:26 crc kubenswrapper[4901]: I0202 10:55:26.184197 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-12b8-account-create-update-znw9c"
Feb 02 10:55:26 crc kubenswrapper[4901]: I0202 10:55:26.212405 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-srtxb"
Feb 02 10:55:26 crc kubenswrapper[4901]: I0202 10:55:26.249197 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b9aee1d-518e-46d9-8d21-54ea5018453d-operator-scripts\") pod \"8b9aee1d-518e-46d9-8d21-54ea5018453d\" (UID: \"8b9aee1d-518e-46d9-8d21-54ea5018453d\") "
Feb 02 10:55:26 crc kubenswrapper[4901]: I0202 10:55:26.249342 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mw6vl\" (UniqueName: \"kubernetes.io/projected/8b9aee1d-518e-46d9-8d21-54ea5018453d-kube-api-access-mw6vl\") pod \"8b9aee1d-518e-46d9-8d21-54ea5018453d\" (UID: \"8b9aee1d-518e-46d9-8d21-54ea5018453d\") "
Feb 02 10:55:26 crc kubenswrapper[4901]: I0202 10:55:26.249614 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwm9l\" (UniqueName: \"kubernetes.io/projected/8879f11a-677c-485b-91dd-082f86fd8d5a-kube-api-access-fwm9l\") pod \"neutron-6ad2-account-create-update-r98jn\" (UID: \"8879f11a-677c-485b-91dd-082f86fd8d5a\") " pod="openstack/neutron-6ad2-account-create-update-r98jn"
Feb 02 10:55:26 crc kubenswrapper[4901]: I0202 10:55:26.249742 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8879f11a-677c-485b-91dd-082f86fd8d5a-operator-scripts\") pod \"neutron-6ad2-account-create-update-r98jn\" (UID: \"8879f11a-677c-485b-91dd-082f86fd8d5a\") " pod="openstack/neutron-6ad2-account-create-update-r98jn"
Feb 02 10:55:26 crc kubenswrapper[4901]: I0202 10:55:26.250267 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b9aee1d-518e-46d9-8d21-54ea5018453d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8b9aee1d-518e-46d9-8d21-54ea5018453d" (UID: "8b9aee1d-518e-46d9-8d21-54ea5018453d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:55:26 crc kubenswrapper[4901]: I0202 10:55:26.250901 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8879f11a-677c-485b-91dd-082f86fd8d5a-operator-scripts\") pod \"neutron-6ad2-account-create-update-r98jn\" (UID: \"8879f11a-677c-485b-91dd-082f86fd8d5a\") " pod="openstack/neutron-6ad2-account-create-update-r98jn"
Feb 02 10:55:26 crc kubenswrapper[4901]: I0202 10:55:26.256744 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b9aee1d-518e-46d9-8d21-54ea5018453d-kube-api-access-mw6vl" (OuterVolumeSpecName: "kube-api-access-mw6vl") pod "8b9aee1d-518e-46d9-8d21-54ea5018453d" (UID: "8b9aee1d-518e-46d9-8d21-54ea5018453d"). InnerVolumeSpecName "kube-api-access-mw6vl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:55:26 crc kubenswrapper[4901]: I0202 10:55:26.270483 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwm9l\" (UniqueName: \"kubernetes.io/projected/8879f11a-677c-485b-91dd-082f86fd8d5a-kube-api-access-fwm9l\") pod \"neutron-6ad2-account-create-update-r98jn\" (UID: \"8879f11a-677c-485b-91dd-082f86fd8d5a\") " pod="openstack/neutron-6ad2-account-create-update-r98jn"
Feb 02 10:55:26 crc kubenswrapper[4901]: I0202 10:55:26.353615 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mw6vl\" (UniqueName: \"kubernetes.io/projected/8b9aee1d-518e-46d9-8d21-54ea5018453d-kube-api-access-mw6vl\") on node \"crc\" DevicePath \"\""
Feb 02 10:55:26 crc kubenswrapper[4901]: I0202 10:55:26.353962 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b9aee1d-518e-46d9-8d21-54ea5018453d-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 10:55:26 crc kubenswrapper[4901]: I0202 10:55:26.356865 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6ad2-account-create-update-r98jn"
Feb 02 10:55:26 crc kubenswrapper[4901]: I0202 10:55:26.402424 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-q7rkr"]
Feb 02 10:55:26 crc kubenswrapper[4901]: I0202 10:55:26.419917 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-9326-account-create-update-fxmwt"]
Feb 02 10:55:26 crc kubenswrapper[4901]: I0202 10:55:26.636340 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-66nlg"]
Feb 02 10:55:26 crc kubenswrapper[4901]: I0202 10:55:26.657473 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-8ca5-account-create-update-h5xln"]
Feb 02 10:55:26 crc kubenswrapper[4901]: I0202 10:55:26.687093 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-lrthz" event={"ID":"8b9aee1d-518e-46d9-8d21-54ea5018453d","Type":"ContainerDied","Data":"c6373294f60a6b3d46f805b20ef080403c8c45e6e6c9d962effc1f55fd170caf"}
Feb 02 10:55:26 crc kubenswrapper[4901]: I0202 10:55:26.687133 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6373294f60a6b3d46f805b20ef080403c8c45e6e6c9d962effc1f55fd170caf"
Feb 02 10:55:26 crc kubenswrapper[4901]: I0202 10:55:26.687210 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-lrthz"
Feb 02 10:55:26 crc kubenswrapper[4901]: I0202 10:55:26.790099 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-8tpf5"]
Feb 02 10:55:26 crc kubenswrapper[4901]: I0202 10:55:26.801006 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-xr6pb"]
Feb 02 10:55:26 crc kubenswrapper[4901]: I0202 10:55:26.967818 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-srtxb"]
Feb 02 10:55:26 crc kubenswrapper[4901]: I0202 10:55:26.977826 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-12b8-account-create-update-znw9c"]
Feb 02 10:55:27 crc kubenswrapper[4901]: W0202 10:55:27.075000 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98ffd424_d5ce_4321_adc1_638508596191.slice/crio-19bf72488c81d9d06d4957119280b40256689f25a768b1320a28c36fbed4dc99 WatchSource:0}: Error finding container 19bf72488c81d9d06d4957119280b40256689f25a768b1320a28c36fbed4dc99: Status 404 returned error can't find the container with id 19bf72488c81d9d06d4957119280b40256689f25a768b1320a28c36fbed4dc99
Feb 02 10:55:27 crc kubenswrapper[4901]: I0202 10:55:27.739068 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-12b8-account-create-update-znw9c" event={"ID":"14a824c3-d0b9-482d-8f69-0431b9b46f85","Type":"ContainerStarted","Data":"93d552bb5867aac3aa492a0e0cc218e0bbbdbcbdf9dc0d1c4070b63471894334"}
Feb 02 10:55:27 crc kubenswrapper[4901]: I0202 10:55:27.740097 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-12b8-account-create-update-znw9c" event={"ID":"14a824c3-d0b9-482d-8f69-0431b9b46f85","Type":"ContainerStarted","Data":"e2374175f48c34d419d390bed21847356a42e7e74a081762cd47657cd8d81228"}
Feb 02 10:55:27 crc kubenswrapper[4901]: I0202 10:55:27.754842 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-8tpf5" event={"ID":"72b6486b-a65c-468f-a6fa-4fc229447300","Type":"ContainerStarted","Data":"cb0d8fbd5fd41f6fa0208370a1053bbe27903c07087f945e7ce06ba428fef30f"}
Feb 02 10:55:27 crc kubenswrapper[4901]: I0202 10:55:27.754882 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-8tpf5" event={"ID":"72b6486b-a65c-468f-a6fa-4fc229447300","Type":"ContainerStarted","Data":"3c84a2393ecb978ae77269ca6d2d15b507577817e9fb94b71c924105bc19f9cf"}
Feb 02 10:55:27 crc kubenswrapper[4901]: I0202 10:55:27.760045 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6ad2-account-create-update-r98jn"]
Feb 02 10:55:27 crc kubenswrapper[4901]: I0202 10:55:27.763701 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-66nlg" event={"ID":"cdf53bca-c9cf-4a55-a7b1-fc983eef2c74","Type":"ContainerStarted","Data":"e4b4bd070ff1b6ad9e639dda68b779fd1ad99cfcfbc185e10b9d7fcd59d946e2"}
Feb 02 10:55:27 crc kubenswrapper[4901]: I0202 10:55:27.775673 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-8ca5-account-create-update-h5xln" event={"ID":"98ffd424-d5ce-4321-adc1-638508596191","Type":"ContainerStarted","Data":"e5da4a708e2b1c5b8df94581e2c6eb489e56e39b519928b1c85f4125397aabc0"}
Feb 02 10:55:27 crc kubenswrapper[4901]: I0202 10:55:27.775728 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-8ca5-account-create-update-h5xln" event={"ID":"98ffd424-d5ce-4321-adc1-638508596191","Type":"ContainerStarted","Data":"19bf72488c81d9d06d4957119280b40256689f25a768b1320a28c36fbed4dc99"}
Feb 02 10:55:27 crc kubenswrapper[4901]: I0202 10:55:27.777826 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-12b8-account-create-update-znw9c" podStartSLOduration=2.77780626 podStartE2EDuration="2.77780626s" podCreationTimestamp="2026-02-02 10:55:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:55:27.76701864 +0000 UTC m=+1014.785358746" watchObservedRunningTime="2026-02-02 10:55:27.77780626 +0000 UTC m=+1014.796146356"
Feb 02 10:55:27 crc kubenswrapper[4901]: W0202 10:55:27.779004 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8879f11a_677c_485b_91dd_082f86fd8d5a.slice/crio-adf7c95aef143414dab5fab98890eb2a2e693049e27406c630631b998e8b6627 WatchSource:0}: Error finding container adf7c95aef143414dab5fab98890eb2a2e693049e27406c630631b998e8b6627: Status 404 returned error can't find the container with id adf7c95aef143414dab5fab98890eb2a2e693049e27406c630631b998e8b6627
Feb 02 10:55:27 crc kubenswrapper[4901]: I0202 10:55:27.779539 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xr6pb" event={"ID":"a766edf3-78cd-4939-a63b-8079e261b386","Type":"ContainerStarted","Data":"cb1b31393ec727cba80b04a9c9c921e528ee5a6a2760451d7c7a173b97170700"}
Feb 02 10:55:27 crc kubenswrapper[4901]: I0202 10:55:27.785302 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-srtxb" event={"ID":"28588193-ae95-4ea4-a449-614bf3beebc2","Type":"ContainerStarted","Data":"3d0c34e881a6c7fb959cc322dda4334b9da3832652549750ea3182ee2ea0363f"}
Feb 02 10:55:27 crc kubenswrapper[4901]: I0202 10:55:27.785347 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-srtxb" event={"ID":"28588193-ae95-4ea4-a449-614bf3beebc2","Type":"ContainerStarted","Data":"8122519f07eb5fc64e30c8cbc6660d6ba0e1b873ee8f0c1116a117142dcf6993"}
Feb 02 10:55:27 crc kubenswrapper[4901]: I0202 10:55:27.807495 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-q7rkr" event={"ID":"73fb6dad-c6af-44e7-98a9-fe46e2ea41e5","Type":"ContainerStarted","Data":"ec14d351eb433ac7609c09477faca1a52133f666083c3d67ca158107a31d2bfd"}
Feb 02 10:55:27 crc kubenswrapper[4901]: I0202 10:55:27.808974 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-q7rkr" event={"ID":"73fb6dad-c6af-44e7-98a9-fe46e2ea41e5","Type":"ContainerStarted","Data":"72f6bed53368efef61a16025472121bbf928a46df38f052e9bca6dedefd9d1f4"}
Feb 02 10:55:27 crc kubenswrapper[4901]: I0202 10:55:27.809462 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-8tpf5" podStartSLOduration=2.809438991 podStartE2EDuration="2.809438991s" podCreationTimestamp="2026-02-02 10:55:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:55:27.804930138 +0000 UTC m=+1014.823270234" watchObservedRunningTime="2026-02-02 10:55:27.809438991 +0000 UTC m=+1014.827779087"
Feb 02 10:55:27 crc kubenswrapper[4901]: I0202 10:55:27.813056 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9326-account-create-update-fxmwt" event={"ID":"55c3f4b9-7f8e-4a8a-afb2-ffc84ece9220","Type":"ContainerStarted","Data":"8d6f77b0d29e2378e46e200a1521ba81590822372a570e56bae012b8efc8a22b"}
Feb 02 10:55:27 crc kubenswrapper[4901]: I0202 10:55:27.813120 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9326-account-create-update-fxmwt" event={"ID":"55c3f4b9-7f8e-4a8a-afb2-ffc84ece9220","Type":"ContainerStarted","Data":"228c589300e07eaaa98c4e205b1c38ee2434a81a6209a3c351f39614bf63753d"}
Feb 02 10:55:27 crc kubenswrapper[4901]: I0202 10:55:27.836929 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-66nlg" podStartSLOduration=2.836905047 podStartE2EDuration="2.836905047s" podCreationTimestamp="2026-02-02 10:55:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:55:27.836772144 +0000 UTC m=+1014.855112240" watchObservedRunningTime="2026-02-02 10:55:27.836905047 +0000 UTC m=+1014.855245143"
Feb 02 10:55:27 crc kubenswrapper[4901]: I0202 10:55:27.885109 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-8ca5-account-create-update-h5xln" podStartSLOduration=2.885079852 podStartE2EDuration="2.885079852s" podCreationTimestamp="2026-02-02 10:55:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:55:27.865196904 +0000 UTC m=+1014.883537000" watchObservedRunningTime="2026-02-02 10:55:27.885079852 +0000 UTC m=+1014.903419948"
Feb 02 10:55:27 crc kubenswrapper[4901]: I0202 10:55:27.917047 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-9326-account-create-update-fxmwt" podStartSLOduration=2.917010649 podStartE2EDuration="2.917010649s" podCreationTimestamp="2026-02-02 10:55:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:55:27.902716092 +0000 UTC m=+1014.921056188" watchObservedRunningTime="2026-02-02 10:55:27.917010649 +0000 UTC m=+1014.935350745"
Feb 02 10:55:27 crc kubenswrapper[4901]: I0202 10:55:27.967550 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-srtxb" podStartSLOduration=2.967521921 podStartE2EDuration="2.967521921s" podCreationTimestamp="2026-02-02 10:55:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:55:27.942932137 +0000 UTC m=+1014.961272233" watchObservedRunningTime="2026-02-02 10:55:27.967521921 +0000 UTC m=+1014.985862017"
Feb 02 10:55:27 crc kubenswrapper[4901]: I0202 10:55:27.981040 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-create-q7rkr" podStartSLOduration=2.981007028 podStartE2EDuration="2.981007028s" podCreationTimestamp="2026-02-02 10:55:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:55:27.966555818 +0000 UTC m=+1014.984895924" watchObservedRunningTime="2026-02-02 10:55:27.981007028 +0000 UTC m=+1014.999347124"
Feb 02 10:55:28 crc kubenswrapper[4901]: E0202 10:55:28.155922 4901 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72b6486b_a65c_468f_a6fa_4fc229447300.slice/crio-cb0d8fbd5fd41f6fa0208370a1053bbe27903c07087f945e7ce06ba428fef30f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72b6486b_a65c_468f_a6fa_4fc229447300.slice/crio-conmon-cb0d8fbd5fd41f6fa0208370a1053bbe27903c07087f945e7ce06ba428fef30f.scope\": RecentStats: unable to find data in memory cache]"
Feb 02 10:55:28 crc kubenswrapper[4901]: I0202 10:55:28.848260 4901 generic.go:334] "Generic (PLEG): container finished" podID="cdf53bca-c9cf-4a55-a7b1-fc983eef2c74" containerID="36ba7129012ac21b0f824cec190da85f00e005f11f14e8512eee43d12378053b" exitCode=0
Feb 02 10:55:28 crc kubenswrapper[4901]: I0202 10:55:28.848324 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-66nlg" event={"ID":"cdf53bca-c9cf-4a55-a7b1-fc983eef2c74","Type":"ContainerDied","Data":"36ba7129012ac21b0f824cec190da85f00e005f11f14e8512eee43d12378053b"}
Feb 02 10:55:28 crc kubenswrapper[4901]: I0202 10:55:28.850409 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6ad2-account-create-update-r98jn" event={"ID":"8879f11a-677c-485b-91dd-082f86fd8d5a","Type":"ContainerStarted","Data":"44fa25142ac132a908fa1ce6f91507673fe357e08065ca786649d64077375802"}
Feb 02 10:55:28 crc kubenswrapper[4901]: I0202 10:55:28.850454 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6ad2-account-create-update-r98jn" event={"ID":"8879f11a-677c-485b-91dd-082f86fd8d5a","Type":"ContainerStarted","Data":"adf7c95aef143414dab5fab98890eb2a2e693049e27406c630631b998e8b6627"}
Feb 02 10:55:28 crc kubenswrapper[4901]: I0202 10:55:28.854184 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4b4d5a91-d330-499c-9123-35b58d8c55d5","Type":"ContainerStarted","Data":"ca8d93f7543ee34574fc16b84017f94acf0cb511f7ffacf133871c9813bc15bf"}
Feb 02 10:55:28 crc kubenswrapper[4901]: I0202 10:55:28.854217 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4b4d5a91-d330-499c-9123-35b58d8c55d5","Type":"ContainerStarted","Data":"497817940c32266fc14d809678708eeb0b2db8030701fc796cd19fcc545a81b0"}
Feb 02 10:55:28 crc kubenswrapper[4901]: I0202 10:55:28.854228 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4b4d5a91-d330-499c-9123-35b58d8c55d5","Type":"ContainerStarted","Data":"5e0a964946488f3e168fb30c4e7ea7fd2459b9e0cf44ef062a53bd6f2898a51f"}
Feb 02 10:55:28 crc kubenswrapper[4901]: I0202 10:55:28.856105 4901 generic.go:334] "Generic (PLEG): container finished" podID="98ffd424-d5ce-4321-adc1-638508596191" containerID="e5da4a708e2b1c5b8df94581e2c6eb489e56e39b519928b1c85f4125397aabc0" exitCode=0
Feb 02 10:55:28 crc kubenswrapper[4901]: I0202 10:55:28.856247 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-8ca5-account-create-update-h5xln" event={"ID":"98ffd424-d5ce-4321-adc1-638508596191","Type":"ContainerDied","Data":"e5da4a708e2b1c5b8df94581e2c6eb489e56e39b519928b1c85f4125397aabc0"}
Feb 02 10:55:28 crc kubenswrapper[4901]: I0202 10:55:28.862595 4901 generic.go:334] "Generic (PLEG): container finished" podID="28588193-ae95-4ea4-a449-614bf3beebc2" containerID="3d0c34e881a6c7fb959cc322dda4334b9da3832652549750ea3182ee2ea0363f" exitCode=0
Feb 02 10:55:28 crc kubenswrapper[4901]: I0202 10:55:28.862761 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-srtxb" event={"ID":"28588193-ae95-4ea4-a449-614bf3beebc2","Type":"ContainerDied","Data":"3d0c34e881a6c7fb959cc322dda4334b9da3832652549750ea3182ee2ea0363f"}
Feb 02 10:55:28 crc kubenswrapper[4901]: I0202 10:55:28.870821 4901 generic.go:334] "Generic (PLEG): container finished" podID="73fb6dad-c6af-44e7-98a9-fe46e2ea41e5" containerID="ec14d351eb433ac7609c09477faca1a52133f666083c3d67ca158107a31d2bfd" exitCode=0
Feb 02 10:55:28 crc kubenswrapper[4901]: I0202 10:55:28.870937 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-q7rkr" event={"ID":"73fb6dad-c6af-44e7-98a9-fe46e2ea41e5","Type":"ContainerDied","Data":"ec14d351eb433ac7609c09477faca1a52133f666083c3d67ca158107a31d2bfd"}
Feb 02 10:55:28 crc kubenswrapper[4901]: I0202 10:55:28.874455 4901 generic.go:334] "Generic (PLEG): container finished" podID="55c3f4b9-7f8e-4a8a-afb2-ffc84ece9220" containerID="8d6f77b0d29e2378e46e200a1521ba81590822372a570e56bae012b8efc8a22b" exitCode=0
Feb 02 10:55:28 crc kubenswrapper[4901]: I0202 10:55:28.874593 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9326-account-create-update-fxmwt" event={"ID":"55c3f4b9-7f8e-4a8a-afb2-ffc84ece9220","Type":"ContainerDied","Data":"8d6f77b0d29e2378e46e200a1521ba81590822372a570e56bae012b8efc8a22b"}
Feb 02 10:55:28 crc kubenswrapper[4901]: I0202 10:55:28.878338 4901 generic.go:334] "Generic (PLEG): container finished" podID="14a824c3-d0b9-482d-8f69-0431b9b46f85" containerID="93d552bb5867aac3aa492a0e0cc218e0bbbdbcbdf9dc0d1c4070b63471894334" exitCode=0
Feb 02 10:55:28 crc kubenswrapper[4901]: I0202 10:55:28.878398 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-12b8-account-create-update-znw9c" event={"ID":"14a824c3-d0b9-482d-8f69-0431b9b46f85","Type":"ContainerDied","Data":"93d552bb5867aac3aa492a0e0cc218e0bbbdbcbdf9dc0d1c4070b63471894334"}
Feb 02 10:55:28 crc kubenswrapper[4901]: I0202 10:55:28.887411 4901 generic.go:334] "Generic (PLEG): container finished" podID="72b6486b-a65c-468f-a6fa-4fc229447300" containerID="cb0d8fbd5fd41f6fa0208370a1053bbe27903c07087f945e7ce06ba428fef30f" exitCode=0
Feb 02 10:55:28 crc kubenswrapper[4901]: I0202 10:55:28.887479 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-8tpf5" event={"ID":"72b6486b-a65c-468f-a6fa-4fc229447300","Type":"ContainerDied","Data":"cb0d8fbd5fd41f6fa0208370a1053bbe27903c07087f945e7ce06ba428fef30f"}
Feb 02 10:55:28 crc kubenswrapper[4901]: I0202 10:55:28.941366 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6ad2-account-create-update-r98jn" podStartSLOduration=3.941343079 podStartE2EDuration="3.941343079s" podCreationTimestamp="2026-02-02 10:55:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:55:28.914850357 +0000 UTC m=+1015.933190453" watchObservedRunningTime="2026-02-02 10:55:28.941343079 +0000 UTC m=+1015.959683175"
Feb 02 10:55:29 crc kubenswrapper[4901]: I0202 10:55:29.900269 4901 generic.go:334] "Generic (PLEG): container finished" podID="8879f11a-677c-485b-91dd-082f86fd8d5a" containerID="44fa25142ac132a908fa1ce6f91507673fe357e08065ca786649d64077375802" exitCode=0
Feb 02 10:55:29 crc kubenswrapper[4901]: I0202 10:55:29.900372 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6ad2-account-create-update-r98jn" event={"ID":"8879f11a-677c-485b-91dd-082f86fd8d5a","Type":"ContainerDied","Data":"44fa25142ac132a908fa1ce6f91507673fe357e08065ca786649d64077375802"}
Feb 02 10:55:29 crc kubenswrapper[4901]: I0202 10:55:29.921952 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4b4d5a91-d330-499c-9123-35b58d8c55d5","Type":"ContainerStarted","Data":"755bf299431aff16843ac8d36751130daf5231538904467102a05b045c13a702"}
Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.182460 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-q7rkr"
Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.191197 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-8ca5-account-create-update-h5xln"
Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.197592 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-srtxb"
Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.328089 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73fb6dad-c6af-44e7-98a9-fe46e2ea41e5-operator-scripts\") pod \"73fb6dad-c6af-44e7-98a9-fe46e2ea41e5\" (UID: \"73fb6dad-c6af-44e7-98a9-fe46e2ea41e5\") "
Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.328149 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kr2mr\" (UniqueName: \"kubernetes.io/projected/73fb6dad-c6af-44e7-98a9-fe46e2ea41e5-kube-api-access-kr2mr\") pod \"73fb6dad-c6af-44e7-98a9-fe46e2ea41e5\" (UID: \"73fb6dad-c6af-44e7-98a9-fe46e2ea41e5\") "
Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.328249 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgx9q\" (UniqueName: \"kubernetes.io/projected/28588193-ae95-4ea4-a449-614bf3beebc2-kube-api-access-hgx9q\") pod \"28588193-ae95-4ea4-a449-614bf3beebc2\" (UID: \"28588193-ae95-4ea4-a449-614bf3beebc2\") "
Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.328307 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98ffd424-d5ce-4321-adc1-638508596191-operator-scripts\") pod \"98ffd424-d5ce-4321-adc1-638508596191\" (UID: \"98ffd424-d5ce-4321-adc1-638508596191\") "
Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.328410 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tf4xq\" (UniqueName: \"kubernetes.io/projected/98ffd424-d5ce-4321-adc1-638508596191-kube-api-access-tf4xq\") pod \"98ffd424-d5ce-4321-adc1-638508596191\" (UID: \"98ffd424-d5ce-4321-adc1-638508596191\") "
Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.328472 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28588193-ae95-4ea4-a449-614bf3beebc2-operator-scripts\") pod \"28588193-ae95-4ea4-a449-614bf3beebc2\" (UID: \"28588193-ae95-4ea4-a449-614bf3beebc2\") "
Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.329414 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73fb6dad-c6af-44e7-98a9-fe46e2ea41e5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "73fb6dad-c6af-44e7-98a9-fe46e2ea41e5" (UID: "73fb6dad-c6af-44e7-98a9-fe46e2ea41e5").
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.329469 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28588193-ae95-4ea4-a449-614bf3beebc2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "28588193-ae95-4ea4-a449-614bf3beebc2" (UID: "28588193-ae95-4ea4-a449-614bf3beebc2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.329465 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98ffd424-d5ce-4321-adc1-638508596191-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "98ffd424-d5ce-4321-adc1-638508596191" (UID: "98ffd424-d5ce-4321-adc1-638508596191"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.335118 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73fb6dad-c6af-44e7-98a9-fe46e2ea41e5-kube-api-access-kr2mr" (OuterVolumeSpecName: "kube-api-access-kr2mr") pod "73fb6dad-c6af-44e7-98a9-fe46e2ea41e5" (UID: "73fb6dad-c6af-44e7-98a9-fe46e2ea41e5"). InnerVolumeSpecName "kube-api-access-kr2mr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.336859 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98ffd424-d5ce-4321-adc1-638508596191-kube-api-access-tf4xq" (OuterVolumeSpecName: "kube-api-access-tf4xq") pod "98ffd424-d5ce-4321-adc1-638508596191" (UID: "98ffd424-d5ce-4321-adc1-638508596191"). InnerVolumeSpecName "kube-api-access-tf4xq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.337138 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28588193-ae95-4ea4-a449-614bf3beebc2-kube-api-access-hgx9q" (OuterVolumeSpecName: "kube-api-access-hgx9q") pod "28588193-ae95-4ea4-a449-614bf3beebc2" (UID: "28588193-ae95-4ea4-a449-614bf3beebc2"). InnerVolumeSpecName "kube-api-access-hgx9q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.430155 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgx9q\" (UniqueName: \"kubernetes.io/projected/28588193-ae95-4ea4-a449-614bf3beebc2-kube-api-access-hgx9q\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.430575 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98ffd424-d5ce-4321-adc1-638508596191-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.430588 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tf4xq\" (UniqueName: \"kubernetes.io/projected/98ffd424-d5ce-4321-adc1-638508596191-kube-api-access-tf4xq\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.430597 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28588193-ae95-4ea4-a449-614bf3beebc2-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.430606 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73fb6dad-c6af-44e7-98a9-fe46e2ea41e5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.430616 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kr2mr\" (UniqueName: \"kubernetes.io/projected/73fb6dad-c6af-44e7-98a9-fe46e2ea41e5-kube-api-access-kr2mr\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.562113 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9326-account-create-update-fxmwt" Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.568504 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-8tpf5" Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.578122 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6ad2-account-create-update-r98jn" Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.587867 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-12b8-account-create-update-znw9c" Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.622930 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-66nlg" Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.762327 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72b6486b-a65c-468f-a6fa-4fc229447300-operator-scripts\") pod \"72b6486b-a65c-468f-a6fa-4fc229447300\" (UID: \"72b6486b-a65c-468f-a6fa-4fc229447300\") " Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.763205 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72b6486b-a65c-468f-a6fa-4fc229447300-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "72b6486b-a65c-468f-a6fa-4fc229447300" (UID: "72b6486b-a65c-468f-a6fa-4fc229447300"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.763546 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rtmt\" (UniqueName: \"kubernetes.io/projected/55c3f4b9-7f8e-4a8a-afb2-ffc84ece9220-kube-api-access-7rtmt\") pod \"55c3f4b9-7f8e-4a8a-afb2-ffc84ece9220\" (UID: \"55c3f4b9-7f8e-4a8a-afb2-ffc84ece9220\") " Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.763775 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cdf53bca-c9cf-4a55-a7b1-fc983eef2c74-operator-scripts\") pod \"cdf53bca-c9cf-4a55-a7b1-fc983eef2c74\" (UID: \"cdf53bca-c9cf-4a55-a7b1-fc983eef2c74\") " Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.764301 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwm9l\" (UniqueName: \"kubernetes.io/projected/8879f11a-677c-485b-91dd-082f86fd8d5a-kube-api-access-fwm9l\") pod \"8879f11a-677c-485b-91dd-082f86fd8d5a\" (UID: \"8879f11a-677c-485b-91dd-082f86fd8d5a\") " Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.764636 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9tk2\" (UniqueName: \"kubernetes.io/projected/72b6486b-a65c-468f-a6fa-4fc229447300-kube-api-access-f9tk2\") pod \"72b6486b-a65c-468f-a6fa-4fc229447300\" (UID: \"72b6486b-a65c-468f-a6fa-4fc229447300\") " Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.764256 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdf53bca-c9cf-4a55-a7b1-fc983eef2c74-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cdf53bca-c9cf-4a55-a7b1-fc983eef2c74" (UID: "cdf53bca-c9cf-4a55-a7b1-fc983eef2c74"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.764939 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfm4n\" (UniqueName: \"kubernetes.io/projected/14a824c3-d0b9-482d-8f69-0431b9b46f85-kube-api-access-qfm4n\") pod \"14a824c3-d0b9-482d-8f69-0431b9b46f85\" (UID: \"14a824c3-d0b9-482d-8f69-0431b9b46f85\") " Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.765359 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14a824c3-d0b9-482d-8f69-0431b9b46f85-operator-scripts\") pod \"14a824c3-d0b9-482d-8f69-0431b9b46f85\" (UID: \"14a824c3-d0b9-482d-8f69-0431b9b46f85\") " Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.765597 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8879f11a-677c-485b-91dd-082f86fd8d5a-operator-scripts\") pod \"8879f11a-677c-485b-91dd-082f86fd8d5a\" (UID: \"8879f11a-677c-485b-91dd-082f86fd8d5a\") " Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.765709 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlmkk\" (UniqueName: \"kubernetes.io/projected/cdf53bca-c9cf-4a55-a7b1-fc983eef2c74-kube-api-access-nlmkk\") pod \"cdf53bca-c9cf-4a55-a7b1-fc983eef2c74\" (UID: \"cdf53bca-c9cf-4a55-a7b1-fc983eef2c74\") " Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.766026 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55c3f4b9-7f8e-4a8a-afb2-ffc84ece9220-operator-scripts\") pod \"55c3f4b9-7f8e-4a8a-afb2-ffc84ece9220\" (UID: \"55c3f4b9-7f8e-4a8a-afb2-ffc84ece9220\") " Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.766099 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8879f11a-677c-485b-91dd-082f86fd8d5a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8879f11a-677c-485b-91dd-082f86fd8d5a" (UID: "8879f11a-677c-485b-91dd-082f86fd8d5a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.766326 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14a824c3-d0b9-482d-8f69-0431b9b46f85-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "14a824c3-d0b9-482d-8f69-0431b9b46f85" (UID: "14a824c3-d0b9-482d-8f69-0431b9b46f85"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.766584 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8879f11a-677c-485b-91dd-082f86fd8d5a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.766640 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72b6486b-a65c-468f-a6fa-4fc229447300-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.766655 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cdf53bca-c9cf-4a55-a7b1-fc983eef2c74-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.767426 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55c3f4b9-7f8e-4a8a-afb2-ffc84ece9220-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "55c3f4b9-7f8e-4a8a-afb2-ffc84ece9220" (UID: "55c3f4b9-7f8e-4a8a-afb2-ffc84ece9220"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.768008 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8879f11a-677c-485b-91dd-082f86fd8d5a-kube-api-access-fwm9l" (OuterVolumeSpecName: "kube-api-access-fwm9l") pod "8879f11a-677c-485b-91dd-082f86fd8d5a" (UID: "8879f11a-677c-485b-91dd-082f86fd8d5a"). InnerVolumeSpecName "kube-api-access-fwm9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.768546 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14a824c3-d0b9-482d-8f69-0431b9b46f85-kube-api-access-qfm4n" (OuterVolumeSpecName: "kube-api-access-qfm4n") pod "14a824c3-d0b9-482d-8f69-0431b9b46f85" (UID: "14a824c3-d0b9-482d-8f69-0431b9b46f85"). InnerVolumeSpecName "kube-api-access-qfm4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.768973 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdf53bca-c9cf-4a55-a7b1-fc983eef2c74-kube-api-access-nlmkk" (OuterVolumeSpecName: "kube-api-access-nlmkk") pod "cdf53bca-c9cf-4a55-a7b1-fc983eef2c74" (UID: "cdf53bca-c9cf-4a55-a7b1-fc983eef2c74"). InnerVolumeSpecName "kube-api-access-nlmkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.769818 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55c3f4b9-7f8e-4a8a-afb2-ffc84ece9220-kube-api-access-7rtmt" (OuterVolumeSpecName: "kube-api-access-7rtmt") pod "55c3f4b9-7f8e-4a8a-afb2-ffc84ece9220" (UID: "55c3f4b9-7f8e-4a8a-afb2-ffc84ece9220"). InnerVolumeSpecName "kube-api-access-7rtmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.775022 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72b6486b-a65c-468f-a6fa-4fc229447300-kube-api-access-f9tk2" (OuterVolumeSpecName: "kube-api-access-f9tk2") pod "72b6486b-a65c-468f-a6fa-4fc229447300" (UID: "72b6486b-a65c-468f-a6fa-4fc229447300"). 
InnerVolumeSpecName "kube-api-access-f9tk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.868071 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9tk2\" (UniqueName: \"kubernetes.io/projected/72b6486b-a65c-468f-a6fa-4fc229447300-kube-api-access-f9tk2\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.868110 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfm4n\" (UniqueName: \"kubernetes.io/projected/14a824c3-d0b9-482d-8f69-0431b9b46f85-kube-api-access-qfm4n\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.868121 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14a824c3-d0b9-482d-8f69-0431b9b46f85-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.868130 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlmkk\" (UniqueName: \"kubernetes.io/projected/cdf53bca-c9cf-4a55-a7b1-fc983eef2c74-kube-api-access-nlmkk\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.868139 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55c3f4b9-7f8e-4a8a-afb2-ffc84ece9220-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.868147 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rtmt\" (UniqueName: \"kubernetes.io/projected/55c3f4b9-7f8e-4a8a-afb2-ffc84ece9220-kube-api-access-7rtmt\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.868156 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwm9l\" (UniqueName: \"kubernetes.io/projected/8879f11a-677c-485b-91dd-082f86fd8d5a-kube-api-access-fwm9l\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.982123 4901 generic.go:334] "Generic (PLEG): container finished" podID="06b582a5-a4bf-4c36-974a-0cf96389bb90" containerID="91687dee59c8e5182e1d950ff7fab2790b778116b2f52d461608ad1576841234" exitCode=0 Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.982173 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-l8vxk" event={"ID":"06b582a5-a4bf-4c36-974a-0cf96389bb90","Type":"ContainerDied","Data":"91687dee59c8e5182e1d950ff7fab2790b778116b2f52d461608ad1576841234"} Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.986878 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4b4d5a91-d330-499c-9123-35b58d8c55d5","Type":"ContainerStarted","Data":"66128fcbfac38ecda4b10cfee8981df6e398ff1fc6b137bc465fb98288e0c3a2"} Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.988712 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xr6pb" event={"ID":"a766edf3-78cd-4939-a63b-8079e261b386","Type":"ContainerStarted","Data":"a28a02d0bdc2931d65dc3f556e62817d0ff897942cd58cedfdf5703908e71552"} Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.991209 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-8tpf5" event={"ID":"72b6486b-a65c-468f-a6fa-4fc229447300","Type":"ContainerDied","Data":"3c84a2393ecb978ae77269ca6d2d15b507577817e9fb94b71c924105bc19f9cf"} Feb 02 10:55:33 crc 
kubenswrapper[4901]: I0202 10:55:33.991373 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c84a2393ecb978ae77269ca6d2d15b507577817e9fb94b71c924105bc19f9cf" Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.991245 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-8tpf5" Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.993936 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-srtxb" Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.994875 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-srtxb" event={"ID":"28588193-ae95-4ea4-a449-614bf3beebc2","Type":"ContainerDied","Data":"8122519f07eb5fc64e30c8cbc6660d6ba0e1b873ee8f0c1116a117142dcf6993"} Feb 02 10:55:33 crc kubenswrapper[4901]: I0202 10:55:33.994914 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8122519f07eb5fc64e30c8cbc6660d6ba0e1b873ee8f0c1116a117142dcf6993" Feb 02 10:55:34 crc kubenswrapper[4901]: I0202 10:55:34.000802 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-q7rkr" event={"ID":"73fb6dad-c6af-44e7-98a9-fe46e2ea41e5","Type":"ContainerDied","Data":"72f6bed53368efef61a16025472121bbf928a46df38f052e9bca6dedefd9d1f4"} Feb 02 10:55:34 crc kubenswrapper[4901]: I0202 10:55:34.000857 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72f6bed53368efef61a16025472121bbf928a46df38f052e9bca6dedefd9d1f4" Feb 02 10:55:34 crc kubenswrapper[4901]: I0202 10:55:34.000982 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-q7rkr" Feb 02 10:55:34 crc kubenswrapper[4901]: I0202 10:55:34.004801 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9326-account-create-update-fxmwt" event={"ID":"55c3f4b9-7f8e-4a8a-afb2-ffc84ece9220","Type":"ContainerDied","Data":"228c589300e07eaaa98c4e205b1c38ee2434a81a6209a3c351f39614bf63753d"} Feb 02 10:55:34 crc kubenswrapper[4901]: I0202 10:55:34.004867 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="228c589300e07eaaa98c4e205b1c38ee2434a81a6209a3c351f39614bf63753d" Feb 02 10:55:34 crc kubenswrapper[4901]: I0202 10:55:34.004872 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9326-account-create-update-fxmwt" Feb 02 10:55:34 crc kubenswrapper[4901]: I0202 10:55:34.007254 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-12b8-account-create-update-znw9c" event={"ID":"14a824c3-d0b9-482d-8f69-0431b9b46f85","Type":"ContainerDied","Data":"e2374175f48c34d419d390bed21847356a42e7e74a081762cd47657cd8d81228"} Feb 02 10:55:34 crc kubenswrapper[4901]: I0202 10:55:34.007297 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2374175f48c34d419d390bed21847356a42e7e74a081762cd47657cd8d81228" Feb 02 10:55:34 crc kubenswrapper[4901]: I0202 10:55:34.007383 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-12b8-account-create-update-znw9c" Feb 02 10:55:34 crc kubenswrapper[4901]: I0202 10:55:34.011885 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-66nlg" event={"ID":"cdf53bca-c9cf-4a55-a7b1-fc983eef2c74","Type":"ContainerDied","Data":"e4b4bd070ff1b6ad9e639dda68b779fd1ad99cfcfbc185e10b9d7fcd59d946e2"} Feb 02 10:55:34 crc kubenswrapper[4901]: I0202 10:55:34.011940 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4b4bd070ff1b6ad9e639dda68b779fd1ad99cfcfbc185e10b9d7fcd59d946e2" Feb 02 10:55:34 crc kubenswrapper[4901]: I0202 10:55:34.012088 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-66nlg" Feb 02 10:55:34 crc kubenswrapper[4901]: I0202 10:55:34.016779 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6ad2-account-create-update-r98jn" Feb 02 10:55:34 crc kubenswrapper[4901]: I0202 10:55:34.016898 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6ad2-account-create-update-r98jn" event={"ID":"8879f11a-677c-485b-91dd-082f86fd8d5a","Type":"ContainerDied","Data":"adf7c95aef143414dab5fab98890eb2a2e693049e27406c630631b998e8b6627"} Feb 02 10:55:34 crc kubenswrapper[4901]: I0202 10:55:34.017817 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adf7c95aef143414dab5fab98890eb2a2e693049e27406c630631b998e8b6627" Feb 02 10:55:34 crc kubenswrapper[4901]: I0202 10:55:34.030609 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-8ca5-account-create-update-h5xln" event={"ID":"98ffd424-d5ce-4321-adc1-638508596191","Type":"ContainerDied","Data":"19bf72488c81d9d06d4957119280b40256689f25a768b1320a28c36fbed4dc99"} Feb 02 10:55:34 crc kubenswrapper[4901]: I0202 10:55:34.030663 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19bf72488c81d9d06d4957119280b40256689f25a768b1320a28c36fbed4dc99" Feb 02 10:55:34 crc kubenswrapper[4901]: I0202 10:55:34.030745 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-8ca5-account-create-update-h5xln" Feb 02 10:55:34 crc kubenswrapper[4901]: I0202 10:55:34.040897 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-xr6pb" podStartSLOduration=2.6830900140000002 podStartE2EDuration="9.040873953s" podCreationTimestamp="2026-02-02 10:55:25 +0000 UTC" firstStartedPulling="2026-02-02 10:55:27.126154675 +0000 UTC m=+1014.144494771" lastFinishedPulling="2026-02-02 10:55:33.483938614 +0000 UTC m=+1020.502278710" observedRunningTime="2026-02-02 10:55:34.029985331 +0000 UTC m=+1021.048325447" watchObservedRunningTime="2026-02-02 10:55:34.040873953 +0000 UTC m=+1021.059214059" Feb 02 10:55:35 crc kubenswrapper[4901]: I0202 10:55:35.044158 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4b4d5a91-d330-499c-9123-35b58d8c55d5","Type":"ContainerStarted","Data":"9816a760b64a9790b8e0c25fa4b195e588fa62e2d6dad352a31f7f25e96e2a2b"} Feb 02 10:55:35 crc kubenswrapper[4901]: I0202 10:55:35.045878 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4b4d5a91-d330-499c-9123-35b58d8c55d5","Type":"ContainerStarted","Data":"ef73d8244a7b219a27710290f78f1da4bef47995196c81d44fd411f23859e57b"} Feb 02 10:55:35 crc kubenswrapper[4901]: I0202 10:55:35.046081 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4b4d5a91-d330-499c-9123-35b58d8c55d5","Type":"ContainerStarted","Data":"15ed38e3c782e26cfbfb110a378820771dea001fa0a09e645f4a4ce60fd5b08a"} Feb 02 10:55:35 crc kubenswrapper[4901]: I0202 10:55:35.517616 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-l8vxk" Feb 02 10:55:35 crc kubenswrapper[4901]: I0202 10:55:35.652856 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/06b582a5-a4bf-4c36-974a-0cf96389bb90-db-sync-config-data\") pod \"06b582a5-a4bf-4c36-974a-0cf96389bb90\" (UID: \"06b582a5-a4bf-4c36-974a-0cf96389bb90\") " Feb 02 10:55:35 crc kubenswrapper[4901]: I0202 10:55:35.652917 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcz8g\" (UniqueName: \"kubernetes.io/projected/06b582a5-a4bf-4c36-974a-0cf96389bb90-kube-api-access-tcz8g\") pod \"06b582a5-a4bf-4c36-974a-0cf96389bb90\" (UID: \"06b582a5-a4bf-4c36-974a-0cf96389bb90\") " Feb 02 10:55:35 crc kubenswrapper[4901]: I0202 10:55:35.653008 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06b582a5-a4bf-4c36-974a-0cf96389bb90-config-data\") pod \"06b582a5-a4bf-4c36-974a-0cf96389bb90\" (UID: \"06b582a5-a4bf-4c36-974a-0cf96389bb90\") " Feb 02 10:55:35 crc kubenswrapper[4901]: I0202 10:55:35.653052 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b582a5-a4bf-4c36-974a-0cf96389bb90-combined-ca-bundle\") pod \"06b582a5-a4bf-4c36-974a-0cf96389bb90\" (UID: \"06b582a5-a4bf-4c36-974a-0cf96389bb90\") " Feb 02 10:55:35 crc kubenswrapper[4901]: I0202 10:55:35.659822 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06b582a5-a4bf-4c36-974a-0cf96389bb90-kube-api-access-tcz8g" (OuterVolumeSpecName: "kube-api-access-tcz8g") pod "06b582a5-a4bf-4c36-974a-0cf96389bb90" (UID: 
"06b582a5-a4bf-4c36-974a-0cf96389bb90"). InnerVolumeSpecName "kube-api-access-tcz8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:55:35 crc kubenswrapper[4901]: I0202 10:55:35.662667 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06b582a5-a4bf-4c36-974a-0cf96389bb90-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "06b582a5-a4bf-4c36-974a-0cf96389bb90" (UID: "06b582a5-a4bf-4c36-974a-0cf96389bb90"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:35 crc kubenswrapper[4901]: I0202 10:55:35.687394 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06b582a5-a4bf-4c36-974a-0cf96389bb90-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "06b582a5-a4bf-4c36-974a-0cf96389bb90" (UID: "06b582a5-a4bf-4c36-974a-0cf96389bb90"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:35 crc kubenswrapper[4901]: I0202 10:55:35.720791 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06b582a5-a4bf-4c36-974a-0cf96389bb90-config-data" (OuterVolumeSpecName: "config-data") pod "06b582a5-a4bf-4c36-974a-0cf96389bb90" (UID: "06b582a5-a4bf-4c36-974a-0cf96389bb90"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:35 crc kubenswrapper[4901]: I0202 10:55:35.755313 4901 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/06b582a5-a4bf-4c36-974a-0cf96389bb90-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:35 crc kubenswrapper[4901]: I0202 10:55:35.755365 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcz8g\" (UniqueName: \"kubernetes.io/projected/06b582a5-a4bf-4c36-974a-0cf96389bb90-kube-api-access-tcz8g\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:35 crc kubenswrapper[4901]: I0202 10:55:35.755380 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06b582a5-a4bf-4c36-974a-0cf96389bb90-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:35 crc kubenswrapper[4901]: I0202 10:55:35.755390 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b582a5-a4bf-4c36-974a-0cf96389bb90-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:36 crc kubenswrapper[4901]: I0202 10:55:36.055723 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-l8vxk" event={"ID":"06b582a5-a4bf-4c36-974a-0cf96389bb90","Type":"ContainerDied","Data":"cb4b8c7a68f52bb2b7baee55c2ff1c39f01492e5a1becb5422dbcfc28f17e668"} Feb 02 10:55:36 crc kubenswrapper[4901]: I0202 10:55:36.055767 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb4b8c7a68f52bb2b7baee55c2ff1c39f01492e5a1becb5422dbcfc28f17e668" Feb 02 10:55:36 crc kubenswrapper[4901]: I0202 10:55:36.055825 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-l8vxk" Feb 02 10:55:36 crc kubenswrapper[4901]: I0202 10:55:36.376291 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-2p24z"] Feb 02 10:55:36 crc kubenswrapper[4901]: E0202 10:55:36.377395 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55c3f4b9-7f8e-4a8a-afb2-ffc84ece9220" containerName="mariadb-account-create-update" Feb 02 10:55:36 crc kubenswrapper[4901]: I0202 10:55:36.377418 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="55c3f4b9-7f8e-4a8a-afb2-ffc84ece9220" containerName="mariadb-account-create-update" Feb 02 10:55:36 crc kubenswrapper[4901]: E0202 10:55:36.377427 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06b582a5-a4bf-4c36-974a-0cf96389bb90" containerName="glance-db-sync" Feb 02 10:55:36 crc kubenswrapper[4901]: I0202 10:55:36.377434 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="06b582a5-a4bf-4c36-974a-0cf96389bb90" containerName="glance-db-sync" Feb 02 10:55:36 crc kubenswrapper[4901]: E0202 10:55:36.377450 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73fb6dad-c6af-44e7-98a9-fe46e2ea41e5" containerName="mariadb-database-create" Feb 02 10:55:36 crc kubenswrapper[4901]: I0202 10:55:36.377457 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="73fb6dad-c6af-44e7-98a9-fe46e2ea41e5" containerName="mariadb-database-create" Feb 02 10:55:36 crc kubenswrapper[4901]: E0202 10:55:36.377472 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28588193-ae95-4ea4-a449-614bf3beebc2" containerName="mariadb-database-create" Feb 02 10:55:36 crc kubenswrapper[4901]: I0202 10:55:36.377478 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="28588193-ae95-4ea4-a449-614bf3beebc2" containerName="mariadb-database-create" Feb 02 10:55:36 crc kubenswrapper[4901]: E0202 10:55:36.377490 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b9aee1d-518e-46d9-8d21-54ea5018453d" containerName="mariadb-account-create-update" Feb 02 10:55:36 crc kubenswrapper[4901]: I0202 10:55:36.377497 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b9aee1d-518e-46d9-8d21-54ea5018453d" containerName="mariadb-account-create-update" Feb 02 10:55:36 crc kubenswrapper[4901]: E0202 10:55:36.377510 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdf53bca-c9cf-4a55-a7b1-fc983eef2c74" containerName="mariadb-database-create" Feb 02 10:55:36 crc kubenswrapper[4901]: I0202 10:55:36.377518 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdf53bca-c9cf-4a55-a7b1-fc983eef2c74" containerName="mariadb-database-create" Feb 02 10:55:36 crc kubenswrapper[4901]: E0202 10:55:36.377530 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14a824c3-d0b9-482d-8f69-0431b9b46f85" containerName="mariadb-account-create-update" Feb 02 10:55:36 crc kubenswrapper[4901]: I0202 10:55:36.377537 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="14a824c3-d0b9-482d-8f69-0431b9b46f85" containerName="mariadb-account-create-update" Feb 02 10:55:36 crc kubenswrapper[4901]: E0202 10:55:36.377549 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72b6486b-a65c-468f-a6fa-4fc229447300" containerName="mariadb-database-create" Feb 02 10:55:36 crc kubenswrapper[4901]: I0202 10:55:36.377556 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="72b6486b-a65c-468f-a6fa-4fc229447300" containerName="mariadb-database-create" Feb 02 
10:55:36 crc kubenswrapper[4901]: E0202 10:55:36.377592 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8879f11a-677c-485b-91dd-082f86fd8d5a" containerName="mariadb-account-create-update" Feb 02 10:55:36 crc kubenswrapper[4901]: I0202 10:55:36.377598 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="8879f11a-677c-485b-91dd-082f86fd8d5a" containerName="mariadb-account-create-update" Feb 02 10:55:36 crc kubenswrapper[4901]: E0202 10:55:36.377611 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98ffd424-d5ce-4321-adc1-638508596191" containerName="mariadb-account-create-update" Feb 02 10:55:36 crc kubenswrapper[4901]: I0202 10:55:36.377617 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="98ffd424-d5ce-4321-adc1-638508596191" containerName="mariadb-account-create-update" Feb 02 10:55:36 crc kubenswrapper[4901]: I0202 10:55:36.377806 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="73fb6dad-c6af-44e7-98a9-fe46e2ea41e5" containerName="mariadb-database-create" Feb 02 10:55:36 crc kubenswrapper[4901]: I0202 10:55:36.377820 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="55c3f4b9-7f8e-4a8a-afb2-ffc84ece9220" containerName="mariadb-account-create-update" Feb 02 10:55:36 crc kubenswrapper[4901]: I0202 10:55:36.377830 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="28588193-ae95-4ea4-a449-614bf3beebc2" containerName="mariadb-database-create" Feb 02 10:55:36 crc kubenswrapper[4901]: I0202 10:55:36.377841 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="14a824c3-d0b9-482d-8f69-0431b9b46f85" containerName="mariadb-account-create-update" Feb 02 10:55:36 crc kubenswrapper[4901]: I0202 10:55:36.377847 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="98ffd424-d5ce-4321-adc1-638508596191" containerName="mariadb-account-create-update" Feb 02 10:55:36 crc kubenswrapper[4901]: I0202 10:55:36.377878 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="8879f11a-677c-485b-91dd-082f86fd8d5a" containerName="mariadb-account-create-update" Feb 02 10:55:36 crc kubenswrapper[4901]: I0202 10:55:36.377887 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b9aee1d-518e-46d9-8d21-54ea5018453d" containerName="mariadb-account-create-update" Feb 02 10:55:36 crc kubenswrapper[4901]: I0202 10:55:36.377911 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="72b6486b-a65c-468f-a6fa-4fc229447300" containerName="mariadb-database-create" Feb 02 10:55:36 crc kubenswrapper[4901]: I0202 10:55:36.377920 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="06b582a5-a4bf-4c36-974a-0cf96389bb90" containerName="glance-db-sync" Feb 02 10:55:36 crc kubenswrapper[4901]: I0202 10:55:36.377929 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdf53bca-c9cf-4a55-a7b1-fc983eef2c74" containerName="mariadb-database-create" Feb 02 10:55:36 crc kubenswrapper[4901]: I0202 10:55:36.393449 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-2p24z" Feb 02 10:55:36 crc kubenswrapper[4901]: I0202 10:55:36.396360 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-2p24z"] Feb 02 10:55:36 crc kubenswrapper[4901]: I0202 10:55:36.476576 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/421ae8ba-9f7c-461b-b757-f2ffccc8284c-config\") pod \"dnsmasq-dns-74dc88fc-2p24z\" (UID: \"421ae8ba-9f7c-461b-b757-f2ffccc8284c\") " pod="openstack/dnsmasq-dns-74dc88fc-2p24z" Feb 02 10:55:36 crc kubenswrapper[4901]: I0202 10:55:36.476648 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/421ae8ba-9f7c-461b-b757-f2ffccc8284c-ovsdbserver-sb\") pod \"dnsmasq-dns-74dc88fc-2p24z\" (UID: \"421ae8ba-9f7c-461b-b757-f2ffccc8284c\") " pod="openstack/dnsmasq-dns-74dc88fc-2p24z" Feb 02 10:55:36 crc kubenswrapper[4901]: I0202 10:55:36.476799 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmc99\" (UniqueName: \"kubernetes.io/projected/421ae8ba-9f7c-461b-b757-f2ffccc8284c-kube-api-access-bmc99\") pod \"dnsmasq-dns-74dc88fc-2p24z\" (UID: \"421ae8ba-9f7c-461b-b757-f2ffccc8284c\") " pod="openstack/dnsmasq-dns-74dc88fc-2p24z" Feb 02 10:55:36 crc kubenswrapper[4901]: I0202 10:55:36.476845 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/421ae8ba-9f7c-461b-b757-f2ffccc8284c-dns-svc\") pod \"dnsmasq-dns-74dc88fc-2p24z\" (UID: \"421ae8ba-9f7c-461b-b757-f2ffccc8284c\") " pod="openstack/dnsmasq-dns-74dc88fc-2p24z" Feb 02 10:55:36 crc kubenswrapper[4901]: I0202 10:55:36.476934 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/421ae8ba-9f7c-461b-b757-f2ffccc8284c-ovsdbserver-nb\") pod \"dnsmasq-dns-74dc88fc-2p24z\" (UID: \"421ae8ba-9f7c-461b-b757-f2ffccc8284c\") " pod="openstack/dnsmasq-dns-74dc88fc-2p24z" Feb 02 10:55:36 crc kubenswrapper[4901]: I0202 10:55:36.579007 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmc99\" (UniqueName: \"kubernetes.io/projected/421ae8ba-9f7c-461b-b757-f2ffccc8284c-kube-api-access-bmc99\") pod \"dnsmasq-dns-74dc88fc-2p24z\" (UID: \"421ae8ba-9f7c-461b-b757-f2ffccc8284c\") " pod="openstack/dnsmasq-dns-74dc88fc-2p24z" Feb 02 10:55:36 crc kubenswrapper[4901]: I0202 10:55:36.579064 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/421ae8ba-9f7c-461b-b757-f2ffccc8284c-dns-svc\") pod \"dnsmasq-dns-74dc88fc-2p24z\" (UID: \"421ae8ba-9f7c-461b-b757-f2ffccc8284c\") " pod="openstack/dnsmasq-dns-74dc88fc-2p24z" Feb 02 10:55:36 crc kubenswrapper[4901]: I0202 10:55:36.579123 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/421ae8ba-9f7c-461b-b757-f2ffccc8284c-ovsdbserver-nb\") pod \"dnsmasq-dns-74dc88fc-2p24z\" (UID: \"421ae8ba-9f7c-461b-b757-f2ffccc8284c\") " pod="openstack/dnsmasq-dns-74dc88fc-2p24z" Feb 02 10:55:36 crc kubenswrapper[4901]: I0202 10:55:36.579211 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/421ae8ba-9f7c-461b-b757-f2ffccc8284c-config\") pod \"dnsmasq-dns-74dc88fc-2p24z\" (UID: \"421ae8ba-9f7c-461b-b757-f2ffccc8284c\") " pod="openstack/dnsmasq-dns-74dc88fc-2p24z" Feb 02 10:55:36 crc kubenswrapper[4901]: I0202 10:55:36.579270 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/421ae8ba-9f7c-461b-b757-f2ffccc8284c-ovsdbserver-sb\") pod \"dnsmasq-dns-74dc88fc-2p24z\" (UID: \"421ae8ba-9f7c-461b-b757-f2ffccc8284c\") " pod="openstack/dnsmasq-dns-74dc88fc-2p24z" Feb 02 10:55:36 crc kubenswrapper[4901]: I0202 10:55:36.580377 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/421ae8ba-9f7c-461b-b757-f2ffccc8284c-ovsdbserver-sb\") pod \"dnsmasq-dns-74dc88fc-2p24z\" (UID: \"421ae8ba-9f7c-461b-b757-f2ffccc8284c\") " pod="openstack/dnsmasq-dns-74dc88fc-2p24z" Feb 02 10:55:36 crc kubenswrapper[4901]: I0202 10:55:36.580969 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/421ae8ba-9f7c-461b-b757-f2ffccc8284c-dns-svc\") pod \"dnsmasq-dns-74dc88fc-2p24z\" (UID: \"421ae8ba-9f7c-461b-b757-f2ffccc8284c\") " pod="openstack/dnsmasq-dns-74dc88fc-2p24z" Feb 02 10:55:36 crc kubenswrapper[4901]: I0202 10:55:36.582011 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/421ae8ba-9f7c-461b-b757-f2ffccc8284c-config\") pod \"dnsmasq-dns-74dc88fc-2p24z\" (UID: \"421ae8ba-9f7c-461b-b757-f2ffccc8284c\") " pod="openstack/dnsmasq-dns-74dc88fc-2p24z" Feb 02 10:55:36 crc kubenswrapper[4901]: I0202 10:55:36.582143 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/421ae8ba-9f7c-461b-b757-f2ffccc8284c-ovsdbserver-nb\") pod \"dnsmasq-dns-74dc88fc-2p24z\" (UID: \"421ae8ba-9f7c-461b-b757-f2ffccc8284c\") " pod="openstack/dnsmasq-dns-74dc88fc-2p24z" Feb 02 10:55:36 crc kubenswrapper[4901]: I0202 10:55:36.600298 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmc99\" (UniqueName: \"kubernetes.io/projected/421ae8ba-9f7c-461b-b757-f2ffccc8284c-kube-api-access-bmc99\") pod \"dnsmasq-dns-74dc88fc-2p24z\" (UID: \"421ae8ba-9f7c-461b-b757-f2ffccc8284c\") " pod="openstack/dnsmasq-dns-74dc88fc-2p24z" Feb 02 10:55:36 crc kubenswrapper[4901]: I0202 10:55:36.720052 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-2p24z" Feb 02 10:55:37 crc kubenswrapper[4901]: I0202 10:55:37.085269 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4b4d5a91-d330-499c-9123-35b58d8c55d5","Type":"ContainerStarted","Data":"7b194e2853c23253974a617f82668375068987b6c0f6039e3bdd8101059ef04b"} Feb 02 10:55:37 crc kubenswrapper[4901]: I0202 10:55:37.085742 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4b4d5a91-d330-499c-9123-35b58d8c55d5","Type":"ContainerStarted","Data":"39b2852005a0b218e23f6d0bb34f1ba7b7feb1efb80d48a60dfcbda76f10ab2f"} Feb 02 10:55:37 crc kubenswrapper[4901]: I0202 10:55:37.085753 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4b4d5a91-d330-499c-9123-35b58d8c55d5","Type":"ContainerStarted","Data":"cb003884085d3f2490b30f89ed5d3dc7ab703a627f942ee3f1225bf2dfa94f15"} Feb 02 10:55:37 crc kubenswrapper[4901]: I0202 10:55:37.305425 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-2p24z"] Feb 02 10:55:37 crc kubenswrapper[4901]: W0202 10:55:37.316832 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod421ae8ba_9f7c_461b_b757_f2ffccc8284c.slice/crio-4dfbdb6cc6ab9b7f04495d08c217a97908bff6b4610bc2deeae716a86487c040 WatchSource:0}: Error finding container 4dfbdb6cc6ab9b7f04495d08c217a97908bff6b4610bc2deeae716a86487c040: Status 404 returned error can't find the container with id 4dfbdb6cc6ab9b7f04495d08c217a97908bff6b4610bc2deeae716a86487c040 Feb 02 10:55:37 crc kubenswrapper[4901]: I0202 10:55:37.839557 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:55:37 crc kubenswrapper[4901]: I0202 10:55:37.840064 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:55:38 crc kubenswrapper[4901]: I0202 10:55:38.095238 4901 generic.go:334] "Generic (PLEG): container finished" podID="421ae8ba-9f7c-461b-b757-f2ffccc8284c" containerID="9ab13a2c9adb483b3504f721c2e20dcb022ea5f6fb3dfc5c811f65b61c0a8b8c" exitCode=0 Feb 02 10:55:38 crc kubenswrapper[4901]: I0202 10:55:38.095337 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-2p24z" event={"ID":"421ae8ba-9f7c-461b-b757-f2ffccc8284c","Type":"ContainerDied","Data":"9ab13a2c9adb483b3504f721c2e20dcb022ea5f6fb3dfc5c811f65b61c0a8b8c"} Feb 02 10:55:38 crc kubenswrapper[4901]: I0202 10:55:38.095380 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-2p24z" event={"ID":"421ae8ba-9f7c-461b-b757-f2ffccc8284c","Type":"ContainerStarted","Data":"4dfbdb6cc6ab9b7f04495d08c217a97908bff6b4610bc2deeae716a86487c040"} Feb 02 10:55:38 crc kubenswrapper[4901]: I0202 10:55:38.098856 4901 generic.go:334] "Generic (PLEG): container finished" podID="a766edf3-78cd-4939-a63b-8079e261b386" containerID="a28a02d0bdc2931d65dc3f556e62817d0ff897942cd58cedfdf5703908e71552" 
exitCode=0 Feb 02 10:55:38 crc kubenswrapper[4901]: I0202 10:55:38.098950 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xr6pb" event={"ID":"a766edf3-78cd-4939-a63b-8079e261b386","Type":"ContainerDied","Data":"a28a02d0bdc2931d65dc3f556e62817d0ff897942cd58cedfdf5703908e71552"} Feb 02 10:55:38 crc kubenswrapper[4901]: I0202 10:55:38.114443 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4b4d5a91-d330-499c-9123-35b58d8c55d5","Type":"ContainerStarted","Data":"cbf934233f0a8e65ec7d32b85ed3330b68f5be2db1b8a91353644f34216db537"} Feb 02 10:55:38 crc kubenswrapper[4901]: I0202 10:55:38.114497 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4b4d5a91-d330-499c-9123-35b58d8c55d5","Type":"ContainerStarted","Data":"dc5e4f12ac95c9f3f6ba9fe8b1ccbaecd782bd7876efbb3f3e6daa454a452c53"} Feb 02 10:55:38 crc kubenswrapper[4901]: I0202 10:55:38.114508 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4b4d5a91-d330-499c-9123-35b58d8c55d5","Type":"ContainerStarted","Data":"3ea67072f4d7de40fff31b59a99c9cbb0125b9faee9b38338a588b3c9c096e50"} Feb 02 10:55:38 crc kubenswrapper[4901]: I0202 10:55:38.114518 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4b4d5a91-d330-499c-9123-35b58d8c55d5","Type":"ContainerStarted","Data":"e1fea98d0fa284f9e4a0e24f06d3ee23b35f1033b29eaad3037c3f3d5d5c29ec"} Feb 02 10:55:38 crc kubenswrapper[4901]: I0202 10:55:38.202543 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=37.252349107 podStartE2EDuration="49.202521018s" podCreationTimestamp="2026-02-02 10:54:49 +0000 UTC" firstStartedPulling="2026-02-02 10:55:24.189005172 +0000 UTC m=+1011.207345268" lastFinishedPulling="2026-02-02 10:55:36.139177083 +0000 UTC m=+1023.157517179" observedRunningTime="2026-02-02 10:55:38.18816096 +0000 UTC m=+1025.206501056" watchObservedRunningTime="2026-02-02 10:55:38.202521018 +0000 UTC m=+1025.220861104" Feb 02 10:55:38 crc kubenswrapper[4901]: I0202 10:55:38.492748 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-2p24z"] Feb 02 10:55:38 crc kubenswrapper[4901]: I0202 10:55:38.517416 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-tttj2"] Feb 02 10:55:38 crc kubenswrapper[4901]: I0202 10:55:38.519348 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-tttj2" Feb 02 10:55:38 crc kubenswrapper[4901]: I0202 10:55:38.528875 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Feb 02 10:55:38 crc kubenswrapper[4901]: I0202 10:55:38.537965 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-tttj2"] Feb 02 10:55:38 crc kubenswrapper[4901]: I0202 10:55:38.646959 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms4lp\" (UniqueName: \"kubernetes.io/projected/e2b8b88a-99e5-4c15-b293-5f2fa80f0a07-kube-api-access-ms4lp\") pod \"dnsmasq-dns-5f59b8f679-tttj2\" (UID: \"e2b8b88a-99e5-4c15-b293-5f2fa80f0a07\") " pod="openstack/dnsmasq-dns-5f59b8f679-tttj2" Feb 02 10:55:38 crc kubenswrapper[4901]: I0202 10:55:38.647014 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2b8b88a-99e5-4c15-b293-5f2fa80f0a07-config\") pod \"dnsmasq-dns-5f59b8f679-tttj2\" (UID: \"e2b8b88a-99e5-4c15-b293-5f2fa80f0a07\") " pod="openstack/dnsmasq-dns-5f59b8f679-tttj2" Feb 02 10:55:38 crc kubenswrapper[4901]: I0202 10:55:38.647036 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2b8b88a-99e5-4c15-b293-5f2fa80f0a07-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-tttj2\" (UID: \"e2b8b88a-99e5-4c15-b293-5f2fa80f0a07\") " pod="openstack/dnsmasq-dns-5f59b8f679-tttj2" Feb 02 10:55:38 crc kubenswrapper[4901]: I0202 10:55:38.647065 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e2b8b88a-99e5-4c15-b293-5f2fa80f0a07-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-tttj2\" (UID: \"e2b8b88a-99e5-4c15-b293-5f2fa80f0a07\") " pod="openstack/dnsmasq-dns-5f59b8f679-tttj2" Feb 02 10:55:38 crc kubenswrapper[4901]: I0202 10:55:38.647195 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e2b8b88a-99e5-4c15-b293-5f2fa80f0a07-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-tttj2\" (UID: \"e2b8b88a-99e5-4c15-b293-5f2fa80f0a07\") " pod="openstack/dnsmasq-dns-5f59b8f679-tttj2" Feb 02 10:55:38 crc kubenswrapper[4901]: I0202 10:55:38.647265 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2b8b88a-99e5-4c15-b293-5f2fa80f0a07-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-tttj2\" (UID: \"e2b8b88a-99e5-4c15-b293-5f2fa80f0a07\") " pod="openstack/dnsmasq-dns-5f59b8f679-tttj2" Feb 02 10:55:38 crc kubenswrapper[4901]: I0202 10:55:38.748694 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2b8b88a-99e5-4c15-b293-5f2fa80f0a07-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-tttj2\" (UID: \"e2b8b88a-99e5-4c15-b293-5f2fa80f0a07\") " pod="openstack/dnsmasq-dns-5f59b8f679-tttj2" Feb 02 10:55:38 crc kubenswrapper[4901]: I0202 10:55:38.748784 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms4lp\" (UniqueName: \"kubernetes.io/projected/e2b8b88a-99e5-4c15-b293-5f2fa80f0a07-kube-api-access-ms4lp\") pod \"dnsmasq-dns-5f59b8f679-tttj2\" (UID: 
\"e2b8b88a-99e5-4c15-b293-5f2fa80f0a07\") " pod="openstack/dnsmasq-dns-5f59b8f679-tttj2" Feb 02 10:55:38 crc kubenswrapper[4901]: I0202 10:55:38.748803 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2b8b88a-99e5-4c15-b293-5f2fa80f0a07-config\") pod \"dnsmasq-dns-5f59b8f679-tttj2\" (UID: \"e2b8b88a-99e5-4c15-b293-5f2fa80f0a07\") " pod="openstack/dnsmasq-dns-5f59b8f679-tttj2" Feb 02 10:55:38 crc kubenswrapper[4901]: I0202 10:55:38.748820 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2b8b88a-99e5-4c15-b293-5f2fa80f0a07-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-tttj2\" (UID: \"e2b8b88a-99e5-4c15-b293-5f2fa80f0a07\") " pod="openstack/dnsmasq-dns-5f59b8f679-tttj2" Feb 02 10:55:38 crc kubenswrapper[4901]: I0202 10:55:38.748843 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e2b8b88a-99e5-4c15-b293-5f2fa80f0a07-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-tttj2\" (UID: \"e2b8b88a-99e5-4c15-b293-5f2fa80f0a07\") " pod="openstack/dnsmasq-dns-5f59b8f679-tttj2" Feb 02 10:55:38 crc kubenswrapper[4901]: I0202 10:55:38.748890 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e2b8b88a-99e5-4c15-b293-5f2fa80f0a07-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-tttj2\" (UID: \"e2b8b88a-99e5-4c15-b293-5f2fa80f0a07\") " pod="openstack/dnsmasq-dns-5f59b8f679-tttj2" Feb 02 10:55:38 crc kubenswrapper[4901]: I0202 10:55:38.749689 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e2b8b88a-99e5-4c15-b293-5f2fa80f0a07-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-tttj2\" (UID: \"e2b8b88a-99e5-4c15-b293-5f2fa80f0a07\") " pod="openstack/dnsmasq-dns-5f59b8f679-tttj2" Feb 02 10:55:38 crc kubenswrapper[4901]: I0202 10:55:38.750349 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2b8b88a-99e5-4c15-b293-5f2fa80f0a07-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-tttj2\" (UID: \"e2b8b88a-99e5-4c15-b293-5f2fa80f0a07\") " pod="openstack/dnsmasq-dns-5f59b8f679-tttj2" Feb 02 10:55:38 crc kubenswrapper[4901]: I0202 10:55:38.750839 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2b8b88a-99e5-4c15-b293-5f2fa80f0a07-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-tttj2\" (UID: \"e2b8b88a-99e5-4c15-b293-5f2fa80f0a07\") " pod="openstack/dnsmasq-dns-5f59b8f679-tttj2" Feb 02 10:55:38 crc kubenswrapper[4901]: I0202 10:55:38.751026 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2b8b88a-99e5-4c15-b293-5f2fa80f0a07-config\") pod \"dnsmasq-dns-5f59b8f679-tttj2\" (UID: \"e2b8b88a-99e5-4c15-b293-5f2fa80f0a07\") " pod="openstack/dnsmasq-dns-5f59b8f679-tttj2" Feb 02 10:55:38 crc kubenswrapper[4901]: I0202 10:55:38.751194 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e2b8b88a-99e5-4c15-b293-5f2fa80f0a07-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-tttj2\" (UID: \"e2b8b88a-99e5-4c15-b293-5f2fa80f0a07\") " pod="openstack/dnsmasq-dns-5f59b8f679-tttj2" Feb 02 10:55:38 crc kubenswrapper[4901]: 
I0202 10:55:38.769410 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms4lp\" (UniqueName: \"kubernetes.io/projected/e2b8b88a-99e5-4c15-b293-5f2fa80f0a07-kube-api-access-ms4lp\") pod \"dnsmasq-dns-5f59b8f679-tttj2\" (UID: \"e2b8b88a-99e5-4c15-b293-5f2fa80f0a07\") " pod="openstack/dnsmasq-dns-5f59b8f679-tttj2" Feb 02 10:55:38 crc kubenswrapper[4901]: I0202 10:55:38.844620 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-tttj2" Feb 02 10:55:39 crc kubenswrapper[4901]: I0202 10:55:39.126708 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-2p24z" event={"ID":"421ae8ba-9f7c-461b-b757-f2ffccc8284c","Type":"ContainerStarted","Data":"5d7356d9bf02d72817106ed866113ae9c5a99934b18948bbfd34f5338b5adbb8"} Feb 02 10:55:39 crc kubenswrapper[4901]: I0202 10:55:39.127163 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74dc88fc-2p24z" Feb 02 10:55:39 crc kubenswrapper[4901]: I0202 10:55:39.154792 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74dc88fc-2p24z" podStartSLOduration=3.154766077 podStartE2EDuration="3.154766077s" podCreationTimestamp="2026-02-02 10:55:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:55:39.151974658 +0000 UTC m=+1026.170314774" watchObservedRunningTime="2026-02-02 10:55:39.154766077 +0000 UTC m=+1026.173106173" Feb 02 10:55:39 crc kubenswrapper[4901]: W0202 10:55:39.327235 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2b8b88a_99e5_4c15_b293_5f2fa80f0a07.slice/crio-5931acc786b66d4600b7da6bdaf1966e56615d20fae93e1f8cdae234761b8c6d WatchSource:0}: Error finding container 5931acc786b66d4600b7da6bdaf1966e56615d20fae93e1f8cdae234761b8c6d: Status 404 returned error can't find the container with id 5931acc786b66d4600b7da6bdaf1966e56615d20fae93e1f8cdae234761b8c6d Feb 02 10:55:39 crc kubenswrapper[4901]: I0202 10:55:39.327664 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-tttj2"] Feb 02 10:55:39 crc kubenswrapper[4901]: I0202 10:55:39.395773 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-xr6pb" Feb 02 10:55:39 crc kubenswrapper[4901]: I0202 10:55:39.460778 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a766edf3-78cd-4939-a63b-8079e261b386-config-data\") pod \"a766edf3-78cd-4939-a63b-8079e261b386\" (UID: \"a766edf3-78cd-4939-a63b-8079e261b386\") " Feb 02 10:55:39 crc kubenswrapper[4901]: I0202 10:55:39.461020 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a766edf3-78cd-4939-a63b-8079e261b386-combined-ca-bundle\") pod \"a766edf3-78cd-4939-a63b-8079e261b386\" (UID: \"a766edf3-78cd-4939-a63b-8079e261b386\") " Feb 02 10:55:39 crc kubenswrapper[4901]: I0202 10:55:39.461189 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crk4w\" (UniqueName: \"kubernetes.io/projected/a766edf3-78cd-4939-a63b-8079e261b386-kube-api-access-crk4w\") pod \"a766edf3-78cd-4939-a63b-8079e261b386\" (UID: \"a766edf3-78cd-4939-a63b-8079e261b386\") " Feb 02 10:55:39 crc kubenswrapper[4901]: I0202 10:55:39.481826 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a766edf3-78cd-4939-a63b-8079e261b386-kube-api-access-crk4w" (OuterVolumeSpecName: "kube-api-access-crk4w") pod "a766edf3-78cd-4939-a63b-8079e261b386" (UID: "a766edf3-78cd-4939-a63b-8079e261b386"). InnerVolumeSpecName "kube-api-access-crk4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:55:39 crc kubenswrapper[4901]: I0202 10:55:39.506119 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a766edf3-78cd-4939-a63b-8079e261b386-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a766edf3-78cd-4939-a63b-8079e261b386" (UID: "a766edf3-78cd-4939-a63b-8079e261b386"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:39 crc kubenswrapper[4901]: I0202 10:55:39.515247 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a766edf3-78cd-4939-a63b-8079e261b386-config-data" (OuterVolumeSpecName: "config-data") pod "a766edf3-78cd-4939-a63b-8079e261b386" (UID: "a766edf3-78cd-4939-a63b-8079e261b386"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:39 crc kubenswrapper[4901]: I0202 10:55:39.566115 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a766edf3-78cd-4939-a63b-8079e261b386-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:39 crc kubenswrapper[4901]: I0202 10:55:39.570528 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crk4w\" (UniqueName: \"kubernetes.io/projected/a766edf3-78cd-4939-a63b-8079e261b386-kube-api-access-crk4w\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:39 crc kubenswrapper[4901]: I0202 10:55:39.570868 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a766edf3-78cd-4939-a63b-8079e261b386-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.135610 4901 generic.go:334] "Generic (PLEG): container finished" podID="e2b8b88a-99e5-4c15-b293-5f2fa80f0a07" containerID="7c75be964da78ef688ce27fa517630a0cb0c608495a5510b1e1cbf3e1ca7a518" exitCode=0 Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.135727 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-tttj2" event={"ID":"e2b8b88a-99e5-4c15-b293-5f2fa80f0a07","Type":"ContainerDied","Data":"7c75be964da78ef688ce27fa517630a0cb0c608495a5510b1e1cbf3e1ca7a518"} Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.136219 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-tttj2" event={"ID":"e2b8b88a-99e5-4c15-b293-5f2fa80f0a07","Type":"ContainerStarted","Data":"5931acc786b66d4600b7da6bdaf1966e56615d20fae93e1f8cdae234761b8c6d"} Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.143734 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xr6pb" event={"ID":"a766edf3-78cd-4939-a63b-8079e261b386","Type":"ContainerDied","Data":"cb1b31393ec727cba80b04a9c9c921e528ee5a6a2760451d7c7a173b97170700"} Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.143783 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb1b31393ec727cba80b04a9c9c921e528ee5a6a2760451d7c7a173b97170700" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.143920 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74dc88fc-2p24z" podUID="421ae8ba-9f7c-461b-b757-f2ffccc8284c" containerName="dnsmasq-dns" containerID="cri-o://5d7356d9bf02d72817106ed866113ae9c5a99934b18948bbfd34f5338b5adbb8" gracePeriod=10 Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.143953 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-xr6pb" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.613670 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-spsdj"] Feb 02 10:55:40 crc kubenswrapper[4901]: E0202 10:55:40.614246 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a766edf3-78cd-4939-a63b-8079e261b386" containerName="keystone-db-sync" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.614262 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a766edf3-78cd-4939-a63b-8079e261b386" containerName="keystone-db-sync" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.614533 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="a766edf3-78cd-4939-a63b-8079e261b386" containerName="keystone-db-sync" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.615444 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-spsdj" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.620240 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.620595 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nx7b8" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.620761 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.621028 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.621222 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.626104 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-tttj2"] Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.664432 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-spsdj"] Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.674223 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-2p24z" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.699180 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9cf3153-081c-42c9-a525-354ccaca7abd-scripts\") pod \"keystone-bootstrap-spsdj\" (UID: \"d9cf3153-081c-42c9-a525-354ccaca7abd\") " pod="openstack/keystone-bootstrap-spsdj" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.699233 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9cf3153-081c-42c9-a525-354ccaca7abd-config-data\") pod \"keystone-bootstrap-spsdj\" (UID: \"d9cf3153-081c-42c9-a525-354ccaca7abd\") " pod="openstack/keystone-bootstrap-spsdj" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.699334 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fvbz\" (UniqueName: \"kubernetes.io/projected/d9cf3153-081c-42c9-a525-354ccaca7abd-kube-api-access-9fvbz\") pod \"keystone-bootstrap-spsdj\" (UID: \"d9cf3153-081c-42c9-a525-354ccaca7abd\") " pod="openstack/keystone-bootstrap-spsdj" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.699355 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d9cf3153-081c-42c9-a525-354ccaca7abd-credential-keys\") pod \"keystone-bootstrap-spsdj\" (UID: \"d9cf3153-081c-42c9-a525-354ccaca7abd\") " pod="openstack/keystone-bootstrap-spsdj" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.699385 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9cf3153-081c-42c9-a525-354ccaca7abd-combined-ca-bundle\") pod \"keystone-bootstrap-spsdj\" (UID: \"d9cf3153-081c-42c9-a525-354ccaca7abd\") " pod="openstack/keystone-bootstrap-spsdj" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.699432 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d9cf3153-081c-42c9-a525-354ccaca7abd-fernet-keys\") pod \"keystone-bootstrap-spsdj\" (UID: \"d9cf3153-081c-42c9-a525-354ccaca7abd\") " pod="openstack/keystone-bootstrap-spsdj" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.723669 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-vtqqt"] Feb 02 10:55:40 crc kubenswrapper[4901]: E0202 10:55:40.724082 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="421ae8ba-9f7c-461b-b757-f2ffccc8284c" containerName="init" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.724094 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="421ae8ba-9f7c-461b-b757-f2ffccc8284c" containerName="init" Feb 02 10:55:40 crc kubenswrapper[4901]: E0202 10:55:40.724112 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="421ae8ba-9f7c-461b-b757-f2ffccc8284c" containerName="dnsmasq-dns" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.724119 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="421ae8ba-9f7c-461b-b757-f2ffccc8284c" containerName="dnsmasq-dns" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.724277 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="421ae8ba-9f7c-461b-b757-f2ffccc8284c" 
containerName="dnsmasq-dns" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.725222 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-vtqqt" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.772186 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-vtqqt"] Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.801769 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmc99\" (UniqueName: \"kubernetes.io/projected/421ae8ba-9f7c-461b-b757-f2ffccc8284c-kube-api-access-bmc99\") pod \"421ae8ba-9f7c-461b-b757-f2ffccc8284c\" (UID: \"421ae8ba-9f7c-461b-b757-f2ffccc8284c\") " Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.801959 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/421ae8ba-9f7c-461b-b757-f2ffccc8284c-dns-svc\") pod \"421ae8ba-9f7c-461b-b757-f2ffccc8284c\" (UID: \"421ae8ba-9f7c-461b-b757-f2ffccc8284c\") " Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.802033 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/421ae8ba-9f7c-461b-b757-f2ffccc8284c-ovsdbserver-sb\") pod \"421ae8ba-9f7c-461b-b757-f2ffccc8284c\" (UID: \"421ae8ba-9f7c-461b-b757-f2ffccc8284c\") " Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.802122 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/421ae8ba-9f7c-461b-b757-f2ffccc8284c-config\") pod \"421ae8ba-9f7c-461b-b757-f2ffccc8284c\" (UID: \"421ae8ba-9f7c-461b-b757-f2ffccc8284c\") " Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.802162 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/421ae8ba-9f7c-461b-b757-f2ffccc8284c-ovsdbserver-nb\") pod \"421ae8ba-9f7c-461b-b757-f2ffccc8284c\" (UID: \"421ae8ba-9f7c-461b-b757-f2ffccc8284c\") " Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.802338 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/674534db-de83-484c-9876-6eef192987b6-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-vtqqt\" (UID: \"674534db-de83-484c-9876-6eef192987b6\") " pod="openstack/dnsmasq-dns-bbf5cc879-vtqqt" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.802362 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/674534db-de83-484c-9876-6eef192987b6-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-vtqqt\" (UID: \"674534db-de83-484c-9876-6eef192987b6\") " pod="openstack/dnsmasq-dns-bbf5cc879-vtqqt" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.802409 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4982\" (UniqueName: \"kubernetes.io/projected/674534db-de83-484c-9876-6eef192987b6-kube-api-access-k4982\") pod \"dnsmasq-dns-bbf5cc879-vtqqt\" (UID: \"674534db-de83-484c-9876-6eef192987b6\") " pod="openstack/dnsmasq-dns-bbf5cc879-vtqqt" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.802456 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/d9cf3153-081c-42c9-a525-354ccaca7abd-fernet-keys\") pod \"keystone-bootstrap-spsdj\" (UID: \"d9cf3153-081c-42c9-a525-354ccaca7abd\") " pod="openstack/keystone-bootstrap-spsdj" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.802474 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9cf3153-081c-42c9-a525-354ccaca7abd-scripts\") pod \"keystone-bootstrap-spsdj\" (UID: \"d9cf3153-081c-42c9-a525-354ccaca7abd\") " pod="openstack/keystone-bootstrap-spsdj" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.802504 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9cf3153-081c-42c9-a525-354ccaca7abd-config-data\") pod \"keystone-bootstrap-spsdj\" (UID: \"d9cf3153-081c-42c9-a525-354ccaca7abd\") " pod="openstack/keystone-bootstrap-spsdj" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.802530 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/674534db-de83-484c-9876-6eef192987b6-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-vtqqt\" (UID: \"674534db-de83-484c-9876-6eef192987b6\") " pod="openstack/dnsmasq-dns-bbf5cc879-vtqqt" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.802609 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/674534db-de83-484c-9876-6eef192987b6-config\") pod \"dnsmasq-dns-bbf5cc879-vtqqt\" (UID: \"674534db-de83-484c-9876-6eef192987b6\") " pod="openstack/dnsmasq-dns-bbf5cc879-vtqqt" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.802629 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/674534db-de83-484c-9876-6eef192987b6-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-vtqqt\" (UID: \"674534db-de83-484c-9876-6eef192987b6\") " pod="openstack/dnsmasq-dns-bbf5cc879-vtqqt" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.802681 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fvbz\" (UniqueName: \"kubernetes.io/projected/d9cf3153-081c-42c9-a525-354ccaca7abd-kube-api-access-9fvbz\") pod \"keystone-bootstrap-spsdj\" (UID: \"d9cf3153-081c-42c9-a525-354ccaca7abd\") " pod="openstack/keystone-bootstrap-spsdj" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.802705 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d9cf3153-081c-42c9-a525-354ccaca7abd-credential-keys\") pod \"keystone-bootstrap-spsdj\" (UID: \"d9cf3153-081c-42c9-a525-354ccaca7abd\") " pod="openstack/keystone-bootstrap-spsdj" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.805400 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9cf3153-081c-42c9-a525-354ccaca7abd-combined-ca-bundle\") pod \"keystone-bootstrap-spsdj\" (UID: \"d9cf3153-081c-42c9-a525-354ccaca7abd\") " pod="openstack/keystone-bootstrap-spsdj" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.837130 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/421ae8ba-9f7c-461b-b757-f2ffccc8284c-kube-api-access-bmc99" (OuterVolumeSpecName: 
"kube-api-access-bmc99") pod "421ae8ba-9f7c-461b-b757-f2ffccc8284c" (UID: "421ae8ba-9f7c-461b-b757-f2ffccc8284c"). InnerVolumeSpecName "kube-api-access-bmc99". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.844761 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9cf3153-081c-42c9-a525-354ccaca7abd-combined-ca-bundle\") pod \"keystone-bootstrap-spsdj\" (UID: \"d9cf3153-081c-42c9-a525-354ccaca7abd\") " pod="openstack/keystone-bootstrap-spsdj" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.845386 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9cf3153-081c-42c9-a525-354ccaca7abd-config-data\") pod \"keystone-bootstrap-spsdj\" (UID: \"d9cf3153-081c-42c9-a525-354ccaca7abd\") " pod="openstack/keystone-bootstrap-spsdj" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.899541 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-hcpj4"] Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.900633 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9cf3153-081c-42c9-a525-354ccaca7abd-scripts\") pod \"keystone-bootstrap-spsdj\" (UID: \"d9cf3153-081c-42c9-a525-354ccaca7abd\") " pod="openstack/keystone-bootstrap-spsdj" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.905222 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-hcpj4" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.912454 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/674534db-de83-484c-9876-6eef192987b6-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-vtqqt\" (UID: \"674534db-de83-484c-9876-6eef192987b6\") " pod="openstack/dnsmasq-dns-bbf5cc879-vtqqt" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.912498 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/674534db-de83-484c-9876-6eef192987b6-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-vtqqt\" (UID: \"674534db-de83-484c-9876-6eef192987b6\") " pod="openstack/dnsmasq-dns-bbf5cc879-vtqqt" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.912571 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4982\" (UniqueName: \"kubernetes.io/projected/674534db-de83-484c-9876-6eef192987b6-kube-api-access-k4982\") pod \"dnsmasq-dns-bbf5cc879-vtqqt\" (UID: \"674534db-de83-484c-9876-6eef192987b6\") " pod="openstack/dnsmasq-dns-bbf5cc879-vtqqt" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.912673 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/674534db-de83-484c-9876-6eef192987b6-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-vtqqt\" (UID: \"674534db-de83-484c-9876-6eef192987b6\") " pod="openstack/dnsmasq-dns-bbf5cc879-vtqqt" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.912791 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/674534db-de83-484c-9876-6eef192987b6-config\") pod \"dnsmasq-dns-bbf5cc879-vtqqt\" (UID: \"674534db-de83-484c-9876-6eef192987b6\") " 
pod="openstack/dnsmasq-dns-bbf5cc879-vtqqt" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.912819 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/674534db-de83-484c-9876-6eef192987b6-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-vtqqt\" (UID: \"674534db-de83-484c-9876-6eef192987b6\") " pod="openstack/dnsmasq-dns-bbf5cc879-vtqqt" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.912982 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmc99\" (UniqueName: \"kubernetes.io/projected/421ae8ba-9f7c-461b-b757-f2ffccc8284c-kube-api-access-bmc99\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.913862 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/674534db-de83-484c-9876-6eef192987b6-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-vtqqt\" (UID: \"674534db-de83-484c-9876-6eef192987b6\") " pod="openstack/dnsmasq-dns-bbf5cc879-vtqqt" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.915867 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d9cf3153-081c-42c9-a525-354ccaca7abd-credential-keys\") pod \"keystone-bootstrap-spsdj\" (UID: \"d9cf3153-081c-42c9-a525-354ccaca7abd\") " pod="openstack/keystone-bootstrap-spsdj" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.916778 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/674534db-de83-484c-9876-6eef192987b6-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-vtqqt\" (UID: \"674534db-de83-484c-9876-6eef192987b6\") " pod="openstack/dnsmasq-dns-bbf5cc879-vtqqt" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.917310 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/674534db-de83-484c-9876-6eef192987b6-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-vtqqt\" (UID: \"674534db-de83-484c-9876-6eef192987b6\") " pod="openstack/dnsmasq-dns-bbf5cc879-vtqqt" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.918753 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/674534db-de83-484c-9876-6eef192987b6-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-vtqqt\" (UID: \"674534db-de83-484c-9876-6eef192987b6\") " pod="openstack/dnsmasq-dns-bbf5cc879-vtqqt" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.919516 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/674534db-de83-484c-9876-6eef192987b6-config\") pod \"dnsmasq-dns-bbf5cc879-vtqqt\" (UID: \"674534db-de83-484c-9876-6eef192987b6\") " pod="openstack/dnsmasq-dns-bbf5cc879-vtqqt" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.921415 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-bb5r6" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.922278 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.976125 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d9cf3153-081c-42c9-a525-354ccaca7abd-fernet-keys\") pod \"keystone-bootstrap-spsdj\" 
(UID: \"d9cf3153-081c-42c9-a525-354ccaca7abd\") " pod="openstack/keystone-bootstrap-spsdj" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.976998 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-hcpj4"] Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.980729 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fvbz\" (UniqueName: \"kubernetes.io/projected/d9cf3153-081c-42c9-a525-354ccaca7abd-kube-api-access-9fvbz\") pod \"keystone-bootstrap-spsdj\" (UID: \"d9cf3153-081c-42c9-a525-354ccaca7abd\") " pod="openstack/keystone-bootstrap-spsdj" Feb 02 10:55:40 crc kubenswrapper[4901]: I0202 10:55:40.998204 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-spsdj" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.008973 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4982\" (UniqueName: \"kubernetes.io/projected/674534db-de83-484c-9876-6eef192987b6-kube-api-access-k4982\") pod \"dnsmasq-dns-bbf5cc879-vtqqt\" (UID: \"674534db-de83-484c-9876-6eef192987b6\") " pod="openstack/dnsmasq-dns-bbf5cc879-vtqqt" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.018927 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cccec66b-6bb1-4799-9385-73a33d1cacec-combined-ca-bundle\") pod \"heat-db-sync-hcpj4\" (UID: \"cccec66b-6bb1-4799-9385-73a33d1cacec\") " pod="openstack/heat-db-sync-hcpj4" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.019017 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cccec66b-6bb1-4799-9385-73a33d1cacec-config-data\") pod \"heat-db-sync-hcpj4\" (UID: \"cccec66b-6bb1-4799-9385-73a33d1cacec\") " pod="openstack/heat-db-sync-hcpj4" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.019058 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vs7wj\" (UniqueName: \"kubernetes.io/projected/cccec66b-6bb1-4799-9385-73a33d1cacec-kube-api-access-vs7wj\") pod \"heat-db-sync-hcpj4\" (UID: \"cccec66b-6bb1-4799-9385-73a33d1cacec\") " pod="openstack/heat-db-sync-hcpj4" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.054383 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/421ae8ba-9f7c-461b-b757-f2ffccc8284c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "421ae8ba-9f7c-461b-b757-f2ffccc8284c" (UID: "421ae8ba-9f7c-461b-b757-f2ffccc8284c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.075892 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-vtqqt" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.087297 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/421ae8ba-9f7c-461b-b757-f2ffccc8284c-config" (OuterVolumeSpecName: "config") pod "421ae8ba-9f7c-461b-b757-f2ffccc8284c" (UID: "421ae8ba-9f7c-461b-b757-f2ffccc8284c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.120923 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cccec66b-6bb1-4799-9385-73a33d1cacec-config-data\") pod \"heat-db-sync-hcpj4\" (UID: \"cccec66b-6bb1-4799-9385-73a33d1cacec\") " pod="openstack/heat-db-sync-hcpj4" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.121015 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vs7wj\" (UniqueName: \"kubernetes.io/projected/cccec66b-6bb1-4799-9385-73a33d1cacec-kube-api-access-vs7wj\") pod \"heat-db-sync-hcpj4\" (UID: \"cccec66b-6bb1-4799-9385-73a33d1cacec\") " pod="openstack/heat-db-sync-hcpj4" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.121073 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cccec66b-6bb1-4799-9385-73a33d1cacec-combined-ca-bundle\") pod \"heat-db-sync-hcpj4\" (UID: \"cccec66b-6bb1-4799-9385-73a33d1cacec\") " pod="openstack/heat-db-sync-hcpj4" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.121169 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/421ae8ba-9f7c-461b-b757-f2ffccc8284c-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.121202 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/421ae8ba-9f7c-461b-b757-f2ffccc8284c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.135549 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-2hz6j"] Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.137140 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-2hz6j" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.149771 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/421ae8ba-9f7c-461b-b757-f2ffccc8284c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "421ae8ba-9f7c-461b-b757-f2ffccc8284c" (UID: "421ae8ba-9f7c-461b-b757-f2ffccc8284c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.161430 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cccec66b-6bb1-4799-9385-73a33d1cacec-combined-ca-bundle\") pod \"heat-db-sync-hcpj4\" (UID: \"cccec66b-6bb1-4799-9385-73a33d1cacec\") " pod="openstack/heat-db-sync-hcpj4" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.171362 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cccec66b-6bb1-4799-9385-73a33d1cacec-config-data\") pod \"heat-db-sync-hcpj4\" (UID: \"cccec66b-6bb1-4799-9385-73a33d1cacec\") " pod="openstack/heat-db-sync-hcpj4" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.212548 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.212759 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.213226 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-mzgkh" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.214178 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/421ae8ba-9f7c-461b-b757-f2ffccc8284c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "421ae8ba-9f7c-461b-b757-f2ffccc8284c" (UID: "421ae8ba-9f7c-461b-b757-f2ffccc8284c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.222753 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dc5e40d-ef90-4040-a247-114b55e0efa1-combined-ca-bundle\") pod \"cinder-db-sync-2hz6j\" (UID: \"9dc5e40d-ef90-4040-a247-114b55e0efa1\") " pod="openstack/cinder-db-sync-2hz6j" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.222815 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9dc5e40d-ef90-4040-a247-114b55e0efa1-etc-machine-id\") pod \"cinder-db-sync-2hz6j\" (UID: \"9dc5e40d-ef90-4040-a247-114b55e0efa1\") " pod="openstack/cinder-db-sync-2hz6j" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.222834 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbkdq\" (UniqueName: \"kubernetes.io/projected/9dc5e40d-ef90-4040-a247-114b55e0efa1-kube-api-access-gbkdq\") pod \"cinder-db-sync-2hz6j\" (UID: \"9dc5e40d-ef90-4040-a247-114b55e0efa1\") " pod="openstack/cinder-db-sync-2hz6j" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.222918 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9dc5e40d-ef90-4040-a247-114b55e0efa1-db-sync-config-data\") pod \"cinder-db-sync-2hz6j\" (UID: \"9dc5e40d-ef90-4040-a247-114b55e0efa1\") " pod="openstack/cinder-db-sync-2hz6j" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.222997 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dc5e40d-ef90-4040-a247-114b55e0efa1-config-data\") 
pod \"cinder-db-sync-2hz6j\" (UID: \"9dc5e40d-ef90-4040-a247-114b55e0efa1\") " pod="openstack/cinder-db-sync-2hz6j" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.223019 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dc5e40d-ef90-4040-a247-114b55e0efa1-scripts\") pod \"cinder-db-sync-2hz6j\" (UID: \"9dc5e40d-ef90-4040-a247-114b55e0efa1\") " pod="openstack/cinder-db-sync-2hz6j" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.223072 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/421ae8ba-9f7c-461b-b757-f2ffccc8284c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.223085 4901 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/421ae8ba-9f7c-461b-b757-f2ffccc8284c-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.266230 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-2hz6j"] Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.267389 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs7wj\" (UniqueName: \"kubernetes.io/projected/cccec66b-6bb1-4799-9385-73a33d1cacec-kube-api-access-vs7wj\") pod \"heat-db-sync-hcpj4\" (UID: \"cccec66b-6bb1-4799-9385-73a33d1cacec\") " pod="openstack/heat-db-sync-hcpj4" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.300924 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-h9fw8"] Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.302130 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-h9fw8" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.312266 4901 generic.go:334] "Generic (PLEG): container finished" podID="421ae8ba-9f7c-461b-b757-f2ffccc8284c" containerID="5d7356d9bf02d72817106ed866113ae9c5a99934b18948bbfd34f5338b5adbb8" exitCode=0 Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.312533 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-2p24z" event={"ID":"421ae8ba-9f7c-461b-b757-f2ffccc8284c","Type":"ContainerDied","Data":"5d7356d9bf02d72817106ed866113ae9c5a99934b18948bbfd34f5338b5adbb8"} Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.312622 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-2p24z" event={"ID":"421ae8ba-9f7c-461b-b757-f2ffccc8284c","Type":"ContainerDied","Data":"4dfbdb6cc6ab9b7f04495d08c217a97908bff6b4610bc2deeae716a86487c040"} Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.312643 4901 scope.go:117] "RemoveContainer" containerID="5d7356d9bf02d72817106ed866113ae9c5a99934b18948bbfd34f5338b5adbb8" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.312779 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-2p24z" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.329160 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.330586 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-h9fw8"] Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.331235 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbkdq\" (UniqueName: \"kubernetes.io/projected/9dc5e40d-ef90-4040-a247-114b55e0efa1-kube-api-access-gbkdq\") pod \"cinder-db-sync-2hz6j\" (UID: \"9dc5e40d-ef90-4040-a247-114b55e0efa1\") " pod="openstack/cinder-db-sync-2hz6j" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.331275 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9dc5e40d-ef90-4040-a247-114b55e0efa1-etc-machine-id\") pod \"cinder-db-sync-2hz6j\" (UID: \"9dc5e40d-ef90-4040-a247-114b55e0efa1\") " pod="openstack/cinder-db-sync-2hz6j" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.331358 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9dc5e40d-ef90-4040-a247-114b55e0efa1-db-sync-config-data\") pod \"cinder-db-sync-2hz6j\" (UID: \"9dc5e40d-ef90-4040-a247-114b55e0efa1\") " pod="openstack/cinder-db-sync-2hz6j" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.331398 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dc5e40d-ef90-4040-a247-114b55e0efa1-config-data\") pod \"cinder-db-sync-2hz6j\" (UID: \"9dc5e40d-ef90-4040-a247-114b55e0efa1\") " pod="openstack/cinder-db-sync-2hz6j" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.331420 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dc5e40d-ef90-4040-a247-114b55e0efa1-scripts\") pod \"cinder-db-sync-2hz6j\" (UID: \"9dc5e40d-ef90-4040-a247-114b55e0efa1\") " pod="openstack/cinder-db-sync-2hz6j" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.331458 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dc5e40d-ef90-4040-a247-114b55e0efa1-combined-ca-bundle\") pod \"cinder-db-sync-2hz6j\" (UID: \"9dc5e40d-ef90-4040-a247-114b55e0efa1\") " pod="openstack/cinder-db-sync-2hz6j" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.331541 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-z5cj9" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.334018 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9dc5e40d-ef90-4040-a247-114b55e0efa1-etc-machine-id\") pod \"cinder-db-sync-2hz6j\" (UID: \"9dc5e40d-ef90-4040-a247-114b55e0efa1\") " pod="openstack/cinder-db-sync-2hz6j" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.338580 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dc5e40d-ef90-4040-a247-114b55e0efa1-config-data\") pod \"cinder-db-sync-2hz6j\" (UID: \"9dc5e40d-ef90-4040-a247-114b55e0efa1\") " pod="openstack/cinder-db-sync-2hz6j" Feb 02 10:55:41 crc 
kubenswrapper[4901]: I0202 10:55:41.343908 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-hcpj4" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.348436 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dc5e40d-ef90-4040-a247-114b55e0efa1-combined-ca-bundle\") pod \"cinder-db-sync-2hz6j\" (UID: \"9dc5e40d-ef90-4040-a247-114b55e0efa1\") " pod="openstack/cinder-db-sync-2hz6j" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.348685 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.352693 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-tttj2" event={"ID":"e2b8b88a-99e5-4c15-b293-5f2fa80f0a07","Type":"ContainerStarted","Data":"02f0f230e214f7fda328fb024dfe131ceab0bdfdd0d8837f4d8d1466ba9ccbdd"} Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.353417 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f59b8f679-tttj2" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.355882 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dc5e40d-ef90-4040-a247-114b55e0efa1-scripts\") pod \"cinder-db-sync-2hz6j\" (UID: \"9dc5e40d-ef90-4040-a247-114b55e0efa1\") " pod="openstack/cinder-db-sync-2hz6j" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.362221 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9dc5e40d-ef90-4040-a247-114b55e0efa1-db-sync-config-data\") pod \"cinder-db-sync-2hz6j\" (UID: \"9dc5e40d-ef90-4040-a247-114b55e0efa1\") " pod="openstack/cinder-db-sync-2hz6j" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.398255 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbkdq\" (UniqueName: \"kubernetes.io/projected/9dc5e40d-ef90-4040-a247-114b55e0efa1-kube-api-access-gbkdq\") pod \"cinder-db-sync-2hz6j\" (UID: \"9dc5e40d-ef90-4040-a247-114b55e0efa1\") " pod="openstack/cinder-db-sync-2hz6j" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.402633 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.405551 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.408969 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.409162 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.433628 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh8jt\" (UniqueName: \"kubernetes.io/projected/4426b588-ce3e-4184-bc0b-82f17522dd01-kube-api-access-dh8jt\") pod \"neutron-db-sync-h9fw8\" (UID: \"4426b588-ce3e-4184-bc0b-82f17522dd01\") " pod="openstack/neutron-db-sync-h9fw8" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.433671 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4426b588-ce3e-4184-bc0b-82f17522dd01-combined-ca-bundle\") pod \"neutron-db-sync-h9fw8\" (UID: \"4426b588-ce3e-4184-bc0b-82f17522dd01\") " pod="openstack/neutron-db-sync-h9fw8" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.433743 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4426b588-ce3e-4184-bc0b-82f17522dd01-config\") pod \"neutron-db-sync-h9fw8\" (UID: \"4426b588-ce3e-4184-bc0b-82f17522dd01\") " pod="openstack/neutron-db-sync-h9fw8" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.458710 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.469689 4901 scope.go:117] "RemoveContainer" containerID="9ab13a2c9adb483b3504f721c2e20dcb022ea5f6fb3dfc5c811f65b61c0a8b8c" Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.541554 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-2hz6j"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.580665 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74392bcb-ba65-45cd-8f9b-32894528aca3-scripts\") pod \"ceilometer-0\" (UID: \"74392bcb-ba65-45cd-8f9b-32894528aca3\") " pod="openstack/ceilometer-0"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.580740 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74392bcb-ba65-45cd-8f9b-32894528aca3-config-data\") pod \"ceilometer-0\" (UID: \"74392bcb-ba65-45cd-8f9b-32894528aca3\") " pod="openstack/ceilometer-0"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.580822 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74392bcb-ba65-45cd-8f9b-32894528aca3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"74392bcb-ba65-45cd-8f9b-32894528aca3\") " pod="openstack/ceilometer-0"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.580870 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj54w\" (UniqueName: \"kubernetes.io/projected/74392bcb-ba65-45cd-8f9b-32894528aca3-kube-api-access-fj54w\") pod \"ceilometer-0\" (UID: \"74392bcb-ba65-45cd-8f9b-32894528aca3\") " pod="openstack/ceilometer-0"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.581010 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh8jt\" (UniqueName: \"kubernetes.io/projected/4426b588-ce3e-4184-bc0b-82f17522dd01-kube-api-access-dh8jt\") pod \"neutron-db-sync-h9fw8\" (UID: \"4426b588-ce3e-4184-bc0b-82f17522dd01\") " pod="openstack/neutron-db-sync-h9fw8"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.581042 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4426b588-ce3e-4184-bc0b-82f17522dd01-combined-ca-bundle\") pod \"neutron-db-sync-h9fw8\" (UID: \"4426b588-ce3e-4184-bc0b-82f17522dd01\") " pod="openstack/neutron-db-sync-h9fw8"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.581185 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74392bcb-ba65-45cd-8f9b-32894528aca3-log-httpd\") pod \"ceilometer-0\" (UID: \"74392bcb-ba65-45cd-8f9b-32894528aca3\") " pod="openstack/ceilometer-0"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.581220 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4426b588-ce3e-4184-bc0b-82f17522dd01-config\") pod \"neutron-db-sync-h9fw8\" (UID: \"4426b588-ce3e-4184-bc0b-82f17522dd01\") " pod="openstack/neutron-db-sync-h9fw8"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.581243 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74392bcb-ba65-45cd-8f9b-32894528aca3-run-httpd\") pod \"ceilometer-0\" (UID: \"74392bcb-ba65-45cd-8f9b-32894528aca3\") " pod="openstack/ceilometer-0"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.581337 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/74392bcb-ba65-45cd-8f9b-32894528aca3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"74392bcb-ba65-45cd-8f9b-32894528aca3\") " pod="openstack/ceilometer-0"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.616763 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-xh5rm"]
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.622084 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-xh5rm"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.629423 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-fxf28"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.629847 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.633721 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4426b588-ce3e-4184-bc0b-82f17522dd01-config\") pod \"neutron-db-sync-h9fw8\" (UID: \"4426b588-ce3e-4184-bc0b-82f17522dd01\") " pod="openstack/neutron-db-sync-h9fw8"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.633911 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.646349 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh8jt\" (UniqueName: \"kubernetes.io/projected/4426b588-ce3e-4184-bc0b-82f17522dd01-kube-api-access-dh8jt\") pod \"neutron-db-sync-h9fw8\" (UID: \"4426b588-ce3e-4184-bc0b-82f17522dd01\") " pod="openstack/neutron-db-sync-h9fw8"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.650763 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-xh5rm"]
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.662419 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-vtqqt"]
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.677601 4901 scope.go:117] "RemoveContainer" containerID="5d7356d9bf02d72817106ed866113ae9c5a99934b18948bbfd34f5338b5adbb8"
Feb 02 10:55:41 crc kubenswrapper[4901]: E0202 10:55:41.679057 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d7356d9bf02d72817106ed866113ae9c5a99934b18948bbfd34f5338b5adbb8\": container with ID starting with 5d7356d9bf02d72817106ed866113ae9c5a99934b18948bbfd34f5338b5adbb8 not found: ID does not exist" containerID="5d7356d9bf02d72817106ed866113ae9c5a99934b18948bbfd34f5338b5adbb8"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.679163 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d7356d9bf02d72817106ed866113ae9c5a99934b18948bbfd34f5338b5adbb8"} err="failed to get container status \"5d7356d9bf02d72817106ed866113ae9c5a99934b18948bbfd34f5338b5adbb8\": rpc error: code = NotFound desc = could not find container \"5d7356d9bf02d72817106ed866113ae9c5a99934b18948bbfd34f5338b5adbb8\": container with ID starting with 5d7356d9bf02d72817106ed866113ae9c5a99934b18948bbfd34f5338b5adbb8 not found: ID does not exist"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.679212 4901 scope.go:117] "RemoveContainer" containerID="9ab13a2c9adb483b3504f721c2e20dcb022ea5f6fb3dfc5c811f65b61c0a8b8c"
Feb 02 10:55:41 crc kubenswrapper[4901]: E0202 10:55:41.680017 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ab13a2c9adb483b3504f721c2e20dcb022ea5f6fb3dfc5c811f65b61c0a8b8c\": container with ID starting with 9ab13a2c9adb483b3504f721c2e20dcb022ea5f6fb3dfc5c811f65b61c0a8b8c not found: ID does not exist" containerID="9ab13a2c9adb483b3504f721c2e20dcb022ea5f6fb3dfc5c811f65b61c0a8b8c"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.680071 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ab13a2c9adb483b3504f721c2e20dcb022ea5f6fb3dfc5c811f65b61c0a8b8c"} err="failed to get container status \"9ab13a2c9adb483b3504f721c2e20dcb022ea5f6fb3dfc5c811f65b61c0a8b8c\": rpc error: code = NotFound desc = could not find container \"9ab13a2c9adb483b3504f721c2e20dcb022ea5f6fb3dfc5c811f65b61c0a8b8c\": container with ID starting with 9ab13a2c9adb483b3504f721c2e20dcb022ea5f6fb3dfc5c811f65b61c0a8b8c not found: ID does not exist"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.731547 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4426b588-ce3e-4184-bc0b-82f17522dd01-combined-ca-bundle\") pod \"neutron-db-sync-h9fw8\" (UID: \"4426b588-ce3e-4184-bc0b-82f17522dd01\") " pod="openstack/neutron-db-sync-h9fw8"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.848319 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxcvr\" (UniqueName: \"kubernetes.io/projected/302af226-4326-48ef-bef0-02fab3943dbe-kube-api-access-dxcvr\") pod \"placement-db-sync-xh5rm\" (UID: \"302af226-4326-48ef-bef0-02fab3943dbe\") " pod="openstack/placement-db-sync-xh5rm"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.848390 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/302af226-4326-48ef-bef0-02fab3943dbe-config-data\") pod \"placement-db-sync-xh5rm\" (UID: \"302af226-4326-48ef-bef0-02fab3943dbe\") " pod="openstack/placement-db-sync-xh5rm"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.848528 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/302af226-4326-48ef-bef0-02fab3943dbe-combined-ca-bundle\") pod \"placement-db-sync-xh5rm\" (UID: \"302af226-4326-48ef-bef0-02fab3943dbe\") " pod="openstack/placement-db-sync-xh5rm"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.848582 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74392bcb-ba65-45cd-8f9b-32894528aca3-log-httpd\") pod \"ceilometer-0\" (UID: \"74392bcb-ba65-45cd-8f9b-32894528aca3\") " pod="openstack/ceilometer-0"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.848617 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74392bcb-ba65-45cd-8f9b-32894528aca3-run-httpd\") pod \"ceilometer-0\" (UID: \"74392bcb-ba65-45cd-8f9b-32894528aca3\") " pod="openstack/ceilometer-0"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.848668 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/74392bcb-ba65-45cd-8f9b-32894528aca3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"74392bcb-ba65-45cd-8f9b-32894528aca3\") " pod="openstack/ceilometer-0"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.848704 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74392bcb-ba65-45cd-8f9b-32894528aca3-scripts\") pod \"ceilometer-0\" (UID: \"74392bcb-ba65-45cd-8f9b-32894528aca3\") " pod="openstack/ceilometer-0"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.848735 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74392bcb-ba65-45cd-8f9b-32894528aca3-config-data\") pod \"ceilometer-0\" (UID: \"74392bcb-ba65-45cd-8f9b-32894528aca3\") " pod="openstack/ceilometer-0"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.848751 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/302af226-4326-48ef-bef0-02fab3943dbe-scripts\") pod \"placement-db-sync-xh5rm\" (UID: \"302af226-4326-48ef-bef0-02fab3943dbe\") " pod="openstack/placement-db-sync-xh5rm"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.849411 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74392bcb-ba65-45cd-8f9b-32894528aca3-log-httpd\") pod \"ceilometer-0\" (UID: \"74392bcb-ba65-45cd-8f9b-32894528aca3\") " pod="openstack/ceilometer-0"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.849662 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74392bcb-ba65-45cd-8f9b-32894528aca3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"74392bcb-ba65-45cd-8f9b-32894528aca3\") " pod="openstack/ceilometer-0"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.849702 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj54w\" (UniqueName: \"kubernetes.io/projected/74392bcb-ba65-45cd-8f9b-32894528aca3-kube-api-access-fj54w\") pod \"ceilometer-0\" (UID: \"74392bcb-ba65-45cd-8f9b-32894528aca3\") " pod="openstack/ceilometer-0"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.849796 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/302af226-4326-48ef-bef0-02fab3943dbe-logs\") pod \"placement-db-sync-xh5rm\" (UID: \"302af226-4326-48ef-bef0-02fab3943dbe\") " pod="openstack/placement-db-sync-xh5rm"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.853604 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74392bcb-ba65-45cd-8f9b-32894528aca3-run-httpd\") pod \"ceilometer-0\" (UID: \"74392bcb-ba65-45cd-8f9b-32894528aca3\") " pod="openstack/ceilometer-0"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.853756 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74392bcb-ba65-45cd-8f9b-32894528aca3-scripts\") pod \"ceilometer-0\" (UID: \"74392bcb-ba65-45cd-8f9b-32894528aca3\") " pod="openstack/ceilometer-0"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.854170 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/74392bcb-ba65-45cd-8f9b-32894528aca3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"74392bcb-ba65-45cd-8f9b-32894528aca3\") " pod="openstack/ceilometer-0"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.863625 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74392bcb-ba65-45cd-8f9b-32894528aca3-config-data\") pod \"ceilometer-0\" (UID: \"74392bcb-ba65-45cd-8f9b-32894528aca3\") " pod="openstack/ceilometer-0"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.871626 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74392bcb-ba65-45cd-8f9b-32894528aca3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"74392bcb-ba65-45cd-8f9b-32894528aca3\") " pod="openstack/ceilometer-0"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.881092 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj54w\" (UniqueName: \"kubernetes.io/projected/74392bcb-ba65-45cd-8f9b-32894528aca3-kube-api-access-fj54w\") pod \"ceilometer-0\" (UID: \"74392bcb-ba65-45cd-8f9b-32894528aca3\") " pod="openstack/ceilometer-0"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.887609 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-cdcb9"]
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.889300 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-cdcb9"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.892912 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.893117 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-9fqtx"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.928424 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f59b8f679-tttj2" podStartSLOduration=3.928393894 podStartE2EDuration="3.928393894s" podCreationTimestamp="2026-02-02 10:55:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:55:41.428795588 +0000 UTC m=+1028.447135684" watchObservedRunningTime="2026-02-02 10:55:41.928393894 +0000 UTC m=+1028.946734000"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.934054 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-h9fw8"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.952125 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6995b916-7b6d-4b5e-8284-8b07fc09be1c-combined-ca-bundle\") pod \"barbican-db-sync-cdcb9\" (UID: \"6995b916-7b6d-4b5e-8284-8b07fc09be1c\") " pod="openstack/barbican-db-sync-cdcb9"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.952203 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27j66\" (UniqueName: \"kubernetes.io/projected/6995b916-7b6d-4b5e-8284-8b07fc09be1c-kube-api-access-27j66\") pod \"barbican-db-sync-cdcb9\" (UID: \"6995b916-7b6d-4b5e-8284-8b07fc09be1c\") " pod="openstack/barbican-db-sync-cdcb9"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.952231 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/302af226-4326-48ef-bef0-02fab3943dbe-logs\") pod \"placement-db-sync-xh5rm\" (UID: \"302af226-4326-48ef-bef0-02fab3943dbe\") " pod="openstack/placement-db-sync-xh5rm"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.952275 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxcvr\" (UniqueName: \"kubernetes.io/projected/302af226-4326-48ef-bef0-02fab3943dbe-kube-api-access-dxcvr\") pod \"placement-db-sync-xh5rm\" (UID: \"302af226-4326-48ef-bef0-02fab3943dbe\") " pod="openstack/placement-db-sync-xh5rm"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.952300 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/302af226-4326-48ef-bef0-02fab3943dbe-config-data\") pod \"placement-db-sync-xh5rm\" (UID: \"302af226-4326-48ef-bef0-02fab3943dbe\") " pod="openstack/placement-db-sync-xh5rm"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.953172 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/302af226-4326-48ef-bef0-02fab3943dbe-combined-ca-bundle\") pod \"placement-db-sync-xh5rm\" (UID: \"302af226-4326-48ef-bef0-02fab3943dbe\") " pod="openstack/placement-db-sync-xh5rm"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.953217 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6995b916-7b6d-4b5e-8284-8b07fc09be1c-db-sync-config-data\") pod \"barbican-db-sync-cdcb9\" (UID: \"6995b916-7b6d-4b5e-8284-8b07fc09be1c\") " pod="openstack/barbican-db-sync-cdcb9"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.953271 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/302af226-4326-48ef-bef0-02fab3943dbe-scripts\") pod \"placement-db-sync-xh5rm\" (UID: \"302af226-4326-48ef-bef0-02fab3943dbe\") " pod="openstack/placement-db-sync-xh5rm"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.955429 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/302af226-4326-48ef-bef0-02fab3943dbe-logs\") pod \"placement-db-sync-xh5rm\" (UID: \"302af226-4326-48ef-bef0-02fab3943dbe\") " pod="openstack/placement-db-sync-xh5rm"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.972209 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/302af226-4326-48ef-bef0-02fab3943dbe-scripts\") pod \"placement-db-sync-xh5rm\" (UID: \"302af226-4326-48ef-bef0-02fab3943dbe\") " pod="openstack/placement-db-sync-xh5rm"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.982745 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/302af226-4326-48ef-bef0-02fab3943dbe-config-data\") pod \"placement-db-sync-xh5rm\" (UID: \"302af226-4326-48ef-bef0-02fab3943dbe\") " pod="openstack/placement-db-sync-xh5rm"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.982817 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/302af226-4326-48ef-bef0-02fab3943dbe-combined-ca-bundle\") pod \"placement-db-sync-xh5rm\" (UID: \"302af226-4326-48ef-bef0-02fab3943dbe\") " pod="openstack/placement-db-sync-xh5rm"
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.984168 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-cdcb9"]
Feb 02 10:55:41 crc kubenswrapper[4901]: I0202 10:55:41.997995 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxcvr\" (UniqueName: \"kubernetes.io/projected/302af226-4326-48ef-bef0-02fab3943dbe-kube-api-access-dxcvr\") pod \"placement-db-sync-xh5rm\" (UID: \"302af226-4326-48ef-bef0-02fab3943dbe\") " pod="openstack/placement-db-sync-xh5rm"
Feb 02 10:55:42 crc kubenswrapper[4901]: I0202 10:55:42.025716 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-xnq2r"]
Feb 02 10:55:42 crc kubenswrapper[4901]: I0202 10:55:42.028665 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-xnq2r"
Feb 02 10:55:42 crc kubenswrapper[4901]: I0202 10:55:42.039971 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-2p24z"]
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.054127 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6995b916-7b6d-4b5e-8284-8b07fc09be1c-db-sync-config-data\") pod \"barbican-db-sync-cdcb9\" (UID: \"6995b916-7b6d-4b5e-8284-8b07fc09be1c\") " pod="openstack/barbican-db-sync-cdcb9"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.054187 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92d9658c-9dc4-466c-b261-dba41f7418ae-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-xnq2r\" (UID: \"92d9658c-9dc4-466c-b261-dba41f7418ae\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xnq2r"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.054221 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92d9658c-9dc4-466c-b261-dba41f7418ae-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-xnq2r\" (UID: \"92d9658c-9dc4-466c-b261-dba41f7418ae\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xnq2r"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.054258 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92d9658c-9dc4-466c-b261-dba41f7418ae-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-xnq2r\" (UID: \"92d9658c-9dc4-466c-b261-dba41f7418ae\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xnq2r"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.054286 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6995b916-7b6d-4b5e-8284-8b07fc09be1c-combined-ca-bundle\") pod \"barbican-db-sync-cdcb9\" (UID: \"6995b916-7b6d-4b5e-8284-8b07fc09be1c\") " pod="openstack/barbican-db-sync-cdcb9"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.054327 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92d9658c-9dc4-466c-b261-dba41f7418ae-config\") pod \"dnsmasq-dns-56df8fb6b7-xnq2r\" (UID: \"92d9658c-9dc4-466c-b261-dba41f7418ae\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xnq2r"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.054365 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27j66\" (UniqueName: \"kubernetes.io/projected/6995b916-7b6d-4b5e-8284-8b07fc09be1c-kube-api-access-27j66\") pod \"barbican-db-sync-cdcb9\" (UID: \"6995b916-7b6d-4b5e-8284-8b07fc09be1c\") " pod="openstack/barbican-db-sync-cdcb9"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.054385 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7gf2\" (UniqueName: \"kubernetes.io/projected/92d9658c-9dc4-466c-b261-dba41f7418ae-kube-api-access-f7gf2\") pod \"dnsmasq-dns-56df8fb6b7-xnq2r\" (UID: \"92d9658c-9dc4-466c-b261-dba41f7418ae\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xnq2r"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.054422 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/92d9658c-9dc4-466c-b261-dba41f7418ae-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-xnq2r\" (UID: \"92d9658c-9dc4-466c-b261-dba41f7418ae\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xnq2r"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.060129 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6995b916-7b6d-4b5e-8284-8b07fc09be1c-combined-ca-bundle\") pod \"barbican-db-sync-cdcb9\" (UID: \"6995b916-7b6d-4b5e-8284-8b07fc09be1c\") " pod="openstack/barbican-db-sync-cdcb9"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.070405 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6995b916-7b6d-4b5e-8284-8b07fc09be1c-db-sync-config-data\") pod \"barbican-db-sync-cdcb9\" (UID: \"6995b916-7b6d-4b5e-8284-8b07fc09be1c\") " pod="openstack/barbican-db-sync-cdcb9"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.070635 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-2p24z"]
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.084332 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27j66\" (UniqueName: \"kubernetes.io/projected/6995b916-7b6d-4b5e-8284-8b07fc09be1c-kube-api-access-27j66\") pod \"barbican-db-sync-cdcb9\" (UID: \"6995b916-7b6d-4b5e-8284-8b07fc09be1c\") " pod="openstack/barbican-db-sync-cdcb9"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.090710 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-xnq2r"]
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.104692 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.107041 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.110108 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.110305 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-dqjmw"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.110451 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.110618 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.111906 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.127184 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.130465 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.134759 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.135504 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.137393 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.156277 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92d9658c-9dc4-466c-b261-dba41f7418ae-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-xnq2r\" (UID: \"92d9658c-9dc4-466c-b261-dba41f7418ae\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xnq2r"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.156353 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92d9658c-9dc4-466c-b261-dba41f7418ae-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-xnq2r\" (UID: \"92d9658c-9dc4-466c-b261-dba41f7418ae\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xnq2r"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.156403 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92d9658c-9dc4-466c-b261-dba41f7418ae-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-xnq2r\" (UID: \"92d9658c-9dc4-466c-b261-dba41f7418ae\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xnq2r"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.156450 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92d9658c-9dc4-466c-b261-dba41f7418ae-config\") pod \"dnsmasq-dns-56df8fb6b7-xnq2r\" (UID: \"92d9658c-9dc4-466c-b261-dba41f7418ae\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xnq2r"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.156508 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7gf2\" (UniqueName: \"kubernetes.io/projected/92d9658c-9dc4-466c-b261-dba41f7418ae-kube-api-access-f7gf2\") pod \"dnsmasq-dns-56df8fb6b7-xnq2r\" (UID: \"92d9658c-9dc4-466c-b261-dba41f7418ae\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xnq2r"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.156586 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/92d9658c-9dc4-466c-b261-dba41f7418ae-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-xnq2r\" (UID: \"92d9658c-9dc4-466c-b261-dba41f7418ae\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xnq2r"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.158362 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92d9658c-9dc4-466c-b261-dba41f7418ae-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-xnq2r\" (UID: \"92d9658c-9dc4-466c-b261-dba41f7418ae\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xnq2r"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.158379 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/92d9658c-9dc4-466c-b261-dba41f7418ae-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-xnq2r\" (UID: \"92d9658c-9dc4-466c-b261-dba41f7418ae\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xnq2r"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.158391 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92d9658c-9dc4-466c-b261-dba41f7418ae-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-xnq2r\" (UID: \"92d9658c-9dc4-466c-b261-dba41f7418ae\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xnq2r"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.159147 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92d9658c-9dc4-466c-b261-dba41f7418ae-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-xnq2r\" (UID: \"92d9658c-9dc4-466c-b261-dba41f7418ae\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xnq2r"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.159201 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92d9658c-9dc4-466c-b261-dba41f7418ae-config\") pod \"dnsmasq-dns-56df8fb6b7-xnq2r\" (UID: \"92d9658c-9dc4-466c-b261-dba41f7418ae\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xnq2r"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.178717 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.179192 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7gf2\" (UniqueName: \"kubernetes.io/projected/92d9658c-9dc4-466c-b261-dba41f7418ae-kube-api-access-f7gf2\") pod \"dnsmasq-dns-56df8fb6b7-xnq2r\" (UID: \"92d9658c-9dc4-466c-b261-dba41f7418ae\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xnq2r"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.179357 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-xh5rm"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.214661 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-spsdj"]
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.222815 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-cdcb9"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.230387 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-hcpj4"]
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.258421 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgcn2\" (UniqueName: \"kubernetes.io/projected/8ea5f918-b91c-4711-bb7a-83b78bb133aa-kube-api-access-mgcn2\") pod \"glance-default-internal-api-0\" (UID: \"8ea5f918-b91c-4711-bb7a-83b78bb133aa\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.258490 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ea5f918-b91c-4711-bb7a-83b78bb133aa-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8ea5f918-b91c-4711-bb7a-83b78bb133aa\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.258523 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l42bp\" (UniqueName: \"kubernetes.io/projected/0e983416-5a7a-467d-8056-6b5d38887729-kube-api-access-l42bp\") pod \"glance-default-external-api-0\" (UID: \"0e983416-5a7a-467d-8056-6b5d38887729\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.258555 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"0e983416-5a7a-467d-8056-6b5d38887729\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.258625 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"8ea5f918-b91c-4711-bb7a-83b78bb133aa\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.258672 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ea5f918-b91c-4711-bb7a-83b78bb133aa-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8ea5f918-b91c-4711-bb7a-83b78bb133aa\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.258707 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ea5f918-b91c-4711-bb7a-83b78bb133aa-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8ea5f918-b91c-4711-bb7a-83b78bb133aa\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.258748 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ea5f918-b91c-4711-bb7a-83b78bb133aa-logs\") pod \"glance-default-internal-api-0\" (UID: \"8ea5f918-b91c-4711-bb7a-83b78bb133aa\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.258772 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e983416-5a7a-467d-8056-6b5d38887729-logs\") pod \"glance-default-external-api-0\" (UID: \"0e983416-5a7a-467d-8056-6b5d38887729\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.259072 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e983416-5a7a-467d-8056-6b5d38887729-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0e983416-5a7a-467d-8056-6b5d38887729\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.259227 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e983416-5a7a-467d-8056-6b5d38887729-config-data\") pod \"glance-default-external-api-0\" (UID: \"0e983416-5a7a-467d-8056-6b5d38887729\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.259269 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0e983416-5a7a-467d-8056-6b5d38887729-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0e983416-5a7a-467d-8056-6b5d38887729\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.259306 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e983416-5a7a-467d-8056-6b5d38887729-scripts\") pod \"glance-default-external-api-0\" (UID: \"0e983416-5a7a-467d-8056-6b5d38887729\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.259455 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ea5f918-b91c-4711-bb7a-83b78bb133aa-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8ea5f918-b91c-4711-bb7a-83b78bb133aa\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.259613 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e983416-5a7a-467d-8056-6b5d38887729-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0e983416-5a7a-467d-8056-6b5d38887729\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.259691 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ea5f918-b91c-4711-bb7a-83b78bb133aa-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8ea5f918-b91c-4711-bb7a-83b78bb133aa\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.301108 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-vtqqt"]
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.353139 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-xnq2r"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.361074 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgcn2\" (UniqueName: \"kubernetes.io/projected/8ea5f918-b91c-4711-bb7a-83b78bb133aa-kube-api-access-mgcn2\") pod \"glance-default-internal-api-0\" (UID: \"8ea5f918-b91c-4711-bb7a-83b78bb133aa\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.361119 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ea5f918-b91c-4711-bb7a-83b78bb133aa-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8ea5f918-b91c-4711-bb7a-83b78bb133aa\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.361150 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l42bp\" (UniqueName: \"kubernetes.io/projected/0e983416-5a7a-467d-8056-6b5d38887729-kube-api-access-l42bp\") pod \"glance-default-external-api-0\" (UID: \"0e983416-5a7a-467d-8056-6b5d38887729\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.361184 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"0e983416-5a7a-467d-8056-6b5d38887729\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.361213 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"8ea5f918-b91c-4711-bb7a-83b78bb133aa\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.361244 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ea5f918-b91c-4711-bb7a-83b78bb133aa-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8ea5f918-b91c-4711-bb7a-83b78bb133aa\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.361265 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ea5f918-b91c-4711-bb7a-83b78bb133aa-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8ea5f918-b91c-4711-bb7a-83b78bb133aa\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.361304 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ea5f918-b91c-4711-bb7a-83b78bb133aa-logs\") pod \"glance-default-internal-api-0\" (UID: \"8ea5f918-b91c-4711-bb7a-83b78bb133aa\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.361330 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e983416-5a7a-467d-8056-6b5d38887729-logs\") pod \"glance-default-external-api-0\" (UID: \"0e983416-5a7a-467d-8056-6b5d38887729\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.361383 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e983416-5a7a-467d-8056-6b5d38887729-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0e983416-5a7a-467d-8056-6b5d38887729\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.361419 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e983416-5a7a-467d-8056-6b5d38887729-config-data\") pod \"glance-default-external-api-0\" (UID: \"0e983416-5a7a-467d-8056-6b5d38887729\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.361463 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0e983416-5a7a-467d-8056-6b5d38887729-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0e983416-5a7a-467d-8056-6b5d38887729\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.361490 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e983416-5a7a-467d-8056-6b5d38887729-scripts\") pod \"glance-default-external-api-0\" (UID: \"0e983416-5a7a-467d-8056-6b5d38887729\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.361531 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ea5f918-b91c-4711-bb7a-83b78bb133aa-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8ea5f918-b91c-4711-bb7a-83b78bb133aa\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.361592 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e983416-5a7a-467d-8056-6b5d38887729-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0e983416-5a7a-467d-8056-6b5d38887729\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.361623 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ea5f918-b91c-4711-bb7a-83b78bb133aa-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8ea5f918-b91c-4711-bb7a-83b78bb133aa\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.362361 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"8ea5f918-b91c-4711-bb7a-83b78bb133aa\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.362347 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"0e983416-5a7a-467d-8056-6b5d38887729\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.362554 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ea5f918-b91c-4711-bb7a-83b78bb133aa-logs\") pod \"glance-default-internal-api-0\" (UID: \"8ea5f918-b91c-4711-bb7a-83b78bb133aa\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.362706 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0e983416-5a7a-467d-8056-6b5d38887729-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0e983416-5a7a-467d-8056-6b5d38887729\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.362710 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ea5f918-b91c-4711-bb7a-83b78bb133aa-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8ea5f918-b91c-4711-bb7a-83b78bb133aa\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.363100 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e983416-5a7a-467d-8056-6b5d38887729-logs\") pod \"glance-default-external-api-0\" (UID: \"0e983416-5a7a-467d-8056-6b5d38887729\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.367497 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e983416-5a7a-467d-8056-6b5d38887729-scripts\") pod \"glance-default-external-api-0\" (UID: \"0e983416-5a7a-467d-8056-6b5d38887729\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.369015 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ea5f918-b91c-4711-bb7a-83b78bb133aa-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8ea5f918-b91c-4711-bb7a-83b78bb133aa\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.370769 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ea5f918-b91c-4711-bb7a-83b78bb133aa-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8ea5f918-b91c-4711-bb7a-83b78bb133aa\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.371323 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e983416-5a7a-467d-8056-6b5d38887729-config-data\") pod \"glance-default-external-api-0\" (UID: \"0e983416-5a7a-467d-8056-6b5d38887729\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.372941 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f59b8f679-tttj2" podUID="e2b8b88a-99e5-4c15-b293-5f2fa80f0a07" containerName="dnsmasq-dns" containerID="cri-o://02f0f230e214f7fda328fb024dfe131ceab0bdfdd0d8837f4d8d1466ba9ccbdd" gracePeriod=10
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.375125 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e983416-5a7a-467d-8056-6b5d38887729-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0e983416-5a7a-467d-8056-6b5d38887729\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.376931 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ea5f918-b91c-4711-bb7a-83b78bb133aa-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8ea5f918-b91c-4711-bb7a-83b78bb133aa\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.378729 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e983416-5a7a-467d-8056-6b5d38887729-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0e983416-5a7a-467d-8056-6b5d38887729\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.383669 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ea5f918-b91c-4711-bb7a-83b78bb133aa-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8ea5f918-b91c-4711-bb7a-83b78bb133aa\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.384848 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l42bp\" (UniqueName: \"kubernetes.io/projected/0e983416-5a7a-467d-8056-6b5d38887729-kube-api-access-l42bp\") pod \"glance-default-external-api-0\" (UID: \"0e983416-5a7a-467d-8056-6b5d38887729\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.386417 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgcn2\" (UniqueName: \"kubernetes.io/projected/8ea5f918-b91c-4711-bb7a-83b78bb133aa-kube-api-access-mgcn2\") pod \"glance-default-internal-api-0\" (UID: \"8ea5f918-b91c-4711-bb7a-83b78bb133aa\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.403345 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"0e983416-5a7a-467d-8056-6b5d38887729\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.409265 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"8ea5f918-b91c-4711-bb7a-83b78bb133aa\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.428334 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:42.446821 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:43.382700 4901 generic.go:334] "Generic (PLEG): container finished" podID="e2b8b88a-99e5-4c15-b293-5f2fa80f0a07" containerID="02f0f230e214f7fda328fb024dfe131ceab0bdfdd0d8837f4d8d1466ba9ccbdd" exitCode=0
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:43.382771 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-tttj2" event={"ID":"e2b8b88a-99e5-4c15-b293-5f2fa80f0a07","Type":"ContainerDied","Data":"02f0f230e214f7fda328fb024dfe131ceab0bdfdd0d8837f4d8d1466ba9ccbdd"}
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:43.695384 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="421ae8ba-9f7c-461b-b757-f2ffccc8284c" path="/var/lib/kubelet/pods/421ae8ba-9f7c-461b-b757-f2ffccc8284c/volumes"
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:43.867777 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:44.002367 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 02 10:55:45 crc kubenswrapper[4901]: W0202 10:55:44.853983 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9cf3153_081c_42c9_a525_354ccaca7abd.slice/crio-30b315d6b9bf07f2c981791a57698b164ec4684dd9c8dd35b5ff6ab4366744e4 WatchSource:0}: Error finding container 30b315d6b9bf07f2c981791a57698b164ec4684dd9c8dd35b5ff6ab4366744e4: Status 404 returned error can't find the container with id 30b315d6b9bf07f2c981791a57698b164ec4684dd9c8dd35b5ff6ab4366744e4
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:45.398216 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-hcpj4" event={"ID":"cccec66b-6bb1-4799-9385-73a33d1cacec","Type":"ContainerStarted","Data":"a0011b3a125318df64e2ede5681a7b6ea74e74dd0b5f0ace3d187e256ace5241"}
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:45.399456 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-spsdj" event={"ID":"d9cf3153-081c-42c9-a525-354ccaca7abd","Type":"ContainerStarted","Data":"30b315d6b9bf07f2c981791a57698b164ec4684dd9c8dd35b5ff6ab4366744e4"}
Feb 02 10:55:45 crc kubenswrapper[4901]: I0202 10:55:45.401467 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-vtqqt" event={"ID":"674534db-de83-484c-9876-6eef192987b6","Type":"ContainerStarted","Data":"775ce07c7848d17505bd860bccf0829e73319aa31bcf3ca333d684191501daa7"}
Feb 02 10:55:46 crc kubenswrapper[4901]: I0202 10:55:46.054473 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 02 10:55:46 crc kubenswrapper[4901]: I0202 10:55:46.227195 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-2hz6j"]
Feb 02 10:55:46 crc kubenswrapper[4901]: I0202 10:55:46.305172 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-tttj2"
Feb 02 10:55:46 crc kubenswrapper[4901]: I0202 10:55:46.410912 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-2hz6j" event={"ID":"9dc5e40d-ef90-4040-a247-114b55e0efa1","Type":"ContainerStarted","Data":"b27295f11cc7d6ec09fe98abae9dc2a51d3ac0bc2653a0fd1919338470956ab8"}
Feb 02 10:55:46 crc kubenswrapper[4901]: I0202 10:55:46.412352 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-tttj2" event={"ID":"e2b8b88a-99e5-4c15-b293-5f2fa80f0a07","Type":"ContainerDied","Data":"5931acc786b66d4600b7da6bdaf1966e56615d20fae93e1f8cdae234761b8c6d"}
Feb 02 10:55:46 crc kubenswrapper[4901]: I0202 10:55:46.412390 4901 scope.go:117] "RemoveContainer" containerID="02f0f230e214f7fda328fb024dfe131ceab0bdfdd0d8837f4d8d1466ba9ccbdd"
Feb 02 10:55:46 crc kubenswrapper[4901]: I0202 10:55:46.412528 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-tttj2"
Feb 02 10:55:46 crc kubenswrapper[4901]: I0202 10:55:46.431127 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-spsdj" event={"ID":"d9cf3153-081c-42c9-a525-354ccaca7abd","Type":"ContainerStarted","Data":"c719e4f31aa449f35965a9891a647d8f02ad348c38bfb40c74f4ae6985502644"}
Feb 02 10:55:46 crc kubenswrapper[4901]: I0202 10:55:46.449059 4901 generic.go:334] "Generic (PLEG): container finished" podID="674534db-de83-484c-9876-6eef192987b6" containerID="478078b81d3a0673bd3a1a44619cd126d0a23b4d56057971a408017fa6e5f1b0" exitCode=0
Feb 02 10:55:46 crc kubenswrapper[4901]: I0202 10:55:46.449113 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-vtqqt" event={"ID":"674534db-de83-484c-9876-6eef192987b6","Type":"ContainerDied","Data":"478078b81d3a0673bd3a1a44619cd126d0a23b4d56057971a408017fa6e5f1b0"}
Feb 02 10:55:46 crc kubenswrapper[4901]: I0202 10:55:46.450828 4901 scope.go:117] "RemoveContainer" containerID="7c75be964da78ef688ce27fa517630a0cb0c608495a5510b1e1cbf3e1ca7a518"
Feb 02 10:55:46 crc kubenswrapper[4901]: I0202 10:55:46.470078 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-spsdj" podStartSLOduration=6.4700519960000005 podStartE2EDuration="6.470051996s" podCreationTimestamp="2026-02-02 10:55:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:55:46.461671457 +0000 UTC m=+1033.480011563" watchObservedRunningTime="2026-02-02 10:55:46.470051996 +0000 UTC m=+1033.488392092"
Feb 02 10:55:46 crc kubenswrapper[4901]: I0202 10:55:46.491650 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e2b8b88a-99e5-4c15-b293-5f2fa80f0a07-dns-swift-storage-0\") pod \"e2b8b88a-99e5-4c15-b293-5f2fa80f0a07\" (UID: \"e2b8b88a-99e5-4c15-b293-5f2fa80f0a07\") "
Feb 02 10:55:46 crc kubenswrapper[4901]: I0202 10:55:46.491803 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2b8b88a-99e5-4c15-b293-5f2fa80f0a07-dns-svc\") pod \"e2b8b88a-99e5-4c15-b293-5f2fa80f0a07\" (UID: \"e2b8b88a-99e5-4c15-b293-5f2fa80f0a07\") "
Feb 02 10:55:46 crc kubenswrapper[4901]: I0202 10:55:46.492151 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e2b8b88a-99e5-4c15-b293-5f2fa80f0a07-ovsdbserver-sb\") pod \"e2b8b88a-99e5-4c15-b293-5f2fa80f0a07\" (UID: \"e2b8b88a-99e5-4c15-b293-5f2fa80f0a07\") "
Feb 02 10:55:46 crc kubenswrapper[4901]: I0202 10:55:46.492233 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2b8b88a-99e5-4c15-b293-5f2fa80f0a07-ovsdbserver-nb\") pod \"e2b8b88a-99e5-4c15-b293-5f2fa80f0a07\" (UID: \"e2b8b88a-99e5-4c15-b293-5f2fa80f0a07\") "
Feb 02 10:55:46 crc kubenswrapper[4901]: I0202 10:55:46.492293 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ms4lp\" (UniqueName: \"kubernetes.io/projected/e2b8b88a-99e5-4c15-b293-5f2fa80f0a07-kube-api-access-ms4lp\") pod \"e2b8b88a-99e5-4c15-b293-5f2fa80f0a07\" (UID: \"e2b8b88a-99e5-4c15-b293-5f2fa80f0a07\") "
Feb 02 10:55:46 crc kubenswrapper[4901]: I0202 10:55:46.492348 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2b8b88a-99e5-4c15-b293-5f2fa80f0a07-config\") pod \"e2b8b88a-99e5-4c15-b293-5f2fa80f0a07\" (UID: \"e2b8b88a-99e5-4c15-b293-5f2fa80f0a07\") "
Feb 02 10:55:46 crc kubenswrapper[4901]: I0202 10:55:46.534325 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2b8b88a-99e5-4c15-b293-5f2fa80f0a07-kube-api-access-ms4lp" (OuterVolumeSpecName: "kube-api-access-ms4lp") pod "e2b8b88a-99e5-4c15-b293-5f2fa80f0a07" (UID: "e2b8b88a-99e5-4c15-b293-5f2fa80f0a07"). InnerVolumeSpecName "kube-api-access-ms4lp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:55:46 crc kubenswrapper[4901]: I0202 10:55:46.574899 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 02 10:55:46 crc kubenswrapper[4901]: I0202 10:55:46.590151 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-xh5rm"]
Feb 02 10:55:46 crc kubenswrapper[4901]: I0202 10:55:46.597550 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ms4lp\" (UniqueName: \"kubernetes.io/projected/e2b8b88a-99e5-4c15-b293-5f2fa80f0a07-kube-api-access-ms4lp\") on node \"crc\" DevicePath \"\""
Feb 02 10:55:46 crc kubenswrapper[4901]: I0202 10:55:46.597781 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2b8b88a-99e5-4c15-b293-5f2fa80f0a07-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e2b8b88a-99e5-4c15-b293-5f2fa80f0a07" (UID: "e2b8b88a-99e5-4c15-b293-5f2fa80f0a07"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:55:46 crc kubenswrapper[4901]: I0202 10:55:46.607481 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2b8b88a-99e5-4c15-b293-5f2fa80f0a07-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e2b8b88a-99e5-4c15-b293-5f2fa80f0a07" (UID: "e2b8b88a-99e5-4c15-b293-5f2fa80f0a07"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:55:46 crc kubenswrapper[4901]: I0202 10:55:46.618136 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2b8b88a-99e5-4c15-b293-5f2fa80f0a07-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e2b8b88a-99e5-4c15-b293-5f2fa80f0a07" (UID: "e2b8b88a-99e5-4c15-b293-5f2fa80f0a07"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:55:46 crc kubenswrapper[4901]: I0202 10:55:46.628810 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-h9fw8"]
Feb 02 10:55:46 crc kubenswrapper[4901]: I0202 10:55:46.638692 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-xnq2r"]
Feb 02 10:55:46 crc kubenswrapper[4901]: W0202 10:55:46.642516 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod302af226_4326_48ef_bef0_02fab3943dbe.slice/crio-f577bfff49f2f8cf7a28f7ec9d73d9e51ae7fd231544838fd04011c218c57981 WatchSource:0}: Error finding container f577bfff49f2f8cf7a28f7ec9d73d9e51ae7fd231544838fd04011c218c57981: Status 404 returned error can't find the container with id f577bfff49f2f8cf7a28f7ec9d73d9e51ae7fd231544838fd04011c218c57981
Feb 02 10:55:46 crc kubenswrapper[4901]: I0202 10:55:46.642723 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2b8b88a-99e5-4c15-b293-5f2fa80f0a07-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e2b8b88a-99e5-4c15-b293-5f2fa80f0a07" (UID: "e2b8b88a-99e5-4c15-b293-5f2fa80f0a07"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:55:46 crc kubenswrapper[4901]: I0202 10:55:46.643078 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2b8b88a-99e5-4c15-b293-5f2fa80f0a07-config" (OuterVolumeSpecName: "config") pod "e2b8b88a-99e5-4c15-b293-5f2fa80f0a07" (UID: "e2b8b88a-99e5-4c15-b293-5f2fa80f0a07"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:55:46 crc kubenswrapper[4901]: I0202 10:55:46.687903 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-cdcb9"]
Feb 02 10:55:46 crc kubenswrapper[4901]: I0202 10:55:46.698798 4901 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e2b8b88a-99e5-4c15-b293-5f2fa80f0a07-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 02 10:55:46 crc kubenswrapper[4901]: I0202 10:55:46.698831 4901 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2b8b88a-99e5-4c15-b293-5f2fa80f0a07-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 02 10:55:46 crc kubenswrapper[4901]: I0202 10:55:46.698843 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e2b8b88a-99e5-4c15-b293-5f2fa80f0a07-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 02 10:55:46 crc kubenswrapper[4901]: I0202 10:55:46.698853 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2b8b88a-99e5-4c15-b293-5f2fa80f0a07-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 02 10:55:46 crc kubenswrapper[4901]: I0202 10:55:46.698867 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2b8b88a-99e5-4c15-b293-5f2fa80f0a07-config\") on node \"crc\" DevicePath \"\""
Feb 02 10:55:46 crc kubenswrapper[4901]: I0202 10:55:46.722413 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 02 10:55:46 crc kubenswrapper[4901]: W0202 10:55:46.746956 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ea5f918_b91c_4711_bb7a_83b78bb133aa.slice/crio-5b3fa446f4a5c7f000b8759d66b8d9ac058149a75aea42288a2e0dd7381f3f06 WatchSource:0}: Error finding container 5b3fa446f4a5c7f000b8759d66b8d9ac058149a75aea42288a2e0dd7381f3f06: Status 404 returned error can't find the container with id 5b3fa446f4a5c7f000b8759d66b8d9ac058149a75aea42288a2e0dd7381f3f06
Feb 02 10:55:46 crc kubenswrapper[4901]: I0202 10:55:46.805266 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-tttj2"]
Feb 02 10:55:46 crc kubenswrapper[4901]: I0202 10:55:46.839014 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-tttj2"]
Feb 02 10:55:47 crc kubenswrapper[4901]: I0202 10:55:47.054117 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-vtqqt"
Feb 02 10:55:47 crc kubenswrapper[4901]: I0202 10:55:47.231130 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/674534db-de83-484c-9876-6eef192987b6-ovsdbserver-sb\") pod \"674534db-de83-484c-9876-6eef192987b6\" (UID: \"674534db-de83-484c-9876-6eef192987b6\") "
Feb 02 10:55:47 crc kubenswrapper[4901]: I0202 10:55:47.231196 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/674534db-de83-484c-9876-6eef192987b6-dns-swift-storage-0\") pod \"674534db-de83-484c-9876-6eef192987b6\" (UID: \"674534db-de83-484c-9876-6eef192987b6\") "
Feb 02 10:55:47 crc kubenswrapper[4901]: I0202 10:55:47.231247 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/674534db-de83-484c-9876-6eef192987b6-config\") pod \"674534db-de83-484c-9876-6eef192987b6\" (UID: \"674534db-de83-484c-9876-6eef192987b6\") "
Feb 02 10:55:47 crc kubenswrapper[4901]: I0202 10:55:47.231295 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4982\" (UniqueName: \"kubernetes.io/projected/674534db-de83-484c-9876-6eef192987b6-kube-api-access-k4982\") pod \"674534db-de83-484c-9876-6eef192987b6\" (UID: \"674534db-de83-484c-9876-6eef192987b6\") "
Feb 02 10:55:47 crc kubenswrapper[4901]: I0202 10:55:47.231339 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/674534db-de83-484c-9876-6eef192987b6-dns-svc\") pod \"674534db-de83-484c-9876-6eef192987b6\" (UID: \"674534db-de83-484c-9876-6eef192987b6\") "
Feb 02 10:55:47 crc kubenswrapper[4901]: I0202 10:55:47.231360 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/674534db-de83-484c-9876-6eef192987b6-ovsdbserver-nb\") pod \"674534db-de83-484c-9876-6eef192987b6\" (UID: \"674534db-de83-484c-9876-6eef192987b6\") "
Feb 02 10:55:47 crc kubenswrapper[4901]: I0202 10:55:47.251588 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/674534db-de83-484c-9876-6eef192987b6-kube-api-access-k4982" (OuterVolumeSpecName: "kube-api-access-k4982") pod "674534db-de83-484c-9876-6eef192987b6" (UID: "674534db-de83-484c-9876-6eef192987b6"). InnerVolumeSpecName "kube-api-access-k4982".
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:55:47 crc kubenswrapper[4901]: I0202 10:55:47.296885 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/674534db-de83-484c-9876-6eef192987b6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "674534db-de83-484c-9876-6eef192987b6" (UID: "674534db-de83-484c-9876-6eef192987b6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:55:47 crc kubenswrapper[4901]: I0202 10:55:47.307480 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/674534db-de83-484c-9876-6eef192987b6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "674534db-de83-484c-9876-6eef192987b6" (UID: "674534db-de83-484c-9876-6eef192987b6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:55:47 crc kubenswrapper[4901]: I0202 10:55:47.330462 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/674534db-de83-484c-9876-6eef192987b6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "674534db-de83-484c-9876-6eef192987b6" (UID: "674534db-de83-484c-9876-6eef192987b6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:55:47 crc kubenswrapper[4901]: I0202 10:55:47.333219 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/674534db-de83-484c-9876-6eef192987b6-config" (OuterVolumeSpecName: "config") pod "674534db-de83-484c-9876-6eef192987b6" (UID: "674534db-de83-484c-9876-6eef192987b6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:55:47 crc kubenswrapper[4901]: I0202 10:55:47.334594 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/674534db-de83-484c-9876-6eef192987b6-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:47 crc kubenswrapper[4901]: I0202 10:55:47.344877 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4982\" (UniqueName: \"kubernetes.io/projected/674534db-de83-484c-9876-6eef192987b6-kube-api-access-k4982\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:47 crc kubenswrapper[4901]: I0202 10:55:47.344954 4901 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/674534db-de83-484c-9876-6eef192987b6-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:47 crc kubenswrapper[4901]: I0202 10:55:47.344969 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/674534db-de83-484c-9876-6eef192987b6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:47 crc kubenswrapper[4901]: I0202 10:55:47.344980 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/674534db-de83-484c-9876-6eef192987b6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:47 crc kubenswrapper[4901]: I0202 10:55:47.396621 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/674534db-de83-484c-9876-6eef192987b6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "674534db-de83-484c-9876-6eef192987b6" (UID: "674534db-de83-484c-9876-6eef192987b6"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:55:47 crc kubenswrapper[4901]: I0202 10:55:47.449961 4901 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/674534db-de83-484c-9876-6eef192987b6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:47 crc kubenswrapper[4901]: I0202 10:55:47.463320 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-cdcb9" event={"ID":"6995b916-7b6d-4b5e-8284-8b07fc09be1c","Type":"ContainerStarted","Data":"b82747fa4fbc1398cfaa348db212aee1128748fed7516b400411083bac55eeaf"} Feb 02 10:55:47 crc kubenswrapper[4901]: I0202 10:55:47.469739 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74392bcb-ba65-45cd-8f9b-32894528aca3","Type":"ContainerStarted","Data":"006d319c4cf4896605ad00f215c0639675357bb2439be9cb6294d7308674c8a9"} Feb 02 10:55:47 crc kubenswrapper[4901]: I0202 10:55:47.472474 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xh5rm" event={"ID":"302af226-4326-48ef-bef0-02fab3943dbe","Type":"ContainerStarted","Data":"f577bfff49f2f8cf7a28f7ec9d73d9e51ae7fd231544838fd04011c218c57981"} Feb 02 10:55:47 crc kubenswrapper[4901]: I0202 10:55:47.477738 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-h9fw8" event={"ID":"4426b588-ce3e-4184-bc0b-82f17522dd01","Type":"ContainerStarted","Data":"befca0c3509f895cde608403ea7f89e7f0d5c37633d6304f56f765dd55b33a99"} Feb 02 10:55:47 crc kubenswrapper[4901]: I0202 10:55:47.477782 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-h9fw8" event={"ID":"4426b588-ce3e-4184-bc0b-82f17522dd01","Type":"ContainerStarted","Data":"1dce3bd286fddefafea33910d9c7f4ebad263a73b0d998833927e241b5021775"} Feb 02 10:55:47 crc kubenswrapper[4901]: I0202 10:55:47.495345 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-vtqqt" event={"ID":"674534db-de83-484c-9876-6eef192987b6","Type":"ContainerDied","Data":"775ce07c7848d17505bd860bccf0829e73319aa31bcf3ca333d684191501daa7"} Feb 02 10:55:47 crc kubenswrapper[4901]: I0202 10:55:47.495420 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-vtqqt" Feb 02 10:55:47 crc kubenswrapper[4901]: I0202 10:55:47.495444 4901 scope.go:117] "RemoveContainer" containerID="478078b81d3a0673bd3a1a44619cd126d0a23b4d56057971a408017fa6e5f1b0" Feb 02 10:55:47 crc kubenswrapper[4901]: I0202 10:55:47.508639 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8ea5f918-b91c-4711-bb7a-83b78bb133aa","Type":"ContainerStarted","Data":"5b3fa446f4a5c7f000b8759d66b8d9ac058149a75aea42288a2e0dd7381f3f06"} Feb 02 10:55:47 crc kubenswrapper[4901]: I0202 10:55:47.510470 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-h9fw8" podStartSLOduration=6.510445057 podStartE2EDuration="6.510445057s" podCreationTimestamp="2026-02-02 10:55:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:55:47.499381471 +0000 UTC m=+1034.517721557" watchObservedRunningTime="2026-02-02 10:55:47.510445057 +0000 UTC m=+1034.528785153" Feb 02 10:55:47 crc kubenswrapper[4901]: I0202 10:55:47.525261 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-xnq2r" event={"ID":"92d9658c-9dc4-466c-b261-dba41f7418ae","Type":"ContainerStarted","Data":"40b9a40e4e0e6ad711bbb16a145fbdc74b7f835531d5b49d63da4122bd39b81f"} Feb 02 10:55:47 crc kubenswrapper[4901]: I0202 10:55:47.639718 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-vtqqt"] Feb 02 10:55:47 crc kubenswrapper[4901]: I0202 10:55:47.669269 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-vtqqt"] Feb 02 10:55:47 crc kubenswrapper[4901]: I0202 10:55:47.719238 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="674534db-de83-484c-9876-6eef192987b6" path="/var/lib/kubelet/pods/674534db-de83-484c-9876-6eef192987b6/volumes" Feb 02 10:55:47 crc kubenswrapper[4901]: I0202 10:55:47.719854 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2b8b88a-99e5-4c15-b293-5f2fa80f0a07" path="/var/lib/kubelet/pods/e2b8b88a-99e5-4c15-b293-5f2fa80f0a07/volumes" Feb 02 10:55:47 crc kubenswrapper[4901]: I0202 10:55:47.776806 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 10:55:48 crc kubenswrapper[4901]: I0202 10:55:48.546842 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8ea5f918-b91c-4711-bb7a-83b78bb133aa","Type":"ContainerStarted","Data":"a7ebc0526473e5d9495b2410578ff6737ce5c1b87fec7279e37b57a4495ce7a1"} Feb 02 10:55:48 crc kubenswrapper[4901]: I0202 10:55:48.556394 4901 generic.go:334] "Generic (PLEG): container finished" podID="92d9658c-9dc4-466c-b261-dba41f7418ae" containerID="3ecdb331a985384caae0b085ed3d3fd1d860b4bca182252d53e5dbd54e420509" exitCode=0 Feb 02 10:55:48 crc kubenswrapper[4901]: I0202 10:55:48.556472 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-xnq2r" event={"ID":"92d9658c-9dc4-466c-b261-dba41f7418ae","Type":"ContainerDied","Data":"3ecdb331a985384caae0b085ed3d3fd1d860b4bca182252d53e5dbd54e420509"} Feb 02 10:55:48 crc kubenswrapper[4901]: I0202 10:55:48.556520 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-xnq2r" 
event={"ID":"92d9658c-9dc4-466c-b261-dba41f7418ae","Type":"ContainerStarted","Data":"6b3b89ac056ad9f26282eb444590aecfab59a1f9dda66e66ec6ca767e24e2584"} Feb 02 10:55:48 crc kubenswrapper[4901]: I0202 10:55:48.556598 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56df8fb6b7-xnq2r" Feb 02 10:55:48 crc kubenswrapper[4901]: I0202 10:55:48.559248 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0e983416-5a7a-467d-8056-6b5d38887729","Type":"ContainerStarted","Data":"e94ec4d71ad313e5522fbf02b8c5197dff71f9ce9a22d75cb212b1e964f15a4f"} Feb 02 10:55:48 crc kubenswrapper[4901]: I0202 10:55:48.579554 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56df8fb6b7-xnq2r" podStartSLOduration=7.579529636 podStartE2EDuration="7.579529636s" podCreationTimestamp="2026-02-02 10:55:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:55:48.579152666 +0000 UTC m=+1035.597492752" watchObservedRunningTime="2026-02-02 10:55:48.579529636 +0000 UTC m=+1035.597869732" Feb 02 10:55:49 crc kubenswrapper[4901]: I0202 10:55:49.578215 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8ea5f918-b91c-4711-bb7a-83b78bb133aa" containerName="glance-log" containerID="cri-o://a7ebc0526473e5d9495b2410578ff6737ce5c1b87fec7279e37b57a4495ce7a1" gracePeriod=30 Feb 02 10:55:49 crc kubenswrapper[4901]: I0202 10:55:49.579238 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8ea5f918-b91c-4711-bb7a-83b78bb133aa","Type":"ContainerStarted","Data":"7d94416932dcc7f200d8973d89487f2a381d9964253b4c52319d62c9950c0989"} Feb 02 10:55:49 crc kubenswrapper[4901]: I0202 10:55:49.579598 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8ea5f918-b91c-4711-bb7a-83b78bb133aa" containerName="glance-httpd" containerID="cri-o://7d94416932dcc7f200d8973d89487f2a381d9964253b4c52319d62c9950c0989" gracePeriod=30 Feb 02 10:55:49 crc kubenswrapper[4901]: I0202 10:55:49.587852 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0e983416-5a7a-467d-8056-6b5d38887729","Type":"ContainerStarted","Data":"9ca6a8e73ddda95141efb8d2d25a475e85890135cb5ce58a05fe2671dda46259"} Feb 02 10:55:49 crc kubenswrapper[4901]: I0202 10:55:49.616238 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=9.616213904 podStartE2EDuration="9.616213904s" podCreationTimestamp="2026-02-02 10:55:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:55:49.606225884 +0000 UTC m=+1036.624565980" watchObservedRunningTime="2026-02-02 10:55:49.616213904 +0000 UTC m=+1036.634554010" Feb 02 10:55:50 crc kubenswrapper[4901]: I0202 10:55:50.604659 4901 generic.go:334] "Generic (PLEG): container finished" podID="8ea5f918-b91c-4711-bb7a-83b78bb133aa" containerID="7d94416932dcc7f200d8973d89487f2a381d9964253b4c52319d62c9950c0989" exitCode=0 Feb 02 10:55:50 crc kubenswrapper[4901]: I0202 10:55:50.605151 4901 generic.go:334] "Generic (PLEG): container finished" 
podID="8ea5f918-b91c-4711-bb7a-83b78bb133aa" containerID="a7ebc0526473e5d9495b2410578ff6737ce5c1b87fec7279e37b57a4495ce7a1" exitCode=143 Feb 02 10:55:50 crc kubenswrapper[4901]: I0202 10:55:50.605273 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8ea5f918-b91c-4711-bb7a-83b78bb133aa","Type":"ContainerDied","Data":"7d94416932dcc7f200d8973d89487f2a381d9964253b4c52319d62c9950c0989"} Feb 02 10:55:50 crc kubenswrapper[4901]: I0202 10:55:50.605316 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8ea5f918-b91c-4711-bb7a-83b78bb133aa","Type":"ContainerDied","Data":"a7ebc0526473e5d9495b2410578ff6737ce5c1b87fec7279e37b57a4495ce7a1"} Feb 02 10:55:50 crc kubenswrapper[4901]: I0202 10:55:50.608258 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0e983416-5a7a-467d-8056-6b5d38887729","Type":"ContainerStarted","Data":"a60e364b19edb2e2512f230cd9968c217ff9bed93177ad4ab522fc8e3b482580"} Feb 02 10:55:50 crc kubenswrapper[4901]: I0202 10:55:50.608417 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="0e983416-5a7a-467d-8056-6b5d38887729" containerName="glance-log" containerID="cri-o://9ca6a8e73ddda95141efb8d2d25a475e85890135cb5ce58a05fe2671dda46259" gracePeriod=30 Feb 02 10:55:50 crc kubenswrapper[4901]: I0202 10:55:50.608591 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="0e983416-5a7a-467d-8056-6b5d38887729" containerName="glance-httpd" containerID="cri-o://a60e364b19edb2e2512f230cd9968c217ff9bed93177ad4ab522fc8e3b482580" gracePeriod=30 Feb 02 10:55:50 crc kubenswrapper[4901]: I0202 10:55:50.616185 4901 generic.go:334] "Generic (PLEG): container finished" podID="d9cf3153-081c-42c9-a525-354ccaca7abd" containerID="c719e4f31aa449f35965a9891a647d8f02ad348c38bfb40c74f4ae6985502644" exitCode=0 Feb 02 10:55:50 crc kubenswrapper[4901]: I0202 10:55:50.616232 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-spsdj" event={"ID":"d9cf3153-081c-42c9-a525-354ccaca7abd","Type":"ContainerDied","Data":"c719e4f31aa449f35965a9891a647d8f02ad348c38bfb40c74f4ae6985502644"} Feb 02 10:55:50 crc kubenswrapper[4901]: I0202 10:55:50.644659 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=10.644635236 podStartE2EDuration="10.644635236s" podCreationTimestamp="2026-02-02 10:55:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:55:50.64284499 +0000 UTC m=+1037.661185086" watchObservedRunningTime="2026-02-02 10:55:50.644635236 +0000 UTC m=+1037.662975332" Feb 02 10:55:51 crc kubenswrapper[4901]: I0202 10:55:51.631658 4901 generic.go:334] "Generic (PLEG): container finished" podID="0e983416-5a7a-467d-8056-6b5d38887729" containerID="a60e364b19edb2e2512f230cd9968c217ff9bed93177ad4ab522fc8e3b482580" exitCode=0 Feb 02 10:55:51 crc kubenswrapper[4901]: I0202 10:55:51.632125 4901 generic.go:334] "Generic (PLEG): container finished" podID="0e983416-5a7a-467d-8056-6b5d38887729" containerID="9ca6a8e73ddda95141efb8d2d25a475e85890135cb5ce58a05fe2671dda46259" exitCode=143 Feb 02 10:55:51 crc kubenswrapper[4901]: I0202 10:55:51.631725 4901 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0e983416-5a7a-467d-8056-6b5d38887729","Type":"ContainerDied","Data":"a60e364b19edb2e2512f230cd9968c217ff9bed93177ad4ab522fc8e3b482580"} Feb 02 10:55:51 crc kubenswrapper[4901]: I0202 10:55:51.632518 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0e983416-5a7a-467d-8056-6b5d38887729","Type":"ContainerDied","Data":"9ca6a8e73ddda95141efb8d2d25a475e85890135cb5ce58a05fe2671dda46259"} Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.355711 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56df8fb6b7-xnq2r" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.401170 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-spsdj" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.405939 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.432765 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-vj5hj"] Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.433022 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-vj5hj" podUID="a32397c5-9ffc-4b59-abac-4376cfb81d4a" containerName="dnsmasq-dns" containerID="cri-o://8e4489d032d6dbc762715b387bdf17e79591484090506cc386c55a253d4d7444" gracePeriod=10 Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.496631 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ea5f918-b91c-4711-bb7a-83b78bb133aa-httpd-run\") pod \"8ea5f918-b91c-4711-bb7a-83b78bb133aa\" (UID: \"8ea5f918-b91c-4711-bb7a-83b78bb133aa\") " Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.496732 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9cf3153-081c-42c9-a525-354ccaca7abd-combined-ca-bundle\") pod \"d9cf3153-081c-42c9-a525-354ccaca7abd\" (UID: \"d9cf3153-081c-42c9-a525-354ccaca7abd\") " Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.496758 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ea5f918-b91c-4711-bb7a-83b78bb133aa-scripts\") pod \"8ea5f918-b91c-4711-bb7a-83b78bb133aa\" (UID: \"8ea5f918-b91c-4711-bb7a-83b78bb133aa\") " Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.496780 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ea5f918-b91c-4711-bb7a-83b78bb133aa-config-data\") pod \"8ea5f918-b91c-4711-bb7a-83b78bb133aa\" (UID: \"8ea5f918-b91c-4711-bb7a-83b78bb133aa\") " Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.496834 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ea5f918-b91c-4711-bb7a-83b78bb133aa-combined-ca-bundle\") pod \"8ea5f918-b91c-4711-bb7a-83b78bb133aa\" (UID: \"8ea5f918-b91c-4711-bb7a-83b78bb133aa\") " Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.496887 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/d9cf3153-081c-42c9-a525-354ccaca7abd-fernet-keys\") pod \"d9cf3153-081c-42c9-a525-354ccaca7abd\" (UID: \"d9cf3153-081c-42c9-a525-354ccaca7abd\") " Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.496906 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ea5f918-b91c-4711-bb7a-83b78bb133aa-internal-tls-certs\") pod \"8ea5f918-b91c-4711-bb7a-83b78bb133aa\" (UID: \"8ea5f918-b91c-4711-bb7a-83b78bb133aa\") " Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.496965 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9cf3153-081c-42c9-a525-354ccaca7abd-scripts\") pod \"d9cf3153-081c-42c9-a525-354ccaca7abd\" (UID: \"d9cf3153-081c-42c9-a525-354ccaca7abd\") " Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.496983 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d9cf3153-081c-42c9-a525-354ccaca7abd-credential-keys\") pod \"d9cf3153-081c-42c9-a525-354ccaca7abd\" (UID: \"d9cf3153-081c-42c9-a525-354ccaca7abd\") " Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.497002 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9cf3153-081c-42c9-a525-354ccaca7abd-config-data\") pod \"d9cf3153-081c-42c9-a525-354ccaca7abd\" (UID: \"d9cf3153-081c-42c9-a525-354ccaca7abd\") " Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.497046 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"8ea5f918-b91c-4711-bb7a-83b78bb133aa\" (UID: \"8ea5f918-b91c-4711-bb7a-83b78bb133aa\") " Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.497066 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fvbz\" (UniqueName: \"kubernetes.io/projected/d9cf3153-081c-42c9-a525-354ccaca7abd-kube-api-access-9fvbz\") pod \"d9cf3153-081c-42c9-a525-354ccaca7abd\" (UID: \"d9cf3153-081c-42c9-a525-354ccaca7abd\") " Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.497117 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ea5f918-b91c-4711-bb7a-83b78bb133aa-logs\") pod \"8ea5f918-b91c-4711-bb7a-83b78bb133aa\" (UID: \"8ea5f918-b91c-4711-bb7a-83b78bb133aa\") " Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.497140 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgcn2\" (UniqueName: \"kubernetes.io/projected/8ea5f918-b91c-4711-bb7a-83b78bb133aa-kube-api-access-mgcn2\") pod \"8ea5f918-b91c-4711-bb7a-83b78bb133aa\" (UID: \"8ea5f918-b91c-4711-bb7a-83b78bb133aa\") " Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.498499 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ea5f918-b91c-4711-bb7a-83b78bb133aa-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8ea5f918-b91c-4711-bb7a-83b78bb133aa" (UID: "8ea5f918-b91c-4711-bb7a-83b78bb133aa"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.501614 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ea5f918-b91c-4711-bb7a-83b78bb133aa-logs" (OuterVolumeSpecName: "logs") pod "8ea5f918-b91c-4711-bb7a-83b78bb133aa" (UID: "8ea5f918-b91c-4711-bb7a-83b78bb133aa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.507362 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9cf3153-081c-42c9-a525-354ccaca7abd-scripts" (OuterVolumeSpecName: "scripts") pod "d9cf3153-081c-42c9-a525-354ccaca7abd" (UID: "d9cf3153-081c-42c9-a525-354ccaca7abd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.508467 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9cf3153-081c-42c9-a525-354ccaca7abd-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d9cf3153-081c-42c9-a525-354ccaca7abd" (UID: "d9cf3153-081c-42c9-a525-354ccaca7abd"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.508643 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9cf3153-081c-42c9-a525-354ccaca7abd-kube-api-access-9fvbz" (OuterVolumeSpecName: "kube-api-access-9fvbz") pod "d9cf3153-081c-42c9-a525-354ccaca7abd" (UID: "d9cf3153-081c-42c9-a525-354ccaca7abd"). InnerVolumeSpecName "kube-api-access-9fvbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.510962 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ea5f918-b91c-4711-bb7a-83b78bb133aa-kube-api-access-mgcn2" (OuterVolumeSpecName: "kube-api-access-mgcn2") pod "8ea5f918-b91c-4711-bb7a-83b78bb133aa" (UID: "8ea5f918-b91c-4711-bb7a-83b78bb133aa"). InnerVolumeSpecName "kube-api-access-mgcn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.512808 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "8ea5f918-b91c-4711-bb7a-83b78bb133aa" (UID: "8ea5f918-b91c-4711-bb7a-83b78bb133aa"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.513683 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9cf3153-081c-42c9-a525-354ccaca7abd-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d9cf3153-081c-42c9-a525-354ccaca7abd" (UID: "d9cf3153-081c-42c9-a525-354ccaca7abd"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.518805 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ea5f918-b91c-4711-bb7a-83b78bb133aa-scripts" (OuterVolumeSpecName: "scripts") pod "8ea5f918-b91c-4711-bb7a-83b78bb133aa" (UID: "8ea5f918-b91c-4711-bb7a-83b78bb133aa"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.563731 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9cf3153-081c-42c9-a525-354ccaca7abd-config-data" (OuterVolumeSpecName: "config-data") pod "d9cf3153-081c-42c9-a525-354ccaca7abd" (UID: "d9cf3153-081c-42c9-a525-354ccaca7abd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.599922 4901 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d9cf3153-081c-42c9-a525-354ccaca7abd-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.599956 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9cf3153-081c-42c9-a525-354ccaca7abd-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.599966 4901 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d9cf3153-081c-42c9-a525-354ccaca7abd-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.599977 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9cf3153-081c-42c9-a525-354ccaca7abd-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.600006 4901 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.600017 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fvbz\" (UniqueName: \"kubernetes.io/projected/d9cf3153-081c-42c9-a525-354ccaca7abd-kube-api-access-9fvbz\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.600029 4901 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ea5f918-b91c-4711-bb7a-83b78bb133aa-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.600037 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgcn2\" (UniqueName: \"kubernetes.io/projected/8ea5f918-b91c-4711-bb7a-83b78bb133aa-kube-api-access-mgcn2\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.600045 4901 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ea5f918-b91c-4711-bb7a-83b78bb133aa-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.600052 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ea5f918-b91c-4711-bb7a-83b78bb133aa-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.608675 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ea5f918-b91c-4711-bb7a-83b78bb133aa-config-data" (OuterVolumeSpecName: "config-data") pod "8ea5f918-b91c-4711-bb7a-83b78bb133aa" (UID: "8ea5f918-b91c-4711-bb7a-83b78bb133aa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.627606 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ea5f918-b91c-4711-bb7a-83b78bb133aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ea5f918-b91c-4711-bb7a-83b78bb133aa" (UID: "8ea5f918-b91c-4711-bb7a-83b78bb133aa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.630590 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ea5f918-b91c-4711-bb7a-83b78bb133aa-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8ea5f918-b91c-4711-bb7a-83b78bb133aa" (UID: "8ea5f918-b91c-4711-bb7a-83b78bb133aa"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.655623 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9cf3153-081c-42c9-a525-354ccaca7abd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d9cf3153-081c-42c9-a525-354ccaca7abd" (UID: "d9cf3153-081c-42c9-a525-354ccaca7abd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.659134 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-spsdj" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.659355 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-spsdj" event={"ID":"d9cf3153-081c-42c9-a525-354ccaca7abd","Type":"ContainerDied","Data":"30b315d6b9bf07f2c981791a57698b164ec4684dd9c8dd35b5ff6ab4366744e4"} Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.659436 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30b315d6b9bf07f2c981791a57698b164ec4684dd9c8dd35b5ff6ab4366744e4" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.663746 4901 generic.go:334] "Generic (PLEG): container finished" podID="a32397c5-9ffc-4b59-abac-4376cfb81d4a" containerID="8e4489d032d6dbc762715b387bdf17e79591484090506cc386c55a253d4d7444" exitCode=0 Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.663817 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-vj5hj" event={"ID":"a32397c5-9ffc-4b59-abac-4376cfb81d4a","Type":"ContainerDied","Data":"8e4489d032d6dbc762715b387bdf17e79591484090506cc386c55a253d4d7444"} Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.670082 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8ea5f918-b91c-4711-bb7a-83b78bb133aa","Type":"ContainerDied","Data":"5b3fa446f4a5c7f000b8759d66b8d9ac058149a75aea42288a2e0dd7381f3f06"} Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.670142 4901 scope.go:117] "RemoveContainer" containerID="7d94416932dcc7f200d8973d89487f2a381d9964253b4c52319d62c9950c0989" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.670181 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.673612 4901 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.702790 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9cf3153-081c-42c9-a525-354ccaca7abd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.703297 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ea5f918-b91c-4711-bb7a-83b78bb133aa-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.703370 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ea5f918-b91c-4711-bb7a-83b78bb133aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.703430 4901 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ea5f918-b91c-4711-bb7a-83b78bb133aa-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.703485 4901 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.783624 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.799704 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.814639 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 10:55:52 crc kubenswrapper[4901]: E0202 10:55:52.815069 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9cf3153-081c-42c9-a525-354ccaca7abd" containerName="keystone-bootstrap" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.815082 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9cf3153-081c-42c9-a525-354ccaca7abd" containerName="keystone-bootstrap" Feb 02 10:55:52 crc kubenswrapper[4901]: E0202 10:55:52.815101 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ea5f918-b91c-4711-bb7a-83b78bb133aa" containerName="glance-httpd" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.815107 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ea5f918-b91c-4711-bb7a-83b78bb133aa" containerName="glance-httpd" Feb 02 10:55:52 crc kubenswrapper[4901]: E0202 10:55:52.815129 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2b8b88a-99e5-4c15-b293-5f2fa80f0a07" containerName="dnsmasq-dns" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.815135 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2b8b88a-99e5-4c15-b293-5f2fa80f0a07" containerName="dnsmasq-dns" Feb 02 10:55:52 crc kubenswrapper[4901]: E0202 10:55:52.815150 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2b8b88a-99e5-4c15-b293-5f2fa80f0a07" containerName="init" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.815159 
4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2b8b88a-99e5-4c15-b293-5f2fa80f0a07" containerName="init" Feb 02 10:55:52 crc kubenswrapper[4901]: E0202 10:55:52.815170 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="674534db-de83-484c-9876-6eef192987b6" containerName="init" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.815179 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="674534db-de83-484c-9876-6eef192987b6" containerName="init" Feb 02 10:55:52 crc kubenswrapper[4901]: E0202 10:55:52.815193 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ea5f918-b91c-4711-bb7a-83b78bb133aa" containerName="glance-log" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.815199 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ea5f918-b91c-4711-bb7a-83b78bb133aa" containerName="glance-log" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.815356 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2b8b88a-99e5-4c15-b293-5f2fa80f0a07" containerName="dnsmasq-dns" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.815365 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ea5f918-b91c-4711-bb7a-83b78bb133aa" containerName="glance-httpd" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.815379 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="674534db-de83-484c-9876-6eef192987b6" containerName="init" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.815387 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ea5f918-b91c-4711-bb7a-83b78bb133aa" containerName="glance-log" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.815397 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9cf3153-081c-42c9-a525-354ccaca7abd" containerName="keystone-bootstrap" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.816313 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.820638 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.822056 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.824110 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.854954 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-spsdj"] Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.904866 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-spsdj"] Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.917492 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bcb9c059-9f39-4478-9b04-9417b05f4bef-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bcb9c059-9f39-4478-9b04-9417b05f4bef\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.917624 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcb9c059-9f39-4478-9b04-9417b05f4bef-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bcb9c059-9f39-4478-9b04-9417b05f4bef\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.917684 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcb9c059-9f39-4478-9b04-9417b05f4bef-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bcb9c059-9f39-4478-9b04-9417b05f4bef\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.917844 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"bcb9c059-9f39-4478-9b04-9417b05f4bef\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.919602 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcb9c059-9f39-4478-9b04-9417b05f4bef-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bcb9c059-9f39-4478-9b04-9417b05f4bef\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.919802 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cgrd\" (UniqueName: \"kubernetes.io/projected/bcb9c059-9f39-4478-9b04-9417b05f4bef-kube-api-access-6cgrd\") pod \"glance-default-internal-api-0\" (UID: \"bcb9c059-9f39-4478-9b04-9417b05f4bef\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.920072 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/bcb9c059-9f39-4478-9b04-9417b05f4bef-logs\") pod \"glance-default-internal-api-0\" (UID: \"bcb9c059-9f39-4478-9b04-9417b05f4bef\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.920117 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcb9c059-9f39-4478-9b04-9417b05f4bef-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bcb9c059-9f39-4478-9b04-9417b05f4bef\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.958709 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-8v2wt"] Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.961018 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8v2wt" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.967461 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8v2wt"] Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.969423 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.969535 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nx7b8" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.969733 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.969763 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 02 10:55:52 crc kubenswrapper[4901]: I0202 10:55:52.970176 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 02 10:55:53 crc kubenswrapper[4901]: I0202 10:55:53.023494 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cgrd\" (UniqueName: \"kubernetes.io/projected/bcb9c059-9f39-4478-9b04-9417b05f4bef-kube-api-access-6cgrd\") pod \"glance-default-internal-api-0\" (UID: \"bcb9c059-9f39-4478-9b04-9417b05f4bef\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:55:53 crc kubenswrapper[4901]: I0202 10:55:53.023621 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcb9c059-9f39-4478-9b04-9417b05f4bef-logs\") pod \"glance-default-internal-api-0\" (UID: \"bcb9c059-9f39-4478-9b04-9417b05f4bef\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:55:53 crc kubenswrapper[4901]: I0202 10:55:53.023653 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcb9c059-9f39-4478-9b04-9417b05f4bef-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bcb9c059-9f39-4478-9b04-9417b05f4bef\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:55:53 crc kubenswrapper[4901]: I0202 10:55:53.023731 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bcb9c059-9f39-4478-9b04-9417b05f4bef-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bcb9c059-9f39-4478-9b04-9417b05f4bef\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:55:53 crc kubenswrapper[4901]: I0202 10:55:53.023777 4901 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/093b2698-02da-479f-8d78-59e99a88d7c9-config-data\") pod \"keystone-bootstrap-8v2wt\" (UID: \"093b2698-02da-479f-8d78-59e99a88d7c9\") " pod="openstack/keystone-bootstrap-8v2wt"
Feb 02 10:55:53 crc kubenswrapper[4901]: I0202 10:55:53.023805 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcb9c059-9f39-4478-9b04-9417b05f4bef-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bcb9c059-9f39-4478-9b04-9417b05f4bef\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:55:53 crc kubenswrapper[4901]: I0202 10:55:53.023830 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/093b2698-02da-479f-8d78-59e99a88d7c9-scripts\") pod \"keystone-bootstrap-8v2wt\" (UID: \"093b2698-02da-479f-8d78-59e99a88d7c9\") " pod="openstack/keystone-bootstrap-8v2wt"
Feb 02 10:55:53 crc kubenswrapper[4901]: I0202 10:55:53.023856 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcb9c059-9f39-4478-9b04-9417b05f4bef-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bcb9c059-9f39-4478-9b04-9417b05f4bef\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:55:53 crc kubenswrapper[4901]: I0202 10:55:53.023890 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss646\" (UniqueName: \"kubernetes.io/projected/093b2698-02da-479f-8d78-59e99a88d7c9-kube-api-access-ss646\") pod \"keystone-bootstrap-8v2wt\" (UID: \"093b2698-02da-479f-8d78-59e99a88d7c9\") " pod="openstack/keystone-bootstrap-8v2wt"
Feb 02 10:55:53 crc kubenswrapper[4901]: I0202 10:55:53.023931 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/093b2698-02da-479f-8d78-59e99a88d7c9-combined-ca-bundle\") pod \"keystone-bootstrap-8v2wt\" (UID: \"093b2698-02da-479f-8d78-59e99a88d7c9\") " pod="openstack/keystone-bootstrap-8v2wt"
Feb 02 10:55:53 crc kubenswrapper[4901]: I0202 10:55:53.023971 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/093b2698-02da-479f-8d78-59e99a88d7c9-fernet-keys\") pod \"keystone-bootstrap-8v2wt\" (UID: \"093b2698-02da-479f-8d78-59e99a88d7c9\") " pod="openstack/keystone-bootstrap-8v2wt"
Feb 02 10:55:53 crc kubenswrapper[4901]: I0202 10:55:53.024000 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"bcb9c059-9f39-4478-9b04-9417b05f4bef\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:55:53 crc kubenswrapper[4901]: I0202 10:55:53.024033 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/093b2698-02da-479f-8d78-59e99a88d7c9-credential-keys\") pod \"keystone-bootstrap-8v2wt\" (UID: \"093b2698-02da-479f-8d78-59e99a88d7c9\") " pod="openstack/keystone-bootstrap-8v2wt"
Feb 02 10:55:53 crc kubenswrapper[4901]: I0202 10:55:53.024055 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcb9c059-9f39-4478-9b04-9417b05f4bef-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bcb9c059-9f39-4478-9b04-9417b05f4bef\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:55:53 crc kubenswrapper[4901]: I0202 10:55:53.024163 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcb9c059-9f39-4478-9b04-9417b05f4bef-logs\") pod \"glance-default-internal-api-0\" (UID: \"bcb9c059-9f39-4478-9b04-9417b05f4bef\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:55:53 crc kubenswrapper[4901]: I0202 10:55:53.024446 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bcb9c059-9f39-4478-9b04-9417b05f4bef-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bcb9c059-9f39-4478-9b04-9417b05f4bef\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:55:53 crc kubenswrapper[4901]: I0202 10:55:53.024905 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"bcb9c059-9f39-4478-9b04-9417b05f4bef\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0"
Feb 02 10:55:53 crc kubenswrapper[4901]: I0202 10:55:53.030425 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcb9c059-9f39-4478-9b04-9417b05f4bef-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bcb9c059-9f39-4478-9b04-9417b05f4bef\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:55:53 crc kubenswrapper[4901]: I0202 10:55:53.030550 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcb9c059-9f39-4478-9b04-9417b05f4bef-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bcb9c059-9f39-4478-9b04-9417b05f4bef\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:55:53 crc kubenswrapper[4901]: I0202 10:55:53.030606 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcb9c059-9f39-4478-9b04-9417b05f4bef-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bcb9c059-9f39-4478-9b04-9417b05f4bef\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:55:53 crc kubenswrapper[4901]: I0202 10:55:53.033332 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcb9c059-9f39-4478-9b04-9417b05f4bef-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bcb9c059-9f39-4478-9b04-9417b05f4bef\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:55:53 crc kubenswrapper[4901]: I0202 10:55:53.044596 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cgrd\" (UniqueName: \"kubernetes.io/projected/bcb9c059-9f39-4478-9b04-9417b05f4bef-kube-api-access-6cgrd\") pod \"glance-default-internal-api-0\" (UID: \"bcb9c059-9f39-4478-9b04-9417b05f4bef\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:55:53 crc kubenswrapper[4901]: I0202 10:55:53.059603 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"bcb9c059-9f39-4478-9b04-9417b05f4bef\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:55:53 crc kubenswrapper[4901]: I0202 10:55:53.126009 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/093b2698-02da-479f-8d78-59e99a88d7c9-config-data\") pod \"keystone-bootstrap-8v2wt\" (UID: \"093b2698-02da-479f-8d78-59e99a88d7c9\") " pod="openstack/keystone-bootstrap-8v2wt"
Feb 02 10:55:53 crc kubenswrapper[4901]: I0202 10:55:53.126071 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/093b2698-02da-479f-8d78-59e99a88d7c9-scripts\") pod \"keystone-bootstrap-8v2wt\" (UID: \"093b2698-02da-479f-8d78-59e99a88d7c9\") " pod="openstack/keystone-bootstrap-8v2wt"
Feb 02 10:55:53 crc kubenswrapper[4901]: I0202 10:55:53.126113 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss646\" (UniqueName: \"kubernetes.io/projected/093b2698-02da-479f-8d78-59e99a88d7c9-kube-api-access-ss646\") pod \"keystone-bootstrap-8v2wt\" (UID: \"093b2698-02da-479f-8d78-59e99a88d7c9\") " pod="openstack/keystone-bootstrap-8v2wt"
Feb 02 10:55:53 crc kubenswrapper[4901]: I0202 10:55:53.126149 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/093b2698-02da-479f-8d78-59e99a88d7c9-combined-ca-bundle\") pod \"keystone-bootstrap-8v2wt\" (UID: \"093b2698-02da-479f-8d78-59e99a88d7c9\") " pod="openstack/keystone-bootstrap-8v2wt"
Feb 02 10:55:53 crc kubenswrapper[4901]: I0202 10:55:53.126184 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/093b2698-02da-479f-8d78-59e99a88d7c9-fernet-keys\") pod \"keystone-bootstrap-8v2wt\" (UID: \"093b2698-02da-479f-8d78-59e99a88d7c9\") " pod="openstack/keystone-bootstrap-8v2wt"
Feb 02 10:55:53 crc kubenswrapper[4901]: I0202 10:55:53.126238 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/093b2698-02da-479f-8d78-59e99a88d7c9-credential-keys\") pod \"keystone-bootstrap-8v2wt\" (UID: \"093b2698-02da-479f-8d78-59e99a88d7c9\") " pod="openstack/keystone-bootstrap-8v2wt"
Feb 02 10:55:53 crc kubenswrapper[4901]: I0202 10:55:53.131090 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/093b2698-02da-479f-8d78-59e99a88d7c9-fernet-keys\") pod \"keystone-bootstrap-8v2wt\" (UID: \"093b2698-02da-479f-8d78-59e99a88d7c9\") " pod="openstack/keystone-bootstrap-8v2wt"
Feb 02 10:55:53 crc kubenswrapper[4901]: I0202 10:55:53.133205 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/093b2698-02da-479f-8d78-59e99a88d7c9-combined-ca-bundle\") pod \"keystone-bootstrap-8v2wt\" (UID: \"093b2698-02da-479f-8d78-59e99a88d7c9\") " pod="openstack/keystone-bootstrap-8v2wt"
Feb 02 10:55:53 crc kubenswrapper[4901]: I0202 10:55:53.134196 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/093b2698-02da-479f-8d78-59e99a88d7c9-scripts\") pod \"keystone-bootstrap-8v2wt\" (UID: \"093b2698-02da-479f-8d78-59e99a88d7c9\") " pod="openstack/keystone-bootstrap-8v2wt"
Feb 02 10:55:53 crc kubenswrapper[4901]: I0202 10:55:53.135701 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/093b2698-02da-479f-8d78-59e99a88d7c9-credential-keys\") pod \"keystone-bootstrap-8v2wt\" (UID: \"093b2698-02da-479f-8d78-59e99a88d7c9\") " pod="openstack/keystone-bootstrap-8v2wt"
Feb 02 10:55:53 crc kubenswrapper[4901]: I0202 10:55:53.138210 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/093b2698-02da-479f-8d78-59e99a88d7c9-config-data\") pod \"keystone-bootstrap-8v2wt\" (UID: \"093b2698-02da-479f-8d78-59e99a88d7c9\") " pod="openstack/keystone-bootstrap-8v2wt"
Feb 02 10:55:53 crc kubenswrapper[4901]: I0202 10:55:53.148925 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss646\" (UniqueName: \"kubernetes.io/projected/093b2698-02da-479f-8d78-59e99a88d7c9-kube-api-access-ss646\") pod \"keystone-bootstrap-8v2wt\" (UID: \"093b2698-02da-479f-8d78-59e99a88d7c9\") " pod="openstack/keystone-bootstrap-8v2wt"
Feb 02 10:55:53 crc kubenswrapper[4901]: I0202 10:55:53.154876 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 02 10:55:53 crc kubenswrapper[4901]: I0202 10:55:53.292325 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8v2wt"
Feb 02 10:55:53 crc kubenswrapper[4901]: I0202 10:55:53.691376 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ea5f918-b91c-4711-bb7a-83b78bb133aa" path="/var/lib/kubelet/pods/8ea5f918-b91c-4711-bb7a-83b78bb133aa/volumes"
Feb 02 10:55:53 crc kubenswrapper[4901]: I0202 10:55:53.692754 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9cf3153-081c-42c9-a525-354ccaca7abd" path="/var/lib/kubelet/pods/d9cf3153-081c-42c9-a525-354ccaca7abd/volumes"
Feb 02 10:55:55 crc kubenswrapper[4901]: I0202 10:55:55.009175 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-vj5hj" podUID="a32397c5-9ffc-4b59-abac-4376cfb81d4a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: connect: connection refused"
Feb 02 10:56:00 crc kubenswrapper[4901]: I0202 10:56:00.010260 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-vj5hj" podUID="a32397c5-9ffc-4b59-abac-4376cfb81d4a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: connect: connection refused"
Feb 02 10:56:04 crc kubenswrapper[4901]: I0202 10:56:04.204272 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 02 10:56:04 crc kubenswrapper[4901]: I0202 10:56:04.387442 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e983416-5a7a-467d-8056-6b5d38887729-config-data\") pod \"0e983416-5a7a-467d-8056-6b5d38887729\" (UID: \"0e983416-5a7a-467d-8056-6b5d38887729\") "
Feb 02 10:56:04 crc kubenswrapper[4901]: I0202 10:56:04.387516 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"0e983416-5a7a-467d-8056-6b5d38887729\" (UID: \"0e983416-5a7a-467d-8056-6b5d38887729\") "
Feb 02 10:56:04 crc kubenswrapper[4901]: I0202 10:56:04.387599 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0e983416-5a7a-467d-8056-6b5d38887729-httpd-run\") pod \"0e983416-5a7a-467d-8056-6b5d38887729\" (UID: \"0e983416-5a7a-467d-8056-6b5d38887729\") "
Feb 02 10:56:04 crc kubenswrapper[4901]: I0202 10:56:04.387627 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l42bp\" (UniqueName: \"kubernetes.io/projected/0e983416-5a7a-467d-8056-6b5d38887729-kube-api-access-l42bp\") pod \"0e983416-5a7a-467d-8056-6b5d38887729\" (UID: \"0e983416-5a7a-467d-8056-6b5d38887729\") "
Feb 02 10:56:04 crc kubenswrapper[4901]: I0202 10:56:04.387699 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e983416-5a7a-467d-8056-6b5d38887729-scripts\") pod \"0e983416-5a7a-467d-8056-6b5d38887729\" (UID: \"0e983416-5a7a-467d-8056-6b5d38887729\") "
Feb 02 10:56:04 crc kubenswrapper[4901]: I0202 10:56:04.387748 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e983416-5a7a-467d-8056-6b5d38887729-public-tls-certs\") pod \"0e983416-5a7a-467d-8056-6b5d38887729\" (UID: \"0e983416-5a7a-467d-8056-6b5d38887729\") "
Feb 02 10:56:04 crc kubenswrapper[4901]: I0202 10:56:04.387774 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e983416-5a7a-467d-8056-6b5d38887729-combined-ca-bundle\") pod \"0e983416-5a7a-467d-8056-6b5d38887729\" (UID: \"0e983416-5a7a-467d-8056-6b5d38887729\") "
Feb 02 10:56:04 crc kubenswrapper[4901]: I0202 10:56:04.387835 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e983416-5a7a-467d-8056-6b5d38887729-logs\") pod \"0e983416-5a7a-467d-8056-6b5d38887729\" (UID: \"0e983416-5a7a-467d-8056-6b5d38887729\") "
Feb 02 10:56:04 crc kubenswrapper[4901]: I0202 10:56:04.388899 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e983416-5a7a-467d-8056-6b5d38887729-logs" (OuterVolumeSpecName: "logs") pod "0e983416-5a7a-467d-8056-6b5d38887729" (UID: "0e983416-5a7a-467d-8056-6b5d38887729"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 10:56:04 crc kubenswrapper[4901]: I0202 10:56:04.395694 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e983416-5a7a-467d-8056-6b5d38887729-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0e983416-5a7a-467d-8056-6b5d38887729" (UID: "0e983416-5a7a-467d-8056-6b5d38887729"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 10:56:04 crc kubenswrapper[4901]: I0202 10:56:04.399111 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e983416-5a7a-467d-8056-6b5d38887729-kube-api-access-l42bp" (OuterVolumeSpecName: "kube-api-access-l42bp") pod "0e983416-5a7a-467d-8056-6b5d38887729" (UID: "0e983416-5a7a-467d-8056-6b5d38887729"). InnerVolumeSpecName "kube-api-access-l42bp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:56:04 crc kubenswrapper[4901]: I0202 10:56:04.416132 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "0e983416-5a7a-467d-8056-6b5d38887729" (UID: "0e983416-5a7a-467d-8056-6b5d38887729"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 02 10:56:04 crc kubenswrapper[4901]: I0202 10:56:04.432302 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e983416-5a7a-467d-8056-6b5d38887729-scripts" (OuterVolumeSpecName: "scripts") pod "0e983416-5a7a-467d-8056-6b5d38887729" (UID: "0e983416-5a7a-467d-8056-6b5d38887729"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:56:04 crc kubenswrapper[4901]: I0202 10:56:04.432383 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e983416-5a7a-467d-8056-6b5d38887729-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e983416-5a7a-467d-8056-6b5d38887729" (UID: "0e983416-5a7a-467d-8056-6b5d38887729"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:56:04 crc kubenswrapper[4901]: I0202 10:56:04.448135 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e983416-5a7a-467d-8056-6b5d38887729-config-data" (OuterVolumeSpecName: "config-data") pod "0e983416-5a7a-467d-8056-6b5d38887729" (UID: "0e983416-5a7a-467d-8056-6b5d38887729"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:56:04 crc kubenswrapper[4901]: I0202 10:56:04.462608 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e983416-5a7a-467d-8056-6b5d38887729-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0e983416-5a7a-467d-8056-6b5d38887729" (UID: "0e983416-5a7a-467d-8056-6b5d38887729"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:56:04 crc kubenswrapper[4901]: I0202 10:56:04.498068 4901 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0e983416-5a7a-467d-8056-6b5d38887729-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:04 crc kubenswrapper[4901]: I0202 10:56:04.498099 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l42bp\" (UniqueName: \"kubernetes.io/projected/0e983416-5a7a-467d-8056-6b5d38887729-kube-api-access-l42bp\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:04 crc kubenswrapper[4901]: I0202 10:56:04.498111 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e983416-5a7a-467d-8056-6b5d38887729-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:04 crc kubenswrapper[4901]: I0202 10:56:04.498122 4901 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e983416-5a7a-467d-8056-6b5d38887729-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:04 crc kubenswrapper[4901]: I0202 10:56:04.498131 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e983416-5a7a-467d-8056-6b5d38887729-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:04 crc kubenswrapper[4901]: I0202 10:56:04.498140 4901 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e983416-5a7a-467d-8056-6b5d38887729-logs\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:04 crc kubenswrapper[4901]: I0202 10:56:04.498148 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e983416-5a7a-467d-8056-6b5d38887729-config-data\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:04 crc kubenswrapper[4901]: I0202 10:56:04.498198 4901 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" "
Feb 02 10:56:04 crc kubenswrapper[4901]: I0202 10:56:04.517265 4901 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc"
Feb 02 10:56:04 crc kubenswrapper[4901]: I0202 10:56:04.599490 4901 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:04 crc kubenswrapper[4901]: I0202 10:56:04.816152 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0e983416-5a7a-467d-8056-6b5d38887729","Type":"ContainerDied","Data":"e94ec4d71ad313e5522fbf02b8c5197dff71f9ce9a22d75cb212b1e964f15a4f"}
Feb 02 10:56:04 crc kubenswrapper[4901]: I0202 10:56:04.816255 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 02 10:56:04 crc kubenswrapper[4901]: I0202 10:56:04.881514 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 02 10:56:04 crc kubenswrapper[4901]: I0202 10:56:04.903486 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 02 10:56:04 crc kubenswrapper[4901]: I0202 10:56:04.913938 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 02 10:56:04 crc kubenswrapper[4901]: E0202 10:56:04.914538 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e983416-5a7a-467d-8056-6b5d38887729" containerName="glance-httpd"
Feb 02 10:56:04 crc kubenswrapper[4901]: I0202 10:56:04.914596 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e983416-5a7a-467d-8056-6b5d38887729" containerName="glance-httpd"
Feb 02 10:56:04 crc kubenswrapper[4901]: E0202 10:56:04.914660 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e983416-5a7a-467d-8056-6b5d38887729" containerName="glance-log"
Feb 02 10:56:04 crc kubenswrapper[4901]: I0202 10:56:04.914679 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e983416-5a7a-467d-8056-6b5d38887729" containerName="glance-log"
Feb 02 10:56:04 crc kubenswrapper[4901]: I0202 10:56:04.914921 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e983416-5a7a-467d-8056-6b5d38887729" containerName="glance-httpd"
Feb 02 10:56:04 crc kubenswrapper[4901]: I0202 10:56:04.914964 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e983416-5a7a-467d-8056-6b5d38887729" containerName="glance-log"
Feb 02 10:56:04 crc kubenswrapper[4901]: I0202 10:56:04.916403 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 02 10:56:04 crc kubenswrapper[4901]: I0202 10:56:04.920069 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Feb 02 10:56:04 crc kubenswrapper[4901]: I0202 10:56:04.920283 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Feb 02 10:56:04 crc kubenswrapper[4901]: I0202 10:56:04.924233 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 02 10:56:05 crc kubenswrapper[4901]: I0202 10:56:05.108579 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfq4v\" (UniqueName: \"kubernetes.io/projected/e2c6c553-9669-4fc0-a72b-9a528764e7a8-kube-api-access-dfq4v\") pod \"glance-default-external-api-0\" (UID: \"e2c6c553-9669-4fc0-a72b-9a528764e7a8\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:56:05 crc kubenswrapper[4901]: I0202 10:56:05.108641 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"e2c6c553-9669-4fc0-a72b-9a528764e7a8\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:56:05 crc kubenswrapper[4901]: I0202 10:56:05.108676 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2c6c553-9669-4fc0-a72b-9a528764e7a8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e2c6c553-9669-4fc0-a72b-9a528764e7a8\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:56:05 crc kubenswrapper[4901]: I0202 10:56:05.108699 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2c6c553-9669-4fc0-a72b-9a528764e7a8-config-data\") pod \"glance-default-external-api-0\" (UID: \"e2c6c553-9669-4fc0-a72b-9a528764e7a8\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:56:05 crc kubenswrapper[4901]: I0202 10:56:05.108729 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2c6c553-9669-4fc0-a72b-9a528764e7a8-scripts\") pod \"glance-default-external-api-0\" (UID: \"e2c6c553-9669-4fc0-a72b-9a528764e7a8\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:56:05 crc kubenswrapper[4901]: I0202 10:56:05.108797 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2c6c553-9669-4fc0-a72b-9a528764e7a8-logs\") pod \"glance-default-external-api-0\" (UID: \"e2c6c553-9669-4fc0-a72b-9a528764e7a8\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:56:05 crc kubenswrapper[4901]: I0202 10:56:05.108831 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e2c6c553-9669-4fc0-a72b-9a528764e7a8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e2c6c553-9669-4fc0-a72b-9a528764e7a8\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:56:05 crc kubenswrapper[4901]: I0202 10:56:05.108937 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c6c553-9669-4fc0-a72b-9a528764e7a8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e2c6c553-9669-4fc0-a72b-9a528764e7a8\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:56:05 crc kubenswrapper[4901]: I0202 10:56:05.211291 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2c6c553-9669-4fc0-a72b-9a528764e7a8-config-data\") pod \"glance-default-external-api-0\" (UID: \"e2c6c553-9669-4fc0-a72b-9a528764e7a8\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:56:05 crc kubenswrapper[4901]: I0202 10:56:05.211362 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2c6c553-9669-4fc0-a72b-9a528764e7a8-scripts\") pod \"glance-default-external-api-0\" (UID: \"e2c6c553-9669-4fc0-a72b-9a528764e7a8\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:56:05 crc kubenswrapper[4901]: I0202 10:56:05.211398 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2c6c553-9669-4fc0-a72b-9a528764e7a8-logs\") pod \"glance-default-external-api-0\" (UID: \"e2c6c553-9669-4fc0-a72b-9a528764e7a8\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:56:05 crc kubenswrapper[4901]: I0202 10:56:05.211422 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e2c6c553-9669-4fc0-a72b-9a528764e7a8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e2c6c553-9669-4fc0-a72b-9a528764e7a8\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:56:05 crc kubenswrapper[4901]: I0202 10:56:05.211475 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c6c553-9669-4fc0-a72b-9a528764e7a8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e2c6c553-9669-4fc0-a72b-9a528764e7a8\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:56:05 crc kubenswrapper[4901]: I0202 10:56:05.211540 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfq4v\" (UniqueName: \"kubernetes.io/projected/e2c6c553-9669-4fc0-a72b-9a528764e7a8-kube-api-access-dfq4v\") pod \"glance-default-external-api-0\" (UID: \"e2c6c553-9669-4fc0-a72b-9a528764e7a8\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:56:05 crc kubenswrapper[4901]: I0202 10:56:05.211574 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"e2c6c553-9669-4fc0-a72b-9a528764e7a8\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:56:05 crc kubenswrapper[4901]: I0202 10:56:05.211599 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2c6c553-9669-4fc0-a72b-9a528764e7a8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e2c6c553-9669-4fc0-a72b-9a528764e7a8\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:56:05 crc kubenswrapper[4901]: I0202 10:56:05.211971 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"e2c6c553-9669-4fc0-a72b-9a528764e7a8\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0"
Feb 02 10:56:05 crc kubenswrapper[4901]: I0202 10:56:05.212337 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e2c6c553-9669-4fc0-a72b-9a528764e7a8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e2c6c553-9669-4fc0-a72b-9a528764e7a8\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:56:05 crc kubenswrapper[4901]: I0202 10:56:05.212491 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2c6c553-9669-4fc0-a72b-9a528764e7a8-logs\") pod \"glance-default-external-api-0\" (UID: \"e2c6c553-9669-4fc0-a72b-9a528764e7a8\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:56:05 crc kubenswrapper[4901]: I0202 10:56:05.232021 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2c6c553-9669-4fc0-a72b-9a528764e7a8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e2c6c553-9669-4fc0-a72b-9a528764e7a8\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:56:05 crc kubenswrapper[4901]: I0202 10:56:05.234208 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c6c553-9669-4fc0-a72b-9a528764e7a8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e2c6c553-9669-4fc0-a72b-9a528764e7a8\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:56:05 crc kubenswrapper[4901]: I0202 10:56:05.234361 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2c6c553-9669-4fc0-a72b-9a528764e7a8-config-data\") pod \"glance-default-external-api-0\" (UID: \"e2c6c553-9669-4fc0-a72b-9a528764e7a8\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:56:05 crc kubenswrapper[4901]: I0202 10:56:05.234759 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2c6c553-9669-4fc0-a72b-9a528764e7a8-scripts\") pod \"glance-default-external-api-0\" (UID: \"e2c6c553-9669-4fc0-a72b-9a528764e7a8\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:56:05 crc kubenswrapper[4901]: I0202 10:56:05.234821 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfq4v\" (UniqueName: \"kubernetes.io/projected/e2c6c553-9669-4fc0-a72b-9a528764e7a8-kube-api-access-dfq4v\") pod \"glance-default-external-api-0\" (UID: \"e2c6c553-9669-4fc0-a72b-9a528764e7a8\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:56:05 crc kubenswrapper[4901]: I0202 10:56:05.243076 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"e2c6c553-9669-4fc0-a72b-9a528764e7a8\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:56:05 crc kubenswrapper[4901]: I0202 10:56:05.267530 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 02 10:56:05 crc kubenswrapper[4901]: I0202 10:56:05.705228 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e983416-5a7a-467d-8056-6b5d38887729" path="/var/lib/kubelet/pods/0e983416-5a7a-467d-8056-6b5d38887729/volumes"
Feb 02 10:56:06 crc kubenswrapper[4901]: I0202 10:56:06.837618 4901 generic.go:334] "Generic (PLEG): container finished" podID="4426b588-ce3e-4184-bc0b-82f17522dd01" containerID="befca0c3509f895cde608403ea7f89e7f0d5c37633d6304f56f765dd55b33a99" exitCode=0
Feb 02 10:56:06 crc kubenswrapper[4901]: I0202 10:56:06.837722 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-h9fw8" event={"ID":"4426b588-ce3e-4184-bc0b-82f17522dd01","Type":"ContainerDied","Data":"befca0c3509f895cde608403ea7f89e7f0d5c37633d6304f56f765dd55b33a99"}
Feb 02 10:56:07 crc kubenswrapper[4901]: I0202 10:56:07.838166 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 02 10:56:07 crc kubenswrapper[4901]: I0202 10:56:07.838647 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 02 10:56:07 crc kubenswrapper[4901]: I0202 10:56:07.838719 4901 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f29d8"
Feb 02 10:56:07 crc kubenswrapper[4901]: I0202 10:56:07.839675 4901 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0796571e8bb156471bd27d2be38cbccd677937199c84fe164a09e32f6b2adf50"} pod="openshift-machine-config-operator/machine-config-daemon-f29d8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 02 10:56:07 crc kubenswrapper[4901]: I0202 10:56:07.839759 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" containerID="cri-o://0796571e8bb156471bd27d2be38cbccd677937199c84fe164a09e32f6b2adf50" gracePeriod=600
Feb 02 10:56:08 crc kubenswrapper[4901]: I0202 10:56:08.858267 4901 generic.go:334] "Generic (PLEG): container finished" podID="756c113d-5d5e-424e-bdf5-494b7774def6" containerID="0796571e8bb156471bd27d2be38cbccd677937199c84fe164a09e32f6b2adf50" exitCode=0
Feb 02 10:56:08 crc kubenswrapper[4901]: I0202 10:56:08.858315 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" event={"ID":"756c113d-5d5e-424e-bdf5-494b7774def6","Type":"ContainerDied","Data":"0796571e8bb156471bd27d2be38cbccd677937199c84fe164a09e32f6b2adf50"}
Feb 02 10:56:10 crc kubenswrapper[4901]: I0202 10:56:10.012546 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-vj5hj" podUID="a32397c5-9ffc-4b59-abac-4376cfb81d4a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: i/o timeout"
Feb 02 10:56:10 crc kubenswrapper[4901]: I0202 10:56:10.013464 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-vj5hj"
Feb 02 10:56:14 crc kubenswrapper[4901]: I0202 10:56:14.309278 4901 scope.go:117] "RemoveContainer" containerID="a7ebc0526473e5d9495b2410578ff6737ce5c1b87fec7279e37b57a4495ce7a1"
Feb 02 10:56:14 crc kubenswrapper[4901]: E0202 10:56:14.313671 4901 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified"
Feb 02 10:56:14 crc kubenswrapper[4901]: E0202 10:56:14.313924 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vs7wj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-hcpj4_openstack(cccec66b-6bb1-4799-9385-73a33d1cacec): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 02 10:56:14 crc kubenswrapper[4901]: E0202 10:56:14.315188 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-hcpj4" podUID="cccec66b-6bb1-4799-9385-73a33d1cacec"
Feb 02 10:56:14 crc kubenswrapper[4901]: I0202 10:56:14.435031 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-vj5hj"
Feb 02 10:56:14 crc kubenswrapper[4901]: I0202 10:56:14.443893 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-h9fw8"
Feb 02 10:56:14 crc kubenswrapper[4901]: I0202 10:56:14.453922 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a32397c5-9ffc-4b59-abac-4376cfb81d4a-ovsdbserver-sb\") pod \"a32397c5-9ffc-4b59-abac-4376cfb81d4a\" (UID: \"a32397c5-9ffc-4b59-abac-4376cfb81d4a\") "
Feb 02 10:56:14 crc kubenswrapper[4901]: I0202 10:56:14.454088 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4426b588-ce3e-4184-bc0b-82f17522dd01-combined-ca-bundle\") pod \"4426b588-ce3e-4184-bc0b-82f17522dd01\" (UID: \"4426b588-ce3e-4184-bc0b-82f17522dd01\") "
Feb 02 10:56:14 crc kubenswrapper[4901]: I0202 10:56:14.454166 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4426b588-ce3e-4184-bc0b-82f17522dd01-config\") pod \"4426b588-ce3e-4184-bc0b-82f17522dd01\" (UID: \"4426b588-ce3e-4184-bc0b-82f17522dd01\") "
Feb 02 10:56:14 crc kubenswrapper[4901]: I0202 10:56:14.454202 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bc5rj\" (UniqueName: \"kubernetes.io/projected/a32397c5-9ffc-4b59-abac-4376cfb81d4a-kube-api-access-bc5rj\") pod \"a32397c5-9ffc-4b59-abac-4376cfb81d4a\" (UID: \"a32397c5-9ffc-4b59-abac-4376cfb81d4a\") "
Feb 02 10:56:14 crc kubenswrapper[4901]: I0202 10:56:14.454281 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a32397c5-9ffc-4b59-abac-4376cfb81d4a-dns-svc\") pod \"a32397c5-9ffc-4b59-abac-4376cfb81d4a\" (UID: \"a32397c5-9ffc-4b59-abac-4376cfb81d4a\") "
Feb 02 10:56:14 crc kubenswrapper[4901]: I0202 10:56:14.454347 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dh8jt\" (UniqueName: \"kubernetes.io/projected/4426b588-ce3e-4184-bc0b-82f17522dd01-kube-api-access-dh8jt\") pod \"4426b588-ce3e-4184-bc0b-82f17522dd01\" (UID: \"4426b588-ce3e-4184-bc0b-82f17522dd01\") "
Feb 02 10:56:14 crc kubenswrapper[4901]: I0202 10:56:14.454419 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a32397c5-9ffc-4b59-abac-4376cfb81d4a-config\") pod \"a32397c5-9ffc-4b59-abac-4376cfb81d4a\" (UID: \"a32397c5-9ffc-4b59-abac-4376cfb81d4a\") "
Feb 02 10:56:14 crc kubenswrapper[4901]: I0202 10:56:14.454447 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a32397c5-9ffc-4b59-abac-4376cfb81d4a-ovsdbserver-nb\") pod \"a32397c5-9ffc-4b59-abac-4376cfb81d4a\" (UID: \"a32397c5-9ffc-4b59-abac-4376cfb81d4a\") "
Feb 02 10:56:14 crc kubenswrapper[4901]: I0202 10:56:14.497587 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a32397c5-9ffc-4b59-abac-4376cfb81d4a-kube-api-access-bc5rj" (OuterVolumeSpecName: "kube-api-access-bc5rj") pod "a32397c5-9ffc-4b59-abac-4376cfb81d4a" (UID: "a32397c5-9ffc-4b59-abac-4376cfb81d4a"). InnerVolumeSpecName "kube-api-access-bc5rj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:56:14 crc kubenswrapper[4901]: I0202 10:56:14.498796 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4426b588-ce3e-4184-bc0b-82f17522dd01-kube-api-access-dh8jt" (OuterVolumeSpecName: "kube-api-access-dh8jt") pod "4426b588-ce3e-4184-bc0b-82f17522dd01" (UID: "4426b588-ce3e-4184-bc0b-82f17522dd01"). InnerVolumeSpecName "kube-api-access-dh8jt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:56:14 crc kubenswrapper[4901]: I0202 10:56:14.507546 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4426b588-ce3e-4184-bc0b-82f17522dd01-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4426b588-ce3e-4184-bc0b-82f17522dd01" (UID: "4426b588-ce3e-4184-bc0b-82f17522dd01"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:56:14 crc kubenswrapper[4901]: I0202 10:56:14.528042 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4426b588-ce3e-4184-bc0b-82f17522dd01-config" (OuterVolumeSpecName: "config") pod "4426b588-ce3e-4184-bc0b-82f17522dd01" (UID: "4426b588-ce3e-4184-bc0b-82f17522dd01"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:56:14 crc kubenswrapper[4901]: I0202 10:56:14.530339 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a32397c5-9ffc-4b59-abac-4376cfb81d4a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a32397c5-9ffc-4b59-abac-4376cfb81d4a" (UID: "a32397c5-9ffc-4b59-abac-4376cfb81d4a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:56:14 crc kubenswrapper[4901]: I0202 10:56:14.530879 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a32397c5-9ffc-4b59-abac-4376cfb81d4a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a32397c5-9ffc-4b59-abac-4376cfb81d4a" (UID: "a32397c5-9ffc-4b59-abac-4376cfb81d4a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:56:14 crc kubenswrapper[4901]: I0202 10:56:14.550064 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a32397c5-9ffc-4b59-abac-4376cfb81d4a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a32397c5-9ffc-4b59-abac-4376cfb81d4a" (UID: "a32397c5-9ffc-4b59-abac-4376cfb81d4a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:56:14 crc kubenswrapper[4901]: I0202 10:56:14.557452 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4426b588-ce3e-4184-bc0b-82f17522dd01-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:14 crc kubenswrapper[4901]: I0202 10:56:14.557486 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4426b588-ce3e-4184-bc0b-82f17522dd01-config\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:14 crc kubenswrapper[4901]: I0202 10:56:14.557503 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bc5rj\" (UniqueName: \"kubernetes.io/projected/a32397c5-9ffc-4b59-abac-4376cfb81d4a-kube-api-access-bc5rj\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:14 crc kubenswrapper[4901]: I0202 10:56:14.557518 4901 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a32397c5-9ffc-4b59-abac-4376cfb81d4a-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:14 crc kubenswrapper[4901]: I0202 10:56:14.557529 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dh8jt\" (UniqueName: \"kubernetes.io/projected/4426b588-ce3e-4184-bc0b-82f17522dd01-kube-api-access-dh8jt\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:14 crc kubenswrapper[4901]: I0202 10:56:14.557539 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a32397c5-9ffc-4b59-abac-4376cfb81d4a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:14 crc kubenswrapper[4901]: I0202 10:56:14.557551 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a32397c5-9ffc-4b59-abac-4376cfb81d4a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:14 crc kubenswrapper[4901]: I0202 10:56:14.570155 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a32397c5-9ffc-4b59-abac-4376cfb81d4a-config" (OuterVolumeSpecName: "config") pod "a32397c5-9ffc-4b59-abac-4376cfb81d4a" (UID: "a32397c5-9ffc-4b59-abac-4376cfb81d4a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:56:14 crc kubenswrapper[4901]: I0202 10:56:14.660752 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a32397c5-9ffc-4b59-abac-4376cfb81d4a-config\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:14 crc kubenswrapper[4901]: I0202 10:56:14.914350 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-vj5hj" event={"ID":"a32397c5-9ffc-4b59-abac-4376cfb81d4a","Type":"ContainerDied","Data":"ddcdbae9c5acbd92bd22dc005aa5f62b4b6b8a8c0cac3a7912317f50176b4579"}
Feb 02 10:56:14 crc kubenswrapper[4901]: I0202 10:56:14.914448 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-vj5hj"
Feb 02 10:56:14 crc kubenswrapper[4901]: I0202 10:56:14.928900 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-h9fw8" event={"ID":"4426b588-ce3e-4184-bc0b-82f17522dd01","Type":"ContainerDied","Data":"1dce3bd286fddefafea33910d9c7f4ebad263a73b0d998833927e241b5021775"}
Feb 02 10:56:14 crc kubenswrapper[4901]: I0202 10:56:14.928944 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1dce3bd286fddefafea33910d9c7f4ebad263a73b0d998833927e241b5021775"
Feb 02 10:56:14 crc kubenswrapper[4901]: I0202 10:56:14.929008 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-h9fw8"
Feb 02 10:56:14 crc kubenswrapper[4901]: E0202 10:56:14.943607 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified\\\"\"" pod="openstack/heat-db-sync-hcpj4" podUID="cccec66b-6bb1-4799-9385-73a33d1cacec"
Feb 02 10:56:14 crc kubenswrapper[4901]: I0202 10:56:14.964595 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-vj5hj"]
Feb 02 10:56:14 crc kubenswrapper[4901]: I0202 10:56:14.975650 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-vj5hj"]
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.014081 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-vj5hj" podUID="a32397c5-9ffc-4b59-abac-4376cfb81d4a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: i/o timeout"
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.486665 4901 scope.go:117] "RemoveContainer" containerID="a60e364b19edb2e2512f230cd9968c217ff9bed93177ad4ab522fc8e3b482580"
Feb 02 10:56:15 crc kubenswrapper[4901]: E0202 10:56:15.491994 4901 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified"
Feb 02 10:56:15 crc kubenswrapper[4901]: E0202 10:56:15.492191 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gbkdq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-2hz6j_openstack(9dc5e40d-ef90-4040-a247-114b55e0efa1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 02 10:56:15 crc kubenswrapper[4901]: E0202 10:56:15.493796 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-2hz6j" podUID="9dc5e40d-ef90-4040-a247-114b55e0efa1"
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.721014 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a32397c5-9ffc-4b59-abac-4376cfb81d4a" path="/var/lib/kubelet/pods/a32397c5-9ffc-4b59-abac-4376cfb81d4a/volumes"
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.722444 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-ccq56"]
Feb 02 10:56:15 crc kubenswrapper[4901]: E0202 10:56:15.722827 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4426b588-ce3e-4184-bc0b-82f17522dd01" containerName="neutron-db-sync"
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.722844 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="4426b588-ce3e-4184-bc0b-82f17522dd01" containerName="neutron-db-sync"
Feb 02 10:56:15 crc kubenswrapper[4901]: E0202 10:56:15.722853 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a32397c5-9ffc-4b59-abac-4376cfb81d4a" containerName="init"
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.722859 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a32397c5-9ffc-4b59-abac-4376cfb81d4a" containerName="init"
Feb 02 10:56:15 crc kubenswrapper[4901]: E0202 10:56:15.722880 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a32397c5-9ffc-4b59-abac-4376cfb81d4a" containerName="dnsmasq-dns"
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.722886 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a32397c5-9ffc-4b59-abac-4376cfb81d4a" containerName="dnsmasq-dns"
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.727305 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="4426b588-ce3e-4184-bc0b-82f17522dd01" containerName="neutron-db-sync"
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.727365 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="a32397c5-9ffc-4b59-abac-4376cfb81d4a" containerName="dnsmasq-dns"
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.732738 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-ccq56"
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.733663 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-ccq56"]
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.744832 4901 scope.go:117] "RemoveContainer" containerID="9ca6a8e73ddda95141efb8d2d25a475e85890135cb5ce58a05fe2671dda46259"
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.785187 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8829fb9-be39-448c-9f96-cfc98534248a-config\") pod \"dnsmasq-dns-6b7b667979-ccq56\" (UID: \"f8829fb9-be39-448c-9f96-cfc98534248a\") " pod="openstack/dnsmasq-dns-6b7b667979-ccq56"
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.785799 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8829fb9-be39-448c-9f96-cfc98534248a-dns-svc\") pod \"dnsmasq-dns-6b7b667979-ccq56\" (UID: \"f8829fb9-be39-448c-9f96-cfc98534248a\") " pod="openstack/dnsmasq-dns-6b7b667979-ccq56"
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.785841 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlcnx\" (UniqueName: \"kubernetes.io/projected/f8829fb9-be39-448c-9f96-cfc98534248a-kube-api-access-rlcnx\") pod \"dnsmasq-dns-6b7b667979-ccq56\" (UID: \"f8829fb9-be39-448c-9f96-cfc98534248a\") " pod="openstack/dnsmasq-dns-6b7b667979-ccq56"
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.785877 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f8829fb9-be39-448c-9f96-cfc98534248a-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-ccq56\" (UID: \"f8829fb9-be39-448c-9f96-cfc98534248a\") " pod="openstack/dnsmasq-dns-6b7b667979-ccq56"
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.785901 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f8829fb9-be39-448c-9f96-cfc98534248a-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-ccq56\" (UID: \"f8829fb9-be39-448c-9f96-cfc98534248a\") " pod="openstack/dnsmasq-dns-6b7b667979-ccq56"
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.785928 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f8829fb9-be39-448c-9f96-cfc98534248a-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-ccq56\" (UID: \"f8829fb9-be39-448c-9f96-cfc98534248a\") " pod="openstack/dnsmasq-dns-6b7b667979-ccq56"
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.796868 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7f784b5584-t7x4s"]
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.800924 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7f784b5584-t7x4s"
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.805389 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-z5cj9"
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.805542 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.805728 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs"
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.808729 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.847057 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7f784b5584-t7x4s"]
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.857517 4901 scope.go:117] "RemoveContainer" containerID="3ee71c5df77ed5308f20e56cd0da57bad1b0442e13aee27a55466b956169f8c4"
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.889767 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8829fb9-be39-448c-9f96-cfc98534248a-dns-svc\") pod \"dnsmasq-dns-6b7b667979-ccq56\" (UID: \"f8829fb9-be39-448c-9f96-cfc98534248a\") " pod="openstack/dnsmasq-dns-6b7b667979-ccq56"
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.889828 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlcnx\" (UniqueName: \"kubernetes.io/projected/f8829fb9-be39-448c-9f96-cfc98534248a-kube-api-access-rlcnx\") pod \"dnsmasq-dns-6b7b667979-ccq56\" (UID: \"f8829fb9-be39-448c-9f96-cfc98534248a\") " pod="openstack/dnsmasq-dns-6b7b667979-ccq56"
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.889859 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f8829fb9-be39-448c-9f96-cfc98534248a-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-ccq56\" (UID: \"f8829fb9-be39-448c-9f96-cfc98534248a\") " pod="openstack/dnsmasq-dns-6b7b667979-ccq56"
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.889885 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f8829fb9-be39-448c-9f96-cfc98534248a-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-ccq56\" (UID: \"f8829fb9-be39-448c-9f96-cfc98534248a\") " pod="openstack/dnsmasq-dns-6b7b667979-ccq56"
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.889904 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f8829fb9-be39-448c-9f96-cfc98534248a-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-ccq56\" (UID: \"f8829fb9-be39-448c-9f96-cfc98534248a\") " pod="openstack/dnsmasq-dns-6b7b667979-ccq56"
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.889952 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8829fb9-be39-448c-9f96-cfc98534248a-config\") pod \"dnsmasq-dns-6b7b667979-ccq56\" (UID: \"f8829fb9-be39-448c-9f96-cfc98534248a\") " pod="openstack/dnsmasq-dns-6b7b667979-ccq56"
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.891192 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8829fb9-be39-448c-9f96-cfc98534248a-config\") pod \"dnsmasq-dns-6b7b667979-ccq56\" (UID: \"f8829fb9-be39-448c-9f96-cfc98534248a\") " pod="openstack/dnsmasq-dns-6b7b667979-ccq56"
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.892048 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8829fb9-be39-448c-9f96-cfc98534248a-dns-svc\") pod \"dnsmasq-dns-6b7b667979-ccq56\" (UID: \"f8829fb9-be39-448c-9f96-cfc98534248a\") " pod="openstack/dnsmasq-dns-6b7b667979-ccq56"
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.893363 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f8829fb9-be39-448c-9f96-cfc98534248a-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-ccq56\" (UID: \"f8829fb9-be39-448c-9f96-cfc98534248a\") " pod="openstack/dnsmasq-dns-6b7b667979-ccq56"
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.894117 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f8829fb9-be39-448c-9f96-cfc98534248a-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-ccq56\" (UID: \"f8829fb9-be39-448c-9f96-cfc98534248a\") " pod="openstack/dnsmasq-dns-6b7b667979-ccq56"
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.894832 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f8829fb9-be39-448c-9f96-cfc98534248a-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-ccq56\" (UID: \"f8829fb9-be39-448c-9f96-cfc98534248a\") " pod="openstack/dnsmasq-dns-6b7b667979-ccq56"
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.927399 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlcnx\" (UniqueName: \"kubernetes.io/projected/f8829fb9-be39-448c-9f96-cfc98534248a-kube-api-access-rlcnx\") pod \"dnsmasq-dns-6b7b667979-ccq56\" (UID: \"f8829fb9-be39-448c-9f96-cfc98534248a\") " pod="openstack/dnsmasq-dns-6b7b667979-ccq56"
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.956335 4901 scope.go:117] "RemoveContainer" containerID="8e4489d032d6dbc762715b387bdf17e79591484090506cc386c55a253d4d7444"
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.979001 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-cdcb9" event={"ID":"6995b916-7b6d-4b5e-8284-8b07fc09be1c","Type":"ContainerStarted","Data":"8fd3da1c1853f0c537007d818cb99d6d002fc9920c2d2abbd4eee4162558f50b"}
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.992444 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/df6764f5-9f09-4b5a-bc5f-7d212c713ae8-config\") pod \"neutron-7f784b5584-t7x4s\" (UID: \"df6764f5-9f09-4b5a-bc5f-7d212c713ae8\") " pod="openstack/neutron-7f784b5584-t7x4s"
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.992758 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/df6764f5-9f09-4b5a-bc5f-7d212c713ae8-httpd-config\") pod \"neutron-7f784b5584-t7x4s\" (UID: \"df6764f5-9f09-4b5a-bc5f-7d212c713ae8\") " pod="openstack/neutron-7f784b5584-t7x4s"
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.992869 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsmn4\" (UniqueName: \"kubernetes.io/projected/df6764f5-9f09-4b5a-bc5f-7d212c713ae8-kube-api-access-jsmn4\") pod \"neutron-7f784b5584-t7x4s\" (UID: \"df6764f5-9f09-4b5a-bc5f-7d212c713ae8\") " pod="openstack/neutron-7f784b5584-t7x4s"
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.993679 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/df6764f5-9f09-4b5a-bc5f-7d212c713ae8-ovndb-tls-certs\") pod \"neutron-7f784b5584-t7x4s\" (UID: \"df6764f5-9f09-4b5a-bc5f-7d212c713ae8\") " pod="openstack/neutron-7f784b5584-t7x4s"
Feb 02 10:56:15 crc kubenswrapper[4901]: I0202 10:56:15.993810 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df6764f5-9f09-4b5a-bc5f-7d212c713ae8-combined-ca-bundle\") pod \"neutron-7f784b5584-t7x4s\" (UID: \"df6764f5-9f09-4b5a-bc5f-7d212c713ae8\") " pod="openstack/neutron-7f784b5584-t7x4s"
Feb 02 10:56:16 crc kubenswrapper[4901]: I0202 10:56:16.040117 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-cdcb9" podStartSLOduration=7.429221159 podStartE2EDuration="35.040071023s" podCreationTimestamp="2026-02-02 10:55:41 +0000 UTC" firstStartedPulling="2026-02-02 10:55:46.721228154 +0000 UTC m=+1033.739568250" lastFinishedPulling="2026-02-02 10:56:14.332078018 +0000 UTC m=+1061.350418114" observedRunningTime="2026-02-02 10:56:16.019519549 +0000 UTC m=+1063.037859655" watchObservedRunningTime="2026-02-02 10:56:16.040071023 +0000 UTC m=+1063.058411119"
Feb 02 10:56:16 crc kubenswrapper[4901]: I0202 10:56:16.048850 4901 scope.go:117] "RemoveContainer" containerID="cd3cb5a398323c5815b4d0172c885d14a29c70c4af4229133ef5b8f24cb36439"
Feb 02 10:56:16 crc kubenswrapper[4901]: E0202 10:56:16.049511 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-2hz6j" podUID="9dc5e40d-ef90-4040-a247-114b55e0efa1"
Feb 02 10:56:16 crc kubenswrapper[4901]: I0202 10:56:16.097920 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/df6764f5-9f09-4b5a-bc5f-7d212c713ae8-ovndb-tls-certs\") pod \"neutron-7f784b5584-t7x4s\" (UID: \"df6764f5-9f09-4b5a-bc5f-7d212c713ae8\") " pod="openstack/neutron-7f784b5584-t7x4s"
Feb 02 10:56:16 crc
kubenswrapper[4901]: I0202 10:56:16.098099 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df6764f5-9f09-4b5a-bc5f-7d212c713ae8-combined-ca-bundle\") pod \"neutron-7f784b5584-t7x4s\" (UID: \"df6764f5-9f09-4b5a-bc5f-7d212c713ae8\") " pod="openstack/neutron-7f784b5584-t7x4s" Feb 02 10:56:16 crc kubenswrapper[4901]: I0202 10:56:16.098229 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/df6764f5-9f09-4b5a-bc5f-7d212c713ae8-config\") pod \"neutron-7f784b5584-t7x4s\" (UID: \"df6764f5-9f09-4b5a-bc5f-7d212c713ae8\") " pod="openstack/neutron-7f784b5584-t7x4s" Feb 02 10:56:16 crc kubenswrapper[4901]: I0202 10:56:16.098378 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/df6764f5-9f09-4b5a-bc5f-7d212c713ae8-httpd-config\") pod \"neutron-7f784b5584-t7x4s\" (UID: \"df6764f5-9f09-4b5a-bc5f-7d212c713ae8\") " pod="openstack/neutron-7f784b5584-t7x4s" Feb 02 10:56:16 crc kubenswrapper[4901]: I0202 10:56:16.098504 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsmn4\" (UniqueName: \"kubernetes.io/projected/df6764f5-9f09-4b5a-bc5f-7d212c713ae8-kube-api-access-jsmn4\") pod \"neutron-7f784b5584-t7x4s\" (UID: \"df6764f5-9f09-4b5a-bc5f-7d212c713ae8\") " pod="openstack/neutron-7f784b5584-t7x4s" Feb 02 10:56:16 crc kubenswrapper[4901]: I0202 10:56:16.109528 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df6764f5-9f09-4b5a-bc5f-7d212c713ae8-combined-ca-bundle\") pod \"neutron-7f784b5584-t7x4s\" (UID: \"df6764f5-9f09-4b5a-bc5f-7d212c713ae8\") " pod="openstack/neutron-7f784b5584-t7x4s" Feb 02 10:56:16 crc kubenswrapper[4901]: I0202 10:56:16.110191 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/df6764f5-9f09-4b5a-bc5f-7d212c713ae8-httpd-config\") pod \"neutron-7f784b5584-t7x4s\" (UID: \"df6764f5-9f09-4b5a-bc5f-7d212c713ae8\") " pod="openstack/neutron-7f784b5584-t7x4s" Feb 02 10:56:16 crc kubenswrapper[4901]: I0202 10:56:16.110264 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/df6764f5-9f09-4b5a-bc5f-7d212c713ae8-ovndb-tls-certs\") pod \"neutron-7f784b5584-t7x4s\" (UID: \"df6764f5-9f09-4b5a-bc5f-7d212c713ae8\") " pod="openstack/neutron-7f784b5584-t7x4s" Feb 02 10:56:16 crc kubenswrapper[4901]: I0202 10:56:16.112642 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/df6764f5-9f09-4b5a-bc5f-7d212c713ae8-config\") pod \"neutron-7f784b5584-t7x4s\" (UID: \"df6764f5-9f09-4b5a-bc5f-7d212c713ae8\") " pod="openstack/neutron-7f784b5584-t7x4s" Feb 02 10:56:16 crc kubenswrapper[4901]: I0202 10:56:16.115540 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-ccq56" Feb 02 10:56:16 crc kubenswrapper[4901]: I0202 10:56:16.121374 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsmn4\" (UniqueName: \"kubernetes.io/projected/df6764f5-9f09-4b5a-bc5f-7d212c713ae8-kube-api-access-jsmn4\") pod \"neutron-7f784b5584-t7x4s\" (UID: \"df6764f5-9f09-4b5a-bc5f-7d212c713ae8\") " pod="openstack/neutron-7f784b5584-t7x4s" Feb 02 10:56:16 crc kubenswrapper[4901]: I0202 10:56:16.163269 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7f784b5584-t7x4s" Feb 02 10:56:16 crc kubenswrapper[4901]: I0202 10:56:16.275067 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8v2wt"] Feb 02 10:56:16 crc kubenswrapper[4901]: I0202 10:56:16.291763 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 10:56:16 crc kubenswrapper[4901]: I0202 10:56:16.521347 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 10:56:16 crc kubenswrapper[4901]: I0202 10:56:16.675608 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7f784b5584-t7x4s"] Feb 02 10:56:16 crc kubenswrapper[4901]: W0202 10:56:16.693916 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf6764f5_9f09_4b5a_bc5f_7d212c713ae8.slice/crio-d79d0b09b88c431e3717b77251bca08446d1af35921816de8c3521a006a2bfc9 WatchSource:0}: Error finding container d79d0b09b88c431e3717b77251bca08446d1af35921816de8c3521a006a2bfc9: Status 404 returned error can't find the container with id d79d0b09b88c431e3717b77251bca08446d1af35921816de8c3521a006a2bfc9 Feb 02 10:56:16 crc kubenswrapper[4901]: I0202 10:56:16.929616 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-ccq56"] Feb 02 10:56:17 crc kubenswrapper[4901]: I0202 10:56:17.042886 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74392bcb-ba65-45cd-8f9b-32894528aca3","Type":"ContainerStarted","Data":"21d420e67e03baa42b605453c70ede0e38fa51c6be0019b4aa5c08b0672700e1"} Feb 02 10:56:17 crc kubenswrapper[4901]: I0202 10:56:17.046422 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xh5rm" event={"ID":"302af226-4326-48ef-bef0-02fab3943dbe","Type":"ContainerStarted","Data":"0922b2b9b0497e40ada1bc3676e7fe6f72f77741d30f5607c490144792c6f340"} Feb 02 10:56:17 crc kubenswrapper[4901]: I0202 10:56:17.072162 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-xh5rm" podStartSLOduration=8.391340134 podStartE2EDuration="36.072140836s" podCreationTimestamp="2026-02-02 10:55:41 +0000 UTC" firstStartedPulling="2026-02-02 10:55:46.647519882 +0000 UTC m=+1033.665859978" lastFinishedPulling="2026-02-02 10:56:14.328320584 +0000 UTC m=+1061.346660680" observedRunningTime="2026-02-02 10:56:17.062398352 +0000 UTC m=+1064.080738458" watchObservedRunningTime="2026-02-02 10:56:17.072140836 +0000 UTC m=+1064.090480932" Feb 02 10:56:17 crc kubenswrapper[4901]: I0202 10:56:17.080231 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8v2wt" event={"ID":"093b2698-02da-479f-8d78-59e99a88d7c9","Type":"ContainerStarted","Data":"ccbb00648e6b95f35fcdd653170188aa391581836312ac690b76d08b63da1456"} Feb 02 10:56:17 crc 
kubenswrapper[4901]: I0202 10:56:17.080292 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8v2wt" event={"ID":"093b2698-02da-479f-8d78-59e99a88d7c9","Type":"ContainerStarted","Data":"a4064f94028cf5d10ff075aa3c3ef1b6e79f4eed4d4c154158dfc103cd23f95e"} Feb 02 10:56:17 crc kubenswrapper[4901]: I0202 10:56:17.089321 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" event={"ID":"756c113d-5d5e-424e-bdf5-494b7774def6","Type":"ContainerStarted","Data":"f91b0b29e1b9fe682b552667293a0ed2a00f9d11128e3bc32e71c40f28a9f231"} Feb 02 10:56:17 crc kubenswrapper[4901]: I0202 10:56:17.112141 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-8v2wt" podStartSLOduration=25.112124825 podStartE2EDuration="25.112124825s" podCreationTimestamp="2026-02-02 10:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:56:17.108691669 +0000 UTC m=+1064.127031765" watchObservedRunningTime="2026-02-02 10:56:17.112124825 +0000 UTC m=+1064.130464921" Feb 02 10:56:17 crc kubenswrapper[4901]: I0202 10:56:17.116529 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bcb9c059-9f39-4478-9b04-9417b05f4bef","Type":"ContainerStarted","Data":"c2a2b6b73485e9f6de28645972572f5e5d598453016450f03f4fad0a8a13eee5"} Feb 02 10:56:17 crc kubenswrapper[4901]: I0202 10:56:17.124776 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e2c6c553-9669-4fc0-a72b-9a528764e7a8","Type":"ContainerStarted","Data":"2f04a886f4dddb757f789beba032ee796d8413044da9768068034f3e83f7a2bb"} Feb 02 10:56:17 crc kubenswrapper[4901]: I0202 10:56:17.138411 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f784b5584-t7x4s" event={"ID":"df6764f5-9f09-4b5a-bc5f-7d212c713ae8","Type":"ContainerStarted","Data":"d79d0b09b88c431e3717b77251bca08446d1af35921816de8c3521a006a2bfc9"} Feb 02 10:56:17 crc kubenswrapper[4901]: I0202 10:56:17.142085 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-ccq56" event={"ID":"f8829fb9-be39-448c-9f96-cfc98534248a","Type":"ContainerStarted","Data":"a4a2c6c99afc5c27df2d2575151d3e6ab9f6fe620c539f8931f69c934dd97e63"} Feb 02 10:56:18 crc kubenswrapper[4901]: I0202 10:56:18.162832 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e2c6c553-9669-4fc0-a72b-9a528764e7a8","Type":"ContainerStarted","Data":"e776cbe14f8d58e99e75781ca4a73a8e901109010ae260a63f523ec58fe89672"} Feb 02 10:56:18 crc kubenswrapper[4901]: I0202 10:56:18.166058 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f784b5584-t7x4s" event={"ID":"df6764f5-9f09-4b5a-bc5f-7d212c713ae8","Type":"ContainerStarted","Data":"2c8f2fdc80d2f7dcfdbc3cb555e2560c637c39277353ad397d669d7633a69a74"} Feb 02 10:56:18 crc kubenswrapper[4901]: I0202 10:56:18.166091 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f784b5584-t7x4s" event={"ID":"df6764f5-9f09-4b5a-bc5f-7d212c713ae8","Type":"ContainerStarted","Data":"299510ae2a67239f6106a2c52c9143358ea95db027b4490f1c10b74d9ffaa9c6"} Feb 02 10:56:18 crc kubenswrapper[4901]: I0202 10:56:18.166120 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/neutron-7f784b5584-t7x4s" Feb 02 10:56:18 crc kubenswrapper[4901]: I0202 10:56:18.169035 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-ccq56" event={"ID":"f8829fb9-be39-448c-9f96-cfc98534248a","Type":"ContainerDied","Data":"16ff4ede99b1b57c3e05140e58964789873adf3691a6320ab695e43ac3b48ae1"} Feb 02 10:56:18 crc kubenswrapper[4901]: I0202 10:56:18.169097 4901 generic.go:334] "Generic (PLEG): container finished" podID="f8829fb9-be39-448c-9f96-cfc98534248a" containerID="16ff4ede99b1b57c3e05140e58964789873adf3691a6320ab695e43ac3b48ae1" exitCode=0 Feb 02 10:56:18 crc kubenswrapper[4901]: I0202 10:56:18.172074 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bcb9c059-9f39-4478-9b04-9417b05f4bef","Type":"ContainerStarted","Data":"f4e44261b0cf9b669b7896d5e2708bb47ca0f96ebd2f61c10f9eebe8dc79943a"} Feb 02 10:56:18 crc kubenswrapper[4901]: I0202 10:56:18.222688 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7f784b5584-t7x4s" podStartSLOduration=3.222668439 podStartE2EDuration="3.222668439s" podCreationTimestamp="2026-02-02 10:56:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:56:18.199056339 +0000 UTC m=+1065.217396435" watchObservedRunningTime="2026-02-02 10:56:18.222668439 +0000 UTC m=+1065.241008535" Feb 02 10:56:18 crc kubenswrapper[4901]: I0202 10:56:18.389115 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6f4d4785b9-9m2tl"] Feb 02 10:56:18 crc kubenswrapper[4901]: I0202 10:56:18.390632 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6f4d4785b9-9m2tl" Feb 02 10:56:18 crc kubenswrapper[4901]: I0202 10:56:18.394473 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 02 10:56:18 crc kubenswrapper[4901]: I0202 10:56:18.394758 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 02 10:56:18 crc kubenswrapper[4901]: I0202 10:56:18.406722 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6f4d4785b9-9m2tl"] Feb 02 10:56:18 crc kubenswrapper[4901]: I0202 10:56:18.429957 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gszjm\" (UniqueName: \"kubernetes.io/projected/b97ea9e5-3049-4ba7-9cc1-2165d15a3746-kube-api-access-gszjm\") pod \"neutron-6f4d4785b9-9m2tl\" (UID: \"b97ea9e5-3049-4ba7-9cc1-2165d15a3746\") " pod="openstack/neutron-6f4d4785b9-9m2tl" Feb 02 10:56:18 crc kubenswrapper[4901]: I0202 10:56:18.430019 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b97ea9e5-3049-4ba7-9cc1-2165d15a3746-ovndb-tls-certs\") pod \"neutron-6f4d4785b9-9m2tl\" (UID: \"b97ea9e5-3049-4ba7-9cc1-2165d15a3746\") " pod="openstack/neutron-6f4d4785b9-9m2tl" Feb 02 10:56:18 crc kubenswrapper[4901]: I0202 10:56:18.430038 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b97ea9e5-3049-4ba7-9cc1-2165d15a3746-combined-ca-bundle\") pod \"neutron-6f4d4785b9-9m2tl\" (UID: \"b97ea9e5-3049-4ba7-9cc1-2165d15a3746\") " pod="openstack/neutron-6f4d4785b9-9m2tl" Feb 02 
10:56:18 crc kubenswrapper[4901]: I0202 10:56:18.430067 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b97ea9e5-3049-4ba7-9cc1-2165d15a3746-config\") pod \"neutron-6f4d4785b9-9m2tl\" (UID: \"b97ea9e5-3049-4ba7-9cc1-2165d15a3746\") " pod="openstack/neutron-6f4d4785b9-9m2tl" Feb 02 10:56:18 crc kubenswrapper[4901]: I0202 10:56:18.430114 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b97ea9e5-3049-4ba7-9cc1-2165d15a3746-httpd-config\") pod \"neutron-6f4d4785b9-9m2tl\" (UID: \"b97ea9e5-3049-4ba7-9cc1-2165d15a3746\") " pod="openstack/neutron-6f4d4785b9-9m2tl" Feb 02 10:56:18 crc kubenswrapper[4901]: I0202 10:56:18.430161 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b97ea9e5-3049-4ba7-9cc1-2165d15a3746-internal-tls-certs\") pod \"neutron-6f4d4785b9-9m2tl\" (UID: \"b97ea9e5-3049-4ba7-9cc1-2165d15a3746\") " pod="openstack/neutron-6f4d4785b9-9m2tl" Feb 02 10:56:18 crc kubenswrapper[4901]: I0202 10:56:18.430190 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b97ea9e5-3049-4ba7-9cc1-2165d15a3746-public-tls-certs\") pod \"neutron-6f4d4785b9-9m2tl\" (UID: \"b97ea9e5-3049-4ba7-9cc1-2165d15a3746\") " pod="openstack/neutron-6f4d4785b9-9m2tl" Feb 02 10:56:18 crc kubenswrapper[4901]: I0202 10:56:18.533111 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gszjm\" (UniqueName: \"kubernetes.io/projected/b97ea9e5-3049-4ba7-9cc1-2165d15a3746-kube-api-access-gszjm\") pod \"neutron-6f4d4785b9-9m2tl\" (UID: \"b97ea9e5-3049-4ba7-9cc1-2165d15a3746\") " pod="openstack/neutron-6f4d4785b9-9m2tl" Feb 02 10:56:18 crc kubenswrapper[4901]: I0202 10:56:18.535264 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b97ea9e5-3049-4ba7-9cc1-2165d15a3746-ovndb-tls-certs\") pod \"neutron-6f4d4785b9-9m2tl\" (UID: \"b97ea9e5-3049-4ba7-9cc1-2165d15a3746\") " pod="openstack/neutron-6f4d4785b9-9m2tl" Feb 02 10:56:18 crc kubenswrapper[4901]: I0202 10:56:18.535329 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b97ea9e5-3049-4ba7-9cc1-2165d15a3746-combined-ca-bundle\") pod \"neutron-6f4d4785b9-9m2tl\" (UID: \"b97ea9e5-3049-4ba7-9cc1-2165d15a3746\") " pod="openstack/neutron-6f4d4785b9-9m2tl" Feb 02 10:56:18 crc kubenswrapper[4901]: I0202 10:56:18.535404 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b97ea9e5-3049-4ba7-9cc1-2165d15a3746-config\") pod \"neutron-6f4d4785b9-9m2tl\" (UID: \"b97ea9e5-3049-4ba7-9cc1-2165d15a3746\") " pod="openstack/neutron-6f4d4785b9-9m2tl" Feb 02 10:56:18 crc kubenswrapper[4901]: I0202 10:56:18.535549 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b97ea9e5-3049-4ba7-9cc1-2165d15a3746-httpd-config\") pod \"neutron-6f4d4785b9-9m2tl\" (UID: \"b97ea9e5-3049-4ba7-9cc1-2165d15a3746\") " pod="openstack/neutron-6f4d4785b9-9m2tl" Feb 02 10:56:18 crc kubenswrapper[4901]: I0202 10:56:18.535651 4901 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b97ea9e5-3049-4ba7-9cc1-2165d15a3746-internal-tls-certs\") pod \"neutron-6f4d4785b9-9m2tl\" (UID: \"b97ea9e5-3049-4ba7-9cc1-2165d15a3746\") " pod="openstack/neutron-6f4d4785b9-9m2tl" Feb 02 10:56:18 crc kubenswrapper[4901]: I0202 10:56:18.535672 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b97ea9e5-3049-4ba7-9cc1-2165d15a3746-public-tls-certs\") pod \"neutron-6f4d4785b9-9m2tl\" (UID: \"b97ea9e5-3049-4ba7-9cc1-2165d15a3746\") " pod="openstack/neutron-6f4d4785b9-9m2tl" Feb 02 10:56:18 crc kubenswrapper[4901]: I0202 10:56:18.542731 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b97ea9e5-3049-4ba7-9cc1-2165d15a3746-combined-ca-bundle\") pod \"neutron-6f4d4785b9-9m2tl\" (UID: \"b97ea9e5-3049-4ba7-9cc1-2165d15a3746\") " pod="openstack/neutron-6f4d4785b9-9m2tl" Feb 02 10:56:18 crc kubenswrapper[4901]: I0202 10:56:18.543351 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b97ea9e5-3049-4ba7-9cc1-2165d15a3746-internal-tls-certs\") pod \"neutron-6f4d4785b9-9m2tl\" (UID: \"b97ea9e5-3049-4ba7-9cc1-2165d15a3746\") " pod="openstack/neutron-6f4d4785b9-9m2tl" Feb 02 10:56:18 crc kubenswrapper[4901]: I0202 10:56:18.543812 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b97ea9e5-3049-4ba7-9cc1-2165d15a3746-config\") pod \"neutron-6f4d4785b9-9m2tl\" (UID: \"b97ea9e5-3049-4ba7-9cc1-2165d15a3746\") " pod="openstack/neutron-6f4d4785b9-9m2tl" Feb 02 10:56:18 crc kubenswrapper[4901]: I0202 10:56:18.550083 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b97ea9e5-3049-4ba7-9cc1-2165d15a3746-public-tls-certs\") pod \"neutron-6f4d4785b9-9m2tl\" (UID: \"b97ea9e5-3049-4ba7-9cc1-2165d15a3746\") " pod="openstack/neutron-6f4d4785b9-9m2tl" Feb 02 10:56:18 crc kubenswrapper[4901]: I0202 10:56:18.550631 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gszjm\" (UniqueName: \"kubernetes.io/projected/b97ea9e5-3049-4ba7-9cc1-2165d15a3746-kube-api-access-gszjm\") pod \"neutron-6f4d4785b9-9m2tl\" (UID: \"b97ea9e5-3049-4ba7-9cc1-2165d15a3746\") " pod="openstack/neutron-6f4d4785b9-9m2tl" Feb 02 10:56:18 crc kubenswrapper[4901]: I0202 10:56:18.551077 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b97ea9e5-3049-4ba7-9cc1-2165d15a3746-httpd-config\") pod \"neutron-6f4d4785b9-9m2tl\" (UID: \"b97ea9e5-3049-4ba7-9cc1-2165d15a3746\") " pod="openstack/neutron-6f4d4785b9-9m2tl" Feb 02 10:56:18 crc kubenswrapper[4901]: I0202 10:56:18.553148 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b97ea9e5-3049-4ba7-9cc1-2165d15a3746-ovndb-tls-certs\") pod \"neutron-6f4d4785b9-9m2tl\" (UID: \"b97ea9e5-3049-4ba7-9cc1-2165d15a3746\") " pod="openstack/neutron-6f4d4785b9-9m2tl" Feb 02 10:56:18 crc kubenswrapper[4901]: I0202 10:56:18.745858 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6f4d4785b9-9m2tl" Feb 02 10:56:19 crc kubenswrapper[4901]: I0202 10:56:19.191695 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-ccq56" event={"ID":"f8829fb9-be39-448c-9f96-cfc98534248a","Type":"ContainerStarted","Data":"1c8fd39bee50f2445e992bbd4a36665ce4181637e5c8385b6dea1225f6abc6a7"} Feb 02 10:56:19 crc kubenswrapper[4901]: I0202 10:56:19.205192 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bcb9c059-9f39-4478-9b04-9417b05f4bef","Type":"ContainerStarted","Data":"033e9456d7db7660d77f59316f11777e03f34259ff92a9d12a1750b1d52321c7"} Feb 02 10:56:19 crc kubenswrapper[4901]: I0202 10:56:19.211890 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6f4d4785b9-9m2tl"] Feb 02 10:56:19 crc kubenswrapper[4901]: W0202 10:56:19.231974 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb97ea9e5_3049_4ba7_9cc1_2165d15a3746.slice/crio-81359a15d41ddeea8c2510cfc5213576519ac64ba619dd39aa19bc32065ce07e WatchSource:0}: Error finding container 81359a15d41ddeea8c2510cfc5213576519ac64ba619dd39aa19bc32065ce07e: Status 404 returned error can't find the container with id 81359a15d41ddeea8c2510cfc5213576519ac64ba619dd39aa19bc32065ce07e Feb 02 10:56:19 crc kubenswrapper[4901]: I0202 10:56:19.232799 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=27.232776183 podStartE2EDuration="27.232776183s" podCreationTimestamp="2026-02-02 10:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:56:19.22627609 +0000 UTC m=+1066.244616186" watchObservedRunningTime="2026-02-02 10:56:19.232776183 +0000 UTC m=+1066.251116269" Feb 02 10:56:19 crc kubenswrapper[4901]: I0202 10:56:19.243895 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74392bcb-ba65-45cd-8f9b-32894528aca3","Type":"ContainerStarted","Data":"bbfe6147231cfcbd29282e9a4f6ed42803490aa7ffeb09eb13e27fe0e10eb483"} Feb 02 10:56:19 crc kubenswrapper[4901]: E0202 10:56:19.474752 4901 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod302af226_4326_48ef_bef0_02fab3943dbe.slice/crio-conmon-0922b2b9b0497e40ada1bc3676e7fe6f72f77741d30f5607c490144792c6f340.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod302af226_4326_48ef_bef0_02fab3943dbe.slice/crio-0922b2b9b0497e40ada1bc3676e7fe6f72f77741d30f5607c490144792c6f340.scope\": RecentStats: unable to find data in memory cache]" Feb 02 10:56:20 crc kubenswrapper[4901]: I0202 10:56:20.255015 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e2c6c553-9669-4fc0-a72b-9a528764e7a8","Type":"ContainerStarted","Data":"f43ac399bb39abd5da0adec25c3d3c907dd420250f8571799f416809bbc2ad9d"} Feb 02 10:56:20 crc kubenswrapper[4901]: I0202 10:56:20.257020 4901 generic.go:334] "Generic (PLEG): container finished" podID="302af226-4326-48ef-bef0-02fab3943dbe" containerID="0922b2b9b0497e40ada1bc3676e7fe6f72f77741d30f5607c490144792c6f340" exitCode=0 Feb 02 10:56:20 crc kubenswrapper[4901]: I0202 
10:56:20.257068 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xh5rm" event={"ID":"302af226-4326-48ef-bef0-02fab3943dbe","Type":"ContainerDied","Data":"0922b2b9b0497e40ada1bc3676e7fe6f72f77741d30f5607c490144792c6f340"} Feb 02 10:56:20 crc kubenswrapper[4901]: I0202 10:56:20.260020 4901 generic.go:334] "Generic (PLEG): container finished" podID="6995b916-7b6d-4b5e-8284-8b07fc09be1c" containerID="8fd3da1c1853f0c537007d818cb99d6d002fc9920c2d2abbd4eee4162558f50b" exitCode=0 Feb 02 10:56:20 crc kubenswrapper[4901]: I0202 10:56:20.260087 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-cdcb9" event={"ID":"6995b916-7b6d-4b5e-8284-8b07fc09be1c","Type":"ContainerDied","Data":"8fd3da1c1853f0c537007d818cb99d6d002fc9920c2d2abbd4eee4162558f50b"} Feb 02 10:56:20 crc kubenswrapper[4901]: I0202 10:56:20.262930 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f4d4785b9-9m2tl" event={"ID":"b97ea9e5-3049-4ba7-9cc1-2165d15a3746","Type":"ContainerStarted","Data":"a49c3f0e054068f2a9fb809af959494d324bc38d409d0642f47ec06aafdac2ad"} Feb 02 10:56:20 crc kubenswrapper[4901]: I0202 10:56:20.262963 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f4d4785b9-9m2tl" event={"ID":"b97ea9e5-3049-4ba7-9cc1-2165d15a3746","Type":"ContainerStarted","Data":"c370f81bb014213fa8f0ca4747e7a27481b3e911d5436628a3ac6d65de5cab15"} Feb 02 10:56:20 crc kubenswrapper[4901]: I0202 10:56:20.262975 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f4d4785b9-9m2tl" event={"ID":"b97ea9e5-3049-4ba7-9cc1-2165d15a3746","Type":"ContainerStarted","Data":"81359a15d41ddeea8c2510cfc5213576519ac64ba619dd39aa19bc32065ce07e"} Feb 02 10:56:20 crc kubenswrapper[4901]: I0202 10:56:20.263402 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7b667979-ccq56" Feb 02 10:56:20 crc kubenswrapper[4901]: I0202 10:56:20.263523 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6f4d4785b9-9m2tl" Feb 02 10:56:20 crc kubenswrapper[4901]: I0202 10:56:20.285504 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=16.285480092 podStartE2EDuration="16.285480092s" podCreationTimestamp="2026-02-02 10:56:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:56:20.281203545 +0000 UTC m=+1067.299543641" watchObservedRunningTime="2026-02-02 10:56:20.285480092 +0000 UTC m=+1067.303820188" Feb 02 10:56:20 crc kubenswrapper[4901]: I0202 10:56:20.343696 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7b667979-ccq56" podStartSLOduration=5.343641975 podStartE2EDuration="5.343641975s" podCreationTimestamp="2026-02-02 10:56:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:56:20.332396624 +0000 UTC m=+1067.350736720" watchObservedRunningTime="2026-02-02 10:56:20.343641975 +0000 UTC m=+1067.361982081" Feb 02 10:56:20 crc kubenswrapper[4901]: I0202 10:56:20.376424 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6f4d4785b9-9m2tl" podStartSLOduration=2.376394574 podStartE2EDuration="2.376394574s" podCreationTimestamp="2026-02-02 10:56:18 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:56:20.374060395 +0000 UTC m=+1067.392400491" watchObservedRunningTime="2026-02-02 10:56:20.376394574 +0000 UTC m=+1067.394734670" Feb 02 10:56:22 crc kubenswrapper[4901]: I0202 10:56:22.284364 4901 generic.go:334] "Generic (PLEG): container finished" podID="093b2698-02da-479f-8d78-59e99a88d7c9" containerID="ccbb00648e6b95f35fcdd653170188aa391581836312ac690b76d08b63da1456" exitCode=0 Feb 02 10:56:22 crc kubenswrapper[4901]: I0202 10:56:22.284488 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8v2wt" event={"ID":"093b2698-02da-479f-8d78-59e99a88d7c9","Type":"ContainerDied","Data":"ccbb00648e6b95f35fcdd653170188aa391581836312ac690b76d08b63da1456"} Feb 02 10:56:23 crc kubenswrapper[4901]: I0202 10:56:23.155431 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 02 10:56:23 crc kubenswrapper[4901]: I0202 10:56:23.155800 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 02 10:56:23 crc kubenswrapper[4901]: I0202 10:56:23.155813 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 02 10:56:23 crc kubenswrapper[4901]: I0202 10:56:23.155822 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 02 10:56:23 crc kubenswrapper[4901]: I0202 10:56:23.215333 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 02 10:56:23 crc kubenswrapper[4901]: I0202 10:56:23.241158 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.312865 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xh5rm" event={"ID":"302af226-4326-48ef-bef0-02fab3943dbe","Type":"ContainerDied","Data":"f577bfff49f2f8cf7a28f7ec9d73d9e51ae7fd231544838fd04011c218c57981"} Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.312922 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f577bfff49f2f8cf7a28f7ec9d73d9e51ae7fd231544838fd04011c218c57981" Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.315252 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-cdcb9" event={"ID":"6995b916-7b6d-4b5e-8284-8b07fc09be1c","Type":"ContainerDied","Data":"b82747fa4fbc1398cfaa348db212aee1128748fed7516b400411083bac55eeaf"} Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.315286 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b82747fa4fbc1398cfaa348db212aee1128748fed7516b400411083bac55eeaf" Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.333848 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8v2wt" event={"ID":"093b2698-02da-479f-8d78-59e99a88d7c9","Type":"ContainerDied","Data":"a4064f94028cf5d10ff075aa3c3ef1b6e79f4eed4d4c154158dfc103cd23f95e"} Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.333924 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4064f94028cf5d10ff075aa3c3ef1b6e79f4eed4d4c154158dfc103cd23f95e" Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 
10:56:24.427429 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-xh5rm" Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.434657 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-cdcb9" Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.446540 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8v2wt" Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.587232 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/093b2698-02da-479f-8d78-59e99a88d7c9-scripts\") pod \"093b2698-02da-479f-8d78-59e99a88d7c9\" (UID: \"093b2698-02da-479f-8d78-59e99a88d7c9\") " Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.587599 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27j66\" (UniqueName: \"kubernetes.io/projected/6995b916-7b6d-4b5e-8284-8b07fc09be1c-kube-api-access-27j66\") pod \"6995b916-7b6d-4b5e-8284-8b07fc09be1c\" (UID: \"6995b916-7b6d-4b5e-8284-8b07fc09be1c\") " Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.587740 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/302af226-4326-48ef-bef0-02fab3943dbe-scripts\") pod \"302af226-4326-48ef-bef0-02fab3943dbe\" (UID: \"302af226-4326-48ef-bef0-02fab3943dbe\") " Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.587877 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/093b2698-02da-479f-8d78-59e99a88d7c9-config-data\") pod \"093b2698-02da-479f-8d78-59e99a88d7c9\" (UID: \"093b2698-02da-479f-8d78-59e99a88d7c9\") " Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.587989 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/302af226-4326-48ef-bef0-02fab3943dbe-config-data\") pod \"302af226-4326-48ef-bef0-02fab3943dbe\" (UID: \"302af226-4326-48ef-bef0-02fab3943dbe\") " Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.588132 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/302af226-4326-48ef-bef0-02fab3943dbe-combined-ca-bundle\") pod \"302af226-4326-48ef-bef0-02fab3943dbe\" (UID: \"302af226-4326-48ef-bef0-02fab3943dbe\") " Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.588249 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/093b2698-02da-479f-8d78-59e99a88d7c9-credential-keys\") pod \"093b2698-02da-479f-8d78-59e99a88d7c9\" (UID: \"093b2698-02da-479f-8d78-59e99a88d7c9\") " Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.588367 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/302af226-4326-48ef-bef0-02fab3943dbe-logs\") pod \"302af226-4326-48ef-bef0-02fab3943dbe\" (UID: \"302af226-4326-48ef-bef0-02fab3943dbe\") " Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.588526 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxcvr\" (UniqueName: 
\"kubernetes.io/projected/302af226-4326-48ef-bef0-02fab3943dbe-kube-api-access-dxcvr\") pod \"302af226-4326-48ef-bef0-02fab3943dbe\" (UID: \"302af226-4326-48ef-bef0-02fab3943dbe\") " Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.588691 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/093b2698-02da-479f-8d78-59e99a88d7c9-fernet-keys\") pod \"093b2698-02da-479f-8d78-59e99a88d7c9\" (UID: \"093b2698-02da-479f-8d78-59e99a88d7c9\") " Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.588815 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6995b916-7b6d-4b5e-8284-8b07fc09be1c-db-sync-config-data\") pod \"6995b916-7b6d-4b5e-8284-8b07fc09be1c\" (UID: \"6995b916-7b6d-4b5e-8284-8b07fc09be1c\") " Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.589067 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss646\" (UniqueName: \"kubernetes.io/projected/093b2698-02da-479f-8d78-59e99a88d7c9-kube-api-access-ss646\") pod \"093b2698-02da-479f-8d78-59e99a88d7c9\" (UID: \"093b2698-02da-479f-8d78-59e99a88d7c9\") " Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.589206 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6995b916-7b6d-4b5e-8284-8b07fc09be1c-combined-ca-bundle\") pod \"6995b916-7b6d-4b5e-8284-8b07fc09be1c\" (UID: \"6995b916-7b6d-4b5e-8284-8b07fc09be1c\") " Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.589317 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/093b2698-02da-479f-8d78-59e99a88d7c9-combined-ca-bundle\") pod \"093b2698-02da-479f-8d78-59e99a88d7c9\" (UID: \"093b2698-02da-479f-8d78-59e99a88d7c9\") " Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.591010 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/302af226-4326-48ef-bef0-02fab3943dbe-logs" (OuterVolumeSpecName: "logs") pod "302af226-4326-48ef-bef0-02fab3943dbe" (UID: "302af226-4326-48ef-bef0-02fab3943dbe"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.597238 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/302af226-4326-48ef-bef0-02fab3943dbe-scripts" (OuterVolumeSpecName: "scripts") pod "302af226-4326-48ef-bef0-02fab3943dbe" (UID: "302af226-4326-48ef-bef0-02fab3943dbe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.597297 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6995b916-7b6d-4b5e-8284-8b07fc09be1c-kube-api-access-27j66" (OuterVolumeSpecName: "kube-api-access-27j66") pod "6995b916-7b6d-4b5e-8284-8b07fc09be1c" (UID: "6995b916-7b6d-4b5e-8284-8b07fc09be1c"). InnerVolumeSpecName "kube-api-access-27j66". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.599260 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/093b2698-02da-479f-8d78-59e99a88d7c9-scripts" (OuterVolumeSpecName: "scripts") pod "093b2698-02da-479f-8d78-59e99a88d7c9" (UID: "093b2698-02da-479f-8d78-59e99a88d7c9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.599396 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/093b2698-02da-479f-8d78-59e99a88d7c9-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "093b2698-02da-479f-8d78-59e99a88d7c9" (UID: "093b2698-02da-479f-8d78-59e99a88d7c9"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.602749 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/093b2698-02da-479f-8d78-59e99a88d7c9-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "093b2698-02da-479f-8d78-59e99a88d7c9" (UID: "093b2698-02da-479f-8d78-59e99a88d7c9"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.603172 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6995b916-7b6d-4b5e-8284-8b07fc09be1c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6995b916-7b6d-4b5e-8284-8b07fc09be1c" (UID: "6995b916-7b6d-4b5e-8284-8b07fc09be1c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.609018 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/093b2698-02da-479f-8d78-59e99a88d7c9-kube-api-access-ss646" (OuterVolumeSpecName: "kube-api-access-ss646") pod "093b2698-02da-479f-8d78-59e99a88d7c9" (UID: "093b2698-02da-479f-8d78-59e99a88d7c9"). InnerVolumeSpecName "kube-api-access-ss646". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.613719 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/302af226-4326-48ef-bef0-02fab3943dbe-kube-api-access-dxcvr" (OuterVolumeSpecName: "kube-api-access-dxcvr") pod "302af226-4326-48ef-bef0-02fab3943dbe" (UID: "302af226-4326-48ef-bef0-02fab3943dbe"). InnerVolumeSpecName "kube-api-access-dxcvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.653512 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/302af226-4326-48ef-bef0-02fab3943dbe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "302af226-4326-48ef-bef0-02fab3943dbe" (UID: "302af226-4326-48ef-bef0-02fab3943dbe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.662512 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/302af226-4326-48ef-bef0-02fab3943dbe-config-data" (OuterVolumeSpecName: "config-data") pod "302af226-4326-48ef-bef0-02fab3943dbe" (UID: "302af226-4326-48ef-bef0-02fab3943dbe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.670699 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/093b2698-02da-479f-8d78-59e99a88d7c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "093b2698-02da-479f-8d78-59e99a88d7c9" (UID: "093b2698-02da-479f-8d78-59e99a88d7c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.671462 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/093b2698-02da-479f-8d78-59e99a88d7c9-config-data" (OuterVolumeSpecName: "config-data") pod "093b2698-02da-479f-8d78-59e99a88d7c9" (UID: "093b2698-02da-479f-8d78-59e99a88d7c9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.684980 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6995b916-7b6d-4b5e-8284-8b07fc09be1c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6995b916-7b6d-4b5e-8284-8b07fc09be1c" (UID: "6995b916-7b6d-4b5e-8284-8b07fc09be1c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.692685 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ss646\" (UniqueName: \"kubernetes.io/projected/093b2698-02da-479f-8d78-59e99a88d7c9-kube-api-access-ss646\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.692741 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6995b916-7b6d-4b5e-8284-8b07fc09be1c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.692761 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/093b2698-02da-479f-8d78-59e99a88d7c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.692779 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/093b2698-02da-479f-8d78-59e99a88d7c9-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.692796 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27j66\" (UniqueName: \"kubernetes.io/projected/6995b916-7b6d-4b5e-8284-8b07fc09be1c-kube-api-access-27j66\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.692808 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/302af226-4326-48ef-bef0-02fab3943dbe-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.692820 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/093b2698-02da-479f-8d78-59e99a88d7c9-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.692833 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/302af226-4326-48ef-bef0-02fab3943dbe-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 
10:56:24.692845 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/302af226-4326-48ef-bef0-02fab3943dbe-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.692856 4901 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/093b2698-02da-479f-8d78-59e99a88d7c9-credential-keys\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.692870 4901 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/302af226-4326-48ef-bef0-02fab3943dbe-logs\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.692884 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxcvr\" (UniqueName: \"kubernetes.io/projected/302af226-4326-48ef-bef0-02fab3943dbe-kube-api-access-dxcvr\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.692896 4901 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/093b2698-02da-479f-8d78-59e99a88d7c9-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:24 crc kubenswrapper[4901]: I0202 10:56:24.692908 4901 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6995b916-7b6d-4b5e-8284-8b07fc09be1c-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.268300 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.269230 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.327230 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.342112 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.349204 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-cdcb9"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.349270 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-xh5rm"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.349270 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8v2wt"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.349648 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74392bcb-ba65-45cd-8f9b-32894528aca3","Type":"ContainerStarted","Data":"cf3026123265208ba8e79765049e8a67efd3f711fdfd72fd0114774ac279846a"}
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.350678 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.350720 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.380903 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.381035 4901 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.530943 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.611589 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5fd59b56d-fwr5x"]
Feb 02 10:56:25 crc kubenswrapper[4901]: E0202 10:56:25.612027 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="093b2698-02da-479f-8d78-59e99a88d7c9" containerName="keystone-bootstrap"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.612046 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="093b2698-02da-479f-8d78-59e99a88d7c9" containerName="keystone-bootstrap"
Feb 02 10:56:25 crc kubenswrapper[4901]: E0202 10:56:25.612075 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="302af226-4326-48ef-bef0-02fab3943dbe" containerName="placement-db-sync"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.612083 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="302af226-4326-48ef-bef0-02fab3943dbe" containerName="placement-db-sync"
Feb 02 10:56:25 crc kubenswrapper[4901]: E0202 10:56:25.612103 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6995b916-7b6d-4b5e-8284-8b07fc09be1c" containerName="barbican-db-sync"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.612111 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="6995b916-7b6d-4b5e-8284-8b07fc09be1c" containerName="barbican-db-sync"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.612293 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="6995b916-7b6d-4b5e-8284-8b07fc09be1c" containerName="barbican-db-sync"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.612313 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="093b2698-02da-479f-8d78-59e99a88d7c9" containerName="keystone-bootstrap"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.612324 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="302af226-4326-48ef-bef0-02fab3943dbe" containerName="placement-db-sync"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.613239 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5fd59b56d-fwr5x"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.618872 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.619113 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.619334 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.619406 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-fxf28"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.619474 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.634166 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5fd59b56d-fwr5x"]
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.713594 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46bccada-d49d-4bed-a7c7-e3ea2a64414a-scripts\") pod \"placement-5fd59b56d-fwr5x\" (UID: \"46bccada-d49d-4bed-a7c7-e3ea2a64414a\") " pod="openstack/placement-5fd59b56d-fwr5x"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.713679 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/46bccada-d49d-4bed-a7c7-e3ea2a64414a-public-tls-certs\") pod \"placement-5fd59b56d-fwr5x\" (UID: \"46bccada-d49d-4bed-a7c7-e3ea2a64414a\") " pod="openstack/placement-5fd59b56d-fwr5x"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.713738 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46bccada-d49d-4bed-a7c7-e3ea2a64414a-combined-ca-bundle\") pod \"placement-5fd59b56d-fwr5x\" (UID: \"46bccada-d49d-4bed-a7c7-e3ea2a64414a\") " pod="openstack/placement-5fd59b56d-fwr5x"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.713776 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/46bccada-d49d-4bed-a7c7-e3ea2a64414a-internal-tls-certs\") pod \"placement-5fd59b56d-fwr5x\" (UID: \"46bccada-d49d-4bed-a7c7-e3ea2a64414a\") " pod="openstack/placement-5fd59b56d-fwr5x"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.713802 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46bccada-d49d-4bed-a7c7-e3ea2a64414a-logs\") pod \"placement-5fd59b56d-fwr5x\" (UID: \"46bccada-d49d-4bed-a7c7-e3ea2a64414a\") " pod="openstack/placement-5fd59b56d-fwr5x"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.713821 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46bccada-d49d-4bed-a7c7-e3ea2a64414a-config-data\") pod \"placement-5fd59b56d-fwr5x\" (UID: \"46bccada-d49d-4bed-a7c7-e3ea2a64414a\") " pod="openstack/placement-5fd59b56d-fwr5x"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.713847 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcjmk\" (UniqueName: \"kubernetes.io/projected/46bccada-d49d-4bed-a7c7-e3ea2a64414a-kube-api-access-xcjmk\") pod \"placement-5fd59b56d-fwr5x\" (UID: \"46bccada-d49d-4bed-a7c7-e3ea2a64414a\") " pod="openstack/placement-5fd59b56d-fwr5x"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.775608 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7d7d985874-pzxvf"]
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.776912 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7d7d985874-pzxvf"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.782449 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.782718 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nx7b8"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.783087 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.783668 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.783796 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.783905 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.813551 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7d7d985874-pzxvf"]
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.824804 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46bccada-d49d-4bed-a7c7-e3ea2a64414a-combined-ca-bundle\") pod \"placement-5fd59b56d-fwr5x\" (UID: \"46bccada-d49d-4bed-a7c7-e3ea2a64414a\") " pod="openstack/placement-5fd59b56d-fwr5x"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.824863 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/46bccada-d49d-4bed-a7c7-e3ea2a64414a-internal-tls-certs\") pod \"placement-5fd59b56d-fwr5x\" (UID: \"46bccada-d49d-4bed-a7c7-e3ea2a64414a\") " pod="openstack/placement-5fd59b56d-fwr5x"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.824908 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46bccada-d49d-4bed-a7c7-e3ea2a64414a-logs\") pod \"placement-5fd59b56d-fwr5x\" (UID: \"46bccada-d49d-4bed-a7c7-e3ea2a64414a\") " pod="openstack/placement-5fd59b56d-fwr5x"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.824941 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46bccada-d49d-4bed-a7c7-e3ea2a64414a-config-data\") pod \"placement-5fd59b56d-fwr5x\" (UID: \"46bccada-d49d-4bed-a7c7-e3ea2a64414a\") " pod="openstack/placement-5fd59b56d-fwr5x"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.824969 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcjmk\" (UniqueName: \"kubernetes.io/projected/46bccada-d49d-4bed-a7c7-e3ea2a64414a-kube-api-access-xcjmk\") pod \"placement-5fd59b56d-fwr5x\" (UID: \"46bccada-d49d-4bed-a7c7-e3ea2a64414a\") " pod="openstack/placement-5fd59b56d-fwr5x"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.825028 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46bccada-d49d-4bed-a7c7-e3ea2a64414a-scripts\") pod \"placement-5fd59b56d-fwr5x\" (UID: \"46bccada-d49d-4bed-a7c7-e3ea2a64414a\") " pod="openstack/placement-5fd59b56d-fwr5x"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.825087 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/46bccada-d49d-4bed-a7c7-e3ea2a64414a-public-tls-certs\") pod \"placement-5fd59b56d-fwr5x\" (UID: \"46bccada-d49d-4bed-a7c7-e3ea2a64414a\") " pod="openstack/placement-5fd59b56d-fwr5x"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.830801 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46bccada-d49d-4bed-a7c7-e3ea2a64414a-logs\") pod \"placement-5fd59b56d-fwr5x\" (UID: \"46bccada-d49d-4bed-a7c7-e3ea2a64414a\") " pod="openstack/placement-5fd59b56d-fwr5x"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.847207 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/46bccada-d49d-4bed-a7c7-e3ea2a64414a-public-tls-certs\") pod \"placement-5fd59b56d-fwr5x\" (UID: \"46bccada-d49d-4bed-a7c7-e3ea2a64414a\") " pod="openstack/placement-5fd59b56d-fwr5x"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.847444 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46bccada-d49d-4bed-a7c7-e3ea2a64414a-scripts\") pod \"placement-5fd59b56d-fwr5x\" (UID: \"46bccada-d49d-4bed-a7c7-e3ea2a64414a\") " pod="openstack/placement-5fd59b56d-fwr5x"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.850261 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcjmk\" (UniqueName: \"kubernetes.io/projected/46bccada-d49d-4bed-a7c7-e3ea2a64414a-kube-api-access-xcjmk\") pod \"placement-5fd59b56d-fwr5x\" (UID: \"46bccada-d49d-4bed-a7c7-e3ea2a64414a\") " pod="openstack/placement-5fd59b56d-fwr5x"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.851374 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46bccada-d49d-4bed-a7c7-e3ea2a64414a-config-data\") pod \"placement-5fd59b56d-fwr5x\" (UID: \"46bccada-d49d-4bed-a7c7-e3ea2a64414a\") " pod="openstack/placement-5fd59b56d-fwr5x"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.853544 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46bccada-d49d-4bed-a7c7-e3ea2a64414a-combined-ca-bundle\") pod \"placement-5fd59b56d-fwr5x\" (UID: \"46bccada-d49d-4bed-a7c7-e3ea2a64414a\") " pod="openstack/placement-5fd59b56d-fwr5x"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.854257 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/46bccada-d49d-4bed-a7c7-e3ea2a64414a-internal-tls-certs\") pod \"placement-5fd59b56d-fwr5x\" (UID: \"46bccada-d49d-4bed-a7c7-e3ea2a64414a\") " pod="openstack/placement-5fd59b56d-fwr5x"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.864759 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5db554c9b9-tv2zt"]
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.866245 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5db554c9b9-tv2zt"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.871403 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-9fqtx"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.871778 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.871933 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.902802 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5db554c9b9-tv2zt"]
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.926413 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c918cc4-c647-4e53-8800-27ea182ea861-combined-ca-bundle\") pod \"keystone-7d7d985874-pzxvf\" (UID: \"9c918cc4-c647-4e53-8800-27ea182ea861\") " pod="openstack/keystone-7d7d985874-pzxvf"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.926684 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c918cc4-c647-4e53-8800-27ea182ea861-internal-tls-certs\") pod \"keystone-7d7d985874-pzxvf\" (UID: \"9c918cc4-c647-4e53-8800-27ea182ea861\") " pod="openstack/keystone-7d7d985874-pzxvf"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.926808 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9c918cc4-c647-4e53-8800-27ea182ea861-credential-keys\") pod \"keystone-7d7d985874-pzxvf\" (UID: \"9c918cc4-c647-4e53-8800-27ea182ea861\") " pod="openstack/keystone-7d7d985874-pzxvf"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.926890 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9c918cc4-c647-4e53-8800-27ea182ea861-fernet-keys\") pod \"keystone-7d7d985874-pzxvf\" (UID: \"9c918cc4-c647-4e53-8800-27ea182ea861\") " pod="openstack/keystone-7d7d985874-pzxvf"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.927041 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c918cc4-c647-4e53-8800-27ea182ea861-config-data\") pod \"keystone-7d7d985874-pzxvf\" (UID: \"9c918cc4-c647-4e53-8800-27ea182ea861\") " pod="openstack/keystone-7d7d985874-pzxvf"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.927118 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wrvh\" (UniqueName: \"kubernetes.io/projected/9c918cc4-c647-4e53-8800-27ea182ea861-kube-api-access-9wrvh\") pod \"keystone-7d7d985874-pzxvf\" (UID: \"9c918cc4-c647-4e53-8800-27ea182ea861\") " pod="openstack/keystone-7d7d985874-pzxvf"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.927192 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c918cc4-c647-4e53-8800-27ea182ea861-scripts\") pod \"keystone-7d7d985874-pzxvf\" (UID: \"9c918cc4-c647-4e53-8800-27ea182ea861\") " pod="openstack/keystone-7d7d985874-pzxvf"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.927271 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c918cc4-c647-4e53-8800-27ea182ea861-public-tls-certs\") pod \"keystone-7d7d985874-pzxvf\" (UID: \"9c918cc4-c647-4e53-8800-27ea182ea861\") " pod="openstack/keystone-7d7d985874-pzxvf"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.933762 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-8474dc9c5d-ld7wj"]
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.935334 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-8474dc9c5d-ld7wj"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.944258 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5fd59b56d-fwr5x"
Feb 02 10:56:25 crc kubenswrapper[4901]: I0202 10:56:25.954253 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.032360 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wrvh\" (UniqueName: \"kubernetes.io/projected/9c918cc4-c647-4e53-8800-27ea182ea861-kube-api-access-9wrvh\") pod \"keystone-7d7d985874-pzxvf\" (UID: \"9c918cc4-c647-4e53-8800-27ea182ea861\") " pod="openstack/keystone-7d7d985874-pzxvf"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.054181 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c918cc4-c647-4e53-8800-27ea182ea861-scripts\") pod \"keystone-7d7d985874-pzxvf\" (UID: \"9c918cc4-c647-4e53-8800-27ea182ea861\") " pod="openstack/keystone-7d7d985874-pzxvf"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.054256 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c918cc4-c647-4e53-8800-27ea182ea861-public-tls-certs\") pod \"keystone-7d7d985874-pzxvf\" (UID: \"9c918cc4-c647-4e53-8800-27ea182ea861\") " pod="openstack/keystone-7d7d985874-pzxvf"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.054306 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/32596207-e054-472e-bd8b-9d4bdc150142-config-data-custom\") pod \"barbican-worker-5db554c9b9-tv2zt\" (UID: \"32596207-e054-472e-bd8b-9d4bdc150142\") " pod="openstack/barbican-worker-5db554c9b9-tv2zt"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.054387 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c918cc4-c647-4e53-8800-27ea182ea861-combined-ca-bundle\") pod \"keystone-7d7d985874-pzxvf\" (UID: \"9c918cc4-c647-4e53-8800-27ea182ea861\") " pod="openstack/keystone-7d7d985874-pzxvf"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.054421 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f63f610-92a5-478b-af86-f103565142f0-config-data\") pod \"barbican-keystone-listener-8474dc9c5d-ld7wj\" (UID: \"1f63f610-92a5-478b-af86-f103565142f0\") " pod="openstack/barbican-keystone-listener-8474dc9c5d-ld7wj"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.054447 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f63f610-92a5-478b-af86-f103565142f0-logs\") pod \"barbican-keystone-listener-8474dc9c5d-ld7wj\" (UID: \"1f63f610-92a5-478b-af86-f103565142f0\") " pod="openstack/barbican-keystone-listener-8474dc9c5d-ld7wj"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.054484 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c918cc4-c647-4e53-8800-27ea182ea861-internal-tls-certs\") pod \"keystone-7d7d985874-pzxvf\" (UID: \"9c918cc4-c647-4e53-8800-27ea182ea861\") " pod="openstack/keystone-7d7d985874-pzxvf"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.054606 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9c918cc4-c647-4e53-8800-27ea182ea861-credential-keys\") pod \"keystone-7d7d985874-pzxvf\" (UID: \"9c918cc4-c647-4e53-8800-27ea182ea861\") " pod="openstack/keystone-7d7d985874-pzxvf"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.054657 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9c918cc4-c647-4e53-8800-27ea182ea861-fernet-keys\") pod \"keystone-7d7d985874-pzxvf\" (UID: \"9c918cc4-c647-4e53-8800-27ea182ea861\") " pod="openstack/keystone-7d7d985874-pzxvf"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.054709 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f63f610-92a5-478b-af86-f103565142f0-combined-ca-bundle\") pod \"barbican-keystone-listener-8474dc9c5d-ld7wj\" (UID: \"1f63f610-92a5-478b-af86-f103565142f0\") " pod="openstack/barbican-keystone-listener-8474dc9c5d-ld7wj"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.054773 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32596207-e054-472e-bd8b-9d4bdc150142-config-data\") pod \"barbican-worker-5db554c9b9-tv2zt\" (UID: \"32596207-e054-472e-bd8b-9d4bdc150142\") " pod="openstack/barbican-worker-5db554c9b9-tv2zt"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.054820 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfjl5\" (UniqueName: \"kubernetes.io/projected/32596207-e054-472e-bd8b-9d4bdc150142-kube-api-access-dfjl5\") pod \"barbican-worker-5db554c9b9-tv2zt\" (UID: \"32596207-e054-472e-bd8b-9d4bdc150142\") " pod="openstack/barbican-worker-5db554c9b9-tv2zt"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.054939 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdxhl\" (UniqueName: \"kubernetes.io/projected/1f63f610-92a5-478b-af86-f103565142f0-kube-api-access-gdxhl\") pod \"barbican-keystone-listener-8474dc9c5d-ld7wj\" (UID: \"1f63f610-92a5-478b-af86-f103565142f0\") " pod="openstack/barbican-keystone-listener-8474dc9c5d-ld7wj"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.055038 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32596207-e054-472e-bd8b-9d4bdc150142-logs\") pod \"barbican-worker-5db554c9b9-tv2zt\" (UID: \"32596207-e054-472e-bd8b-9d4bdc150142\") " pod="openstack/barbican-worker-5db554c9b9-tv2zt"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.055084 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c918cc4-c647-4e53-8800-27ea182ea861-config-data\") pod \"keystone-7d7d985874-pzxvf\" (UID: \"9c918cc4-c647-4e53-8800-27ea182ea861\") " pod="openstack/keystone-7d7d985874-pzxvf"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.071723 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32596207-e054-472e-bd8b-9d4bdc150142-combined-ca-bundle\") pod \"barbican-worker-5db554c9b9-tv2zt\" (UID: \"32596207-e054-472e-bd8b-9d4bdc150142\") " pod="openstack/barbican-worker-5db554c9b9-tv2zt"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.071779 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f63f610-92a5-478b-af86-f103565142f0-config-data-custom\") pod \"barbican-keystone-listener-8474dc9c5d-ld7wj\" (UID: \"1f63f610-92a5-478b-af86-f103565142f0\") " pod="openstack/barbican-keystone-listener-8474dc9c5d-ld7wj"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.083925 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9c918cc4-c647-4e53-8800-27ea182ea861-fernet-keys\") pod \"keystone-7d7d985874-pzxvf\" (UID: \"9c918cc4-c647-4e53-8800-27ea182ea861\") " pod="openstack/keystone-7d7d985874-pzxvf"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.100263 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-8474dc9c5d-ld7wj"]
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.115394 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c918cc4-c647-4e53-8800-27ea182ea861-scripts\") pod \"keystone-7d7d985874-pzxvf\" (UID: \"9c918cc4-c647-4e53-8800-27ea182ea861\") " pod="openstack/keystone-7d7d985874-pzxvf"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.117683 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-ccq56"]
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.124803 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7b667979-ccq56" podUID="f8829fb9-be39-448c-9f96-cfc98534248a" containerName="dnsmasq-dns" containerID="cri-o://1c8fd39bee50f2445e992bbd4a36665ce4181637e5c8385b6dea1225f6abc6a7" gracePeriod=10
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.126532 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wrvh\" (UniqueName: \"kubernetes.io/projected/9c918cc4-c647-4e53-8800-27ea182ea861-kube-api-access-9wrvh\") pod \"keystone-7d7d985874-pzxvf\" (UID: \"9c918cc4-c647-4e53-8800-27ea182ea861\") " pod="openstack/keystone-7d7d985874-pzxvf"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.127067 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c918cc4-c647-4e53-8800-27ea182ea861-public-tls-certs\") pod \"keystone-7d7d985874-pzxvf\" (UID: \"9c918cc4-c647-4e53-8800-27ea182ea861\") " pod="openstack/keystone-7d7d985874-pzxvf"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.127398 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7b667979-ccq56"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.129421 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c918cc4-c647-4e53-8800-27ea182ea861-internal-tls-certs\") pod \"keystone-7d7d985874-pzxvf\" (UID: \"9c918cc4-c647-4e53-8800-27ea182ea861\") " pod="openstack/keystone-7d7d985874-pzxvf"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.129448 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9c918cc4-c647-4e53-8800-27ea182ea861-credential-keys\") pod \"keystone-7d7d985874-pzxvf\" (UID: \"9c918cc4-c647-4e53-8800-27ea182ea861\") " pod="openstack/keystone-7d7d985874-pzxvf"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.137908 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c918cc4-c647-4e53-8800-27ea182ea861-combined-ca-bundle\") pod \"keystone-7d7d985874-pzxvf\" (UID: \"9c918cc4-c647-4e53-8800-27ea182ea861\") " pod="openstack/keystone-7d7d985874-pzxvf"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.141409 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c918cc4-c647-4e53-8800-27ea182ea861-config-data\") pod \"keystone-7d7d985874-pzxvf\" (UID: \"9c918cc4-c647-4e53-8800-27ea182ea861\") " pod="openstack/keystone-7d7d985874-pzxvf"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.178065 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f63f610-92a5-478b-af86-f103565142f0-config-data\") pod \"barbican-keystone-listener-8474dc9c5d-ld7wj\" (UID: \"1f63f610-92a5-478b-af86-f103565142f0\") " pod="openstack/barbican-keystone-listener-8474dc9c5d-ld7wj"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.180101 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f63f610-92a5-478b-af86-f103565142f0-logs\") pod \"barbican-keystone-listener-8474dc9c5d-ld7wj\" (UID: \"1f63f610-92a5-478b-af86-f103565142f0\") " pod="openstack/barbican-keystone-listener-8474dc9c5d-ld7wj"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.180827 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f63f610-92a5-478b-af86-f103565142f0-logs\") pod \"barbican-keystone-listener-8474dc9c5d-ld7wj\" (UID: \"1f63f610-92a5-478b-af86-f103565142f0\") " pod="openstack/barbican-keystone-listener-8474dc9c5d-ld7wj"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.181036 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f63f610-92a5-478b-af86-f103565142f0-combined-ca-bundle\") pod \"barbican-keystone-listener-8474dc9c5d-ld7wj\" (UID: \"1f63f610-92a5-478b-af86-f103565142f0\") " pod="openstack/barbican-keystone-listener-8474dc9c5d-ld7wj"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.182299 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32596207-e054-472e-bd8b-9d4bdc150142-config-data\") pod \"barbican-worker-5db554c9b9-tv2zt\" (UID: \"32596207-e054-472e-bd8b-9d4bdc150142\") " pod="openstack/barbican-worker-5db554c9b9-tv2zt"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.182410 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfjl5\" (UniqueName: \"kubernetes.io/projected/32596207-e054-472e-bd8b-9d4bdc150142-kube-api-access-dfjl5\") pod \"barbican-worker-5db554c9b9-tv2zt\" (UID: \"32596207-e054-472e-bd8b-9d4bdc150142\") " pod="openstack/barbican-worker-5db554c9b9-tv2zt"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.182595 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdxhl\" (UniqueName: \"kubernetes.io/projected/1f63f610-92a5-478b-af86-f103565142f0-kube-api-access-gdxhl\") pod \"barbican-keystone-listener-8474dc9c5d-ld7wj\" (UID: \"1f63f610-92a5-478b-af86-f103565142f0\") " pod="openstack/barbican-keystone-listener-8474dc9c5d-ld7wj"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.182721 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32596207-e054-472e-bd8b-9d4bdc150142-logs\") pod \"barbican-worker-5db554c9b9-tv2zt\" (UID: \"32596207-e054-472e-bd8b-9d4bdc150142\") " pod="openstack/barbican-worker-5db554c9b9-tv2zt"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.182806 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32596207-e054-472e-bd8b-9d4bdc150142-combined-ca-bundle\") pod \"barbican-worker-5db554c9b9-tv2zt\" (UID: \"32596207-e054-472e-bd8b-9d4bdc150142\") " pod="openstack/barbican-worker-5db554c9b9-tv2zt"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.182830 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f63f610-92a5-478b-af86-f103565142f0-config-data-custom\") pod \"barbican-keystone-listener-8474dc9c5d-ld7wj\" (UID: \"1f63f610-92a5-478b-af86-f103565142f0\") " pod="openstack/barbican-keystone-listener-8474dc9c5d-ld7wj"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.182913 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/32596207-e054-472e-bd8b-9d4bdc150142-config-data-custom\") pod \"barbican-worker-5db554c9b9-tv2zt\" (UID: \"32596207-e054-472e-bd8b-9d4bdc150142\") " pod="openstack/barbican-worker-5db554c9b9-tv2zt"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.183981 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32596207-e054-472e-bd8b-9d4bdc150142-logs\") pod \"barbican-worker-5db554c9b9-tv2zt\" (UID: \"32596207-e054-472e-bd8b-9d4bdc150142\") " pod="openstack/barbican-worker-5db554c9b9-tv2zt"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.201391 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f63f610-92a5-478b-af86-f103565142f0-combined-ca-bundle\") pod \"barbican-keystone-listener-8474dc9c5d-ld7wj\" (UID: \"1f63f610-92a5-478b-af86-f103565142f0\") " pod="openstack/barbican-keystone-listener-8474dc9c5d-ld7wj"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.206297 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32596207-e054-472e-bd8b-9d4bdc150142-combined-ca-bundle\") pod \"barbican-worker-5db554c9b9-tv2zt\" (UID: \"32596207-e054-472e-bd8b-9d4bdc150142\") " pod="openstack/barbican-worker-5db554c9b9-tv2zt"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.207172 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/32596207-e054-472e-bd8b-9d4bdc150142-config-data-custom\") pod \"barbican-worker-5db554c9b9-tv2zt\" (UID: \"32596207-e054-472e-bd8b-9d4bdc150142\") " pod="openstack/barbican-worker-5db554c9b9-tv2zt"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.209116 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f63f610-92a5-478b-af86-f103565142f0-config-data-custom\") pod \"barbican-keystone-listener-8474dc9c5d-ld7wj\" (UID: \"1f63f610-92a5-478b-af86-f103565142f0\") " pod="openstack/barbican-keystone-listener-8474dc9c5d-ld7wj"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.215128 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f63f610-92a5-478b-af86-f103565142f0-config-data\") pod \"barbican-keystone-listener-8474dc9c5d-ld7wj\" (UID: \"1f63f610-92a5-478b-af86-f103565142f0\") " pod="openstack/barbican-keystone-listener-8474dc9c5d-ld7wj"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.216004 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-vk7th"]
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.233654 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdxhl\" (UniqueName: \"kubernetes.io/projected/1f63f610-92a5-478b-af86-f103565142f0-kube-api-access-gdxhl\") pod \"barbican-keystone-listener-8474dc9c5d-ld7wj\" (UID: \"1f63f610-92a5-478b-af86-f103565142f0\") " pod="openstack/barbican-keystone-listener-8474dc9c5d-ld7wj"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.249176 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-vk7th"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.265525 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-vk7th"]
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.280322 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32596207-e054-472e-bd8b-9d4bdc150142-config-data\") pod \"barbican-worker-5db554c9b9-tv2zt\" (UID: \"32596207-e054-472e-bd8b-9d4bdc150142\") " pod="openstack/barbican-worker-5db554c9b9-tv2zt"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.282650 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-558d599c79-dct6d"]
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.287428 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-558d599c79-dct6d"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.307643 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7b4d58f5d-kkj8n"]
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.324583 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfjl5\" (UniqueName: \"kubernetes.io/projected/32596207-e054-472e-bd8b-9d4bdc150142-kube-api-access-dfjl5\") pod \"barbican-worker-5db554c9b9-tv2zt\" (UID: \"32596207-e054-472e-bd8b-9d4bdc150142\") " pod="openstack/barbican-worker-5db554c9b9-tv2zt"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.325659 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7b4d58f5d-kkj8n"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.352653 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7b4d58f5d-kkj8n"]
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.378412 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-558d599c79-dct6d"]
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.385330 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5db554c9b9-tv2zt"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.387303 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vph5q\" (UniqueName: \"kubernetes.io/projected/1644a512-eb6b-4023-9961-f42ff4bdbbe6-kube-api-access-vph5q\") pod \"barbican-worker-558d599c79-dct6d\" (UID: \"1644a512-eb6b-4023-9961-f42ff4bdbbe6\") " pod="openstack/barbican-worker-558d599c79-dct6d"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.387344 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1644a512-eb6b-4023-9961-f42ff4bdbbe6-config-data\") pod \"barbican-worker-558d599c79-dct6d\" (UID: \"1644a512-eb6b-4023-9961-f42ff4bdbbe6\") " pod="openstack/barbican-worker-558d599c79-dct6d"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.387374 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1644a512-eb6b-4023-9961-f42ff4bdbbe6-combined-ca-bundle\") pod \"barbican-worker-558d599c79-dct6d\" (UID: \"1644a512-eb6b-4023-9961-f42ff4bdbbe6\") " pod="openstack/barbican-worker-558d599c79-dct6d"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.387406 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06feef03-ccd0-4fa8-8939-852df217d3a1-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-vk7th\" (UID: \"06feef03-ccd0-4fa8-8939-852df217d3a1\") " pod="openstack/dnsmasq-dns-848cf88cfc-vk7th"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.387443 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06feef03-ccd0-4fa8-8939-852df217d3a1-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-vk7th\" (UID: \"06feef03-ccd0-4fa8-8939-852df217d3a1\") " pod="openstack/dnsmasq-dns-848cf88cfc-vk7th"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.387469 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06feef03-ccd0-4fa8-8939-852df217d3a1-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-vk7th\" (UID: \"06feef03-ccd0-4fa8-8939-852df217d3a1\") " pod="openstack/dnsmasq-dns-848cf88cfc-vk7th"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.387497 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sssgf\" (UniqueName: \"kubernetes.io/projected/06feef03-ccd0-4fa8-8939-852df217d3a1-kube-api-access-sssgf\") pod \"dnsmasq-dns-848cf88cfc-vk7th\" (UID: \"06feef03-ccd0-4fa8-8939-852df217d3a1\") " pod="openstack/dnsmasq-dns-848cf88cfc-vk7th"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.387532 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1644a512-eb6b-4023-9961-f42ff4bdbbe6-config-data-custom\") pod \"barbican-worker-558d599c79-dct6d\" (UID: \"1644a512-eb6b-4023-9961-f42ff4bdbbe6\") " pod="openstack/barbican-worker-558d599c79-dct6d"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.387553 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/06feef03-ccd0-4fa8-8939-852df217d3a1-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-vk7th\" (UID: \"06feef03-ccd0-4fa8-8939-852df217d3a1\") " pod="openstack/dnsmasq-dns-848cf88cfc-vk7th"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.389834 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1644a512-eb6b-4023-9961-f42ff4bdbbe6-logs\") pod \"barbican-worker-558d599c79-dct6d\" (UID: \"1644a512-eb6b-4023-9961-f42ff4bdbbe6\") " pod="openstack/barbican-worker-558d599c79-dct6d"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.389861 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06feef03-ccd0-4fa8-8939-852df217d3a1-config\") pod \"dnsmasq-dns-848cf88cfc-vk7th\" (UID: \"06feef03-ccd0-4fa8-8939-852df217d3a1\") " pod="openstack/dnsmasq-dns-848cf88cfc-vk7th"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.401534 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-b6f6785c4-59xw6"]
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.403522 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-b6f6785c4-59xw6"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.407401 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.411777 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7d7d985874-pzxvf"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.415197 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-b6f6785c4-59xw6"]
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.431445 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-8474dc9c5d-ld7wj"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.491937 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa18cb0b-f1e0-4a57-90e0-daeede8334de-config-data\") pod \"barbican-api-b6f6785c4-59xw6\" (UID: \"aa18cb0b-f1e0-4a57-90e0-daeede8334de\") " pod="openstack/barbican-api-b6f6785c4-59xw6"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.492049 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vph5q\" (UniqueName: \"kubernetes.io/projected/1644a512-eb6b-4023-9961-f42ff4bdbbe6-kube-api-access-vph5q\") pod \"barbican-worker-558d599c79-dct6d\" (UID: \"1644a512-eb6b-4023-9961-f42ff4bdbbe6\") " pod="openstack/barbican-worker-558d599c79-dct6d"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.492113 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1644a512-eb6b-4023-9961-f42ff4bdbbe6-config-data\") pod \"barbican-worker-558d599c79-dct6d\" (UID: \"1644a512-eb6b-4023-9961-f42ff4bdbbe6\") " pod="openstack/barbican-worker-558d599c79-dct6d"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.492140 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1644a512-eb6b-4023-9961-f42ff4bdbbe6-combined-ca-bundle\") pod \"barbican-worker-558d599c79-dct6d\" (UID: \"1644a512-eb6b-4023-9961-f42ff4bdbbe6\") " pod="openstack/barbican-worker-558d599c79-dct6d"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.492189 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06feef03-ccd0-4fa8-8939-852df217d3a1-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-vk7th\" (UID: \"06feef03-ccd0-4fa8-8939-852df217d3a1\") " pod="openstack/dnsmasq-dns-848cf88cfc-vk7th"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.492246 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06feef03-ccd0-4fa8-8939-852df217d3a1-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-vk7th\" (UID: \"06feef03-ccd0-4fa8-8939-852df217d3a1\") " pod="openstack/dnsmasq-dns-848cf88cfc-vk7th"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.492278 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7c1d42a9-290c-4bff-b0cb-89507651dae4-config-data-custom\") pod \"barbican-keystone-listener-7b4d58f5d-kkj8n\" (UID: \"7c1d42a9-290c-4bff-b0cb-89507651dae4\") " pod="openstack/barbican-keystone-listener-7b4d58f5d-kkj8n"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.492309 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06feef03-ccd0-4fa8-8939-852df217d3a1-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-vk7th\" (UID: \"06feef03-ccd0-4fa8-8939-852df217d3a1\") " pod="openstack/dnsmasq-dns-848cf88cfc-vk7th"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.492338 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c1d42a9-290c-4bff-b0cb-89507651dae4-config-data\") pod \"barbican-keystone-listener-7b4d58f5d-kkj8n\" (UID: \"7c1d42a9-290c-4bff-b0cb-89507651dae4\") " pod="openstack/barbican-keystone-listener-7b4d58f5d-kkj8n"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.492373 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nlwn\" (UniqueName: \"kubernetes.io/projected/aa18cb0b-f1e0-4a57-90e0-daeede8334de-kube-api-access-5nlwn\") pod \"barbican-api-b6f6785c4-59xw6\" (UID: \"aa18cb0b-f1e0-4a57-90e0-daeede8334de\") " pod="openstack/barbican-api-b6f6785c4-59xw6"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.492425 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sssgf\" (UniqueName: \"kubernetes.io/projected/06feef03-ccd0-4fa8-8939-852df217d3a1-kube-api-access-sssgf\") pod \"dnsmasq-dns-848cf88cfc-vk7th\" (UID: \"06feef03-ccd0-4fa8-8939-852df217d3a1\") " pod="openstack/dnsmasq-dns-848cf88cfc-vk7th"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.492454 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa18cb0b-f1e0-4a57-90e0-daeede8334de-config-data-custom\") pod \"barbican-api-b6f6785c4-59xw6\" (UID: \"aa18cb0b-f1e0-4a57-90e0-daeede8334de\") " pod="openstack/barbican-api-b6f6785c4-59xw6"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.492592 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa18cb0b-f1e0-4a57-90e0-daeede8334de-logs\") pod \"barbican-api-b6f6785c4-59xw6\" (UID: \"aa18cb0b-f1e0-4a57-90e0-daeede8334de\") " pod="openstack/barbican-api-b6f6785c4-59xw6"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.492624 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa18cb0b-f1e0-4a57-90e0-daeede8334de-combined-ca-bundle\") pod \"barbican-api-b6f6785c4-59xw6\" (UID: \"aa18cb0b-f1e0-4a57-90e0-daeede8334de\") " pod="openstack/barbican-api-b6f6785c4-59xw6"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.492649 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1644a512-eb6b-4023-9961-f42ff4bdbbe6-config-data-custom\") pod \"barbican-worker-558d599c79-dct6d\" (UID: \"1644a512-eb6b-4023-9961-f42ff4bdbbe6\") " pod="openstack/barbican-worker-558d599c79-dct6d"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.492678 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/06feef03-ccd0-4fa8-8939-852df217d3a1-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-vk7th\" (UID: \"06feef03-ccd0-4fa8-8939-852df217d3a1\") " pod="openstack/dnsmasq-dns-848cf88cfc-vk7th"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.492703 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c1d42a9-290c-4bff-b0cb-89507651dae4-logs\") pod \"barbican-keystone-listener-7b4d58f5d-kkj8n\" (UID: \"7c1d42a9-290c-4bff-b0cb-89507651dae4\") " pod="openstack/barbican-keystone-listener-7b4d58f5d-kkj8n"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.492748 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxr7l\" (UniqueName: \"kubernetes.io/projected/7c1d42a9-290c-4bff-b0cb-89507651dae4-kube-api-access-jxr7l\") pod \"barbican-keystone-listener-7b4d58f5d-kkj8n\" (UID: \"7c1d42a9-290c-4bff-b0cb-89507651dae4\") " pod="openstack/barbican-keystone-listener-7b4d58f5d-kkj8n"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.492783 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c1d42a9-290c-4bff-b0cb-89507651dae4-combined-ca-bundle\") pod \"barbican-keystone-listener-7b4d58f5d-kkj8n\" (UID: \"7c1d42a9-290c-4bff-b0cb-89507651dae4\") " pod="openstack/barbican-keystone-listener-7b4d58f5d-kkj8n"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.492832 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1644a512-eb6b-4023-9961-f42ff4bdbbe6-logs\") pod \"barbican-worker-558d599c79-dct6d\" (UID: \"1644a512-eb6b-4023-9961-f42ff4bdbbe6\") " pod="openstack/barbican-worker-558d599c79-dct6d"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.492878 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06feef03-ccd0-4fa8-8939-852df217d3a1-config\") pod \"dnsmasq-dns-848cf88cfc-vk7th\" (UID: \"06feef03-ccd0-4fa8-8939-852df217d3a1\") " pod="openstack/dnsmasq-dns-848cf88cfc-vk7th"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.494076 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06feef03-ccd0-4fa8-8939-852df217d3a1-config\") pod \"dnsmasq-dns-848cf88cfc-vk7th\" (UID: \"06feef03-ccd0-4fa8-8939-852df217d3a1\") " pod="openstack/dnsmasq-dns-848cf88cfc-vk7th"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.501186 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/06feef03-ccd0-4fa8-8939-852df217d3a1-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-vk7th\" (UID: \"06feef03-ccd0-4fa8-8939-852df217d3a1\") " pod="openstack/dnsmasq-dns-848cf88cfc-vk7th"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.504024 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1644a512-eb6b-4023-9961-f42ff4bdbbe6-logs\") pod \"barbican-worker-558d599c79-dct6d\" (UID: \"1644a512-eb6b-4023-9961-f42ff4bdbbe6\") " pod="openstack/barbican-worker-558d599c79-dct6d"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.505055 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06feef03-ccd0-4fa8-8939-852df217d3a1-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-vk7th\" (UID: \"06feef03-ccd0-4fa8-8939-852df217d3a1\") " pod="openstack/dnsmasq-dns-848cf88cfc-vk7th"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.506198 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06feef03-ccd0-4fa8-8939-852df217d3a1-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-vk7th\" (UID: \"06feef03-ccd0-4fa8-8939-852df217d3a1\") " pod="openstack/dnsmasq-dns-848cf88cfc-vk7th"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.512828 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06feef03-ccd0-4fa8-8939-852df217d3a1-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-vk7th\" (UID: \"06feef03-ccd0-4fa8-8939-852df217d3a1\") " pod="openstack/dnsmasq-dns-848cf88cfc-vk7th"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.514154 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1644a512-eb6b-4023-9961-f42ff4bdbbe6-config-data\") pod \"barbican-worker-558d599c79-dct6d\" (UID: \"1644a512-eb6b-4023-9961-f42ff4bdbbe6\") " pod="openstack/barbican-worker-558d599c79-dct6d"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.514497 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1644a512-eb6b-4023-9961-f42ff4bdbbe6-config-data-custom\") pod \"barbican-worker-558d599c79-dct6d\" (UID: \"1644a512-eb6b-4023-9961-f42ff4bdbbe6\") " pod="openstack/barbican-worker-558d599c79-dct6d"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.525080 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1644a512-eb6b-4023-9961-f42ff4bdbbe6-combined-ca-bundle\") pod \"barbican-worker-558d599c79-dct6d\" (UID: \"1644a512-eb6b-4023-9961-f42ff4bdbbe6\") " pod="openstack/barbican-worker-558d599c79-dct6d"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.525375 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vph5q\" (UniqueName: \"kubernetes.io/projected/1644a512-eb6b-4023-9961-f42ff4bdbbe6-kube-api-access-vph5q\") pod \"barbican-worker-558d599c79-dct6d\" (UID: \"1644a512-eb6b-4023-9961-f42ff4bdbbe6\") " pod="openstack/barbican-worker-558d599c79-dct6d"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.545919 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sssgf\" (UniqueName: \"kubernetes.io/projected/06feef03-ccd0-4fa8-8939-852df217d3a1-kube-api-access-sssgf\") pod \"dnsmasq-dns-848cf88cfc-vk7th\" (UID: \"06feef03-ccd0-4fa8-8939-852df217d3a1\") " pod="openstack/dnsmasq-dns-848cf88cfc-vk7th"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.596140 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa18cb0b-f1e0-4a57-90e0-daeede8334de-config-data-custom\") pod \"barbican-api-b6f6785c4-59xw6\" (UID: \"aa18cb0b-f1e0-4a57-90e0-daeede8334de\") " pod="openstack/barbican-api-b6f6785c4-59xw6"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.596257 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa18cb0b-f1e0-4a57-90e0-daeede8334de-logs\") pod \"barbican-api-b6f6785c4-59xw6\" (UID: \"aa18cb0b-f1e0-4a57-90e0-daeede8334de\") " pod="openstack/barbican-api-b6f6785c4-59xw6"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.596302 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa18cb0b-f1e0-4a57-90e0-daeede8334de-combined-ca-bundle\") pod \"barbican-api-b6f6785c4-59xw6\" (UID: \"aa18cb0b-f1e0-4a57-90e0-daeede8334de\") " pod="openstack/barbican-api-b6f6785c4-59xw6"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.596379 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c1d42a9-290c-4bff-b0cb-89507651dae4-logs\") pod \"barbican-keystone-listener-7b4d58f5d-kkj8n\" (UID: \"7c1d42a9-290c-4bff-b0cb-89507651dae4\") " pod="openstack/barbican-keystone-listener-7b4d58f5d-kkj8n"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.596412 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxr7l\" (UniqueName: \"kubernetes.io/projected/7c1d42a9-290c-4bff-b0cb-89507651dae4-kube-api-access-jxr7l\") pod \"barbican-keystone-listener-7b4d58f5d-kkj8n\" (UID: \"7c1d42a9-290c-4bff-b0cb-89507651dae4\") " pod="openstack/barbican-keystone-listener-7b4d58f5d-kkj8n"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.596446 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c1d42a9-290c-4bff-b0cb-89507651dae4-combined-ca-bundle\") pod \"barbican-keystone-listener-7b4d58f5d-kkj8n\" (UID: \"7c1d42a9-290c-4bff-b0cb-89507651dae4\") " pod="openstack/barbican-keystone-listener-7b4d58f5d-kkj8n"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.596511 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa18cb0b-f1e0-4a57-90e0-daeede8334de-config-data\") pod \"barbican-api-b6f6785c4-59xw6\" (UID: \"aa18cb0b-f1e0-4a57-90e0-daeede8334de\") " pod="openstack/barbican-api-b6f6785c4-59xw6"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.596739 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7c1d42a9-290c-4bff-b0cb-89507651dae4-config-data-custom\") pod \"barbican-keystone-listener-7b4d58f5d-kkj8n\" (UID: \"7c1d42a9-290c-4bff-b0cb-89507651dae4\") " pod="openstack/barbican-keystone-listener-7b4d58f5d-kkj8n"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.596794 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c1d42a9-290c-4bff-b0cb-89507651dae4-config-data\") pod \"barbican-keystone-listener-7b4d58f5d-kkj8n\" (UID: \"7c1d42a9-290c-4bff-b0cb-89507651dae4\") " pod="openstack/barbican-keystone-listener-7b4d58f5d-kkj8n"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.596821 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nlwn\" (UniqueName: \"kubernetes.io/projected/aa18cb0b-f1e0-4a57-90e0-daeede8334de-kube-api-access-5nlwn\") pod \"barbican-api-b6f6785c4-59xw6\" (UID: \"aa18cb0b-f1e0-4a57-90e0-daeede8334de\") " pod="openstack/barbican-api-b6f6785c4-59xw6"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.600257 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa18cb0b-f1e0-4a57-90e0-daeede8334de-logs\") pod \"barbican-api-b6f6785c4-59xw6\" (UID: \"aa18cb0b-f1e0-4a57-90e0-daeede8334de\") " pod="openstack/barbican-api-b6f6785c4-59xw6"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.600411 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c1d42a9-290c-4bff-b0cb-89507651dae4-logs\") pod \"barbican-keystone-listener-7b4d58f5d-kkj8n\" (UID: \"7c1d42a9-290c-4bff-b0cb-89507651dae4\") " pod="openstack/barbican-keystone-listener-7b4d58f5d-kkj8n"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.604977 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa18cb0b-f1e0-4a57-90e0-daeede8334de-config-data\") pod \"barbican-api-b6f6785c4-59xw6\" (UID: \"aa18cb0b-f1e0-4a57-90e0-daeede8334de\") " pod="openstack/barbican-api-b6f6785c4-59xw6"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.607524 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa18cb0b-f1e0-4a57-90e0-daeede8334de-config-data-custom\") pod \"barbican-api-b6f6785c4-59xw6\" (UID: \"aa18cb0b-f1e0-4a57-90e0-daeede8334de\") " pod="openstack/barbican-api-b6f6785c4-59xw6"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.608236 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c1d42a9-290c-4bff-b0cb-89507651dae4-combined-ca-bundle\") pod \"barbican-keystone-listener-7b4d58f5d-kkj8n\" (UID: \"7c1d42a9-290c-4bff-b0cb-89507651dae4\") " pod="openstack/barbican-keystone-listener-7b4d58f5d-kkj8n"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.608636 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa18cb0b-f1e0-4a57-90e0-daeede8334de-combined-ca-bundle\") pod \"barbican-api-b6f6785c4-59xw6\" (UID: \"aa18cb0b-f1e0-4a57-90e0-daeede8334de\") " pod="openstack/barbican-api-b6f6785c4-59xw6"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.611596 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7c1d42a9-290c-4bff-b0cb-89507651dae4-config-data-custom\") pod \"barbican-keystone-listener-7b4d58f5d-kkj8n\" (UID: \"7c1d42a9-290c-4bff-b0cb-89507651dae4\") " pod="openstack/barbican-keystone-listener-7b4d58f5d-kkj8n"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.622950 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nlwn\" (UniqueName: \"kubernetes.io/projected/aa18cb0b-f1e0-4a57-90e0-daeede8334de-kube-api-access-5nlwn\") pod \"barbican-api-b6f6785c4-59xw6\" (UID: \"aa18cb0b-f1e0-4a57-90e0-daeede8334de\") " pod="openstack/barbican-api-b6f6785c4-59xw6"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.623028 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c1d42a9-290c-4bff-b0cb-89507651dae4-config-data\") pod \"barbican-keystone-listener-7b4d58f5d-kkj8n\" (UID: \"7c1d42a9-290c-4bff-b0cb-89507651dae4\") " pod="openstack/barbican-keystone-listener-7b4d58f5d-kkj8n"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.642019 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxr7l\" (UniqueName: \"kubernetes.io/projected/7c1d42a9-290c-4bff-b0cb-89507651dae4-kube-api-access-jxr7l\") pod \"barbican-keystone-listener-7b4d58f5d-kkj8n\" (UID: \"7c1d42a9-290c-4bff-b0cb-89507651dae4\") " pod="openstack/barbican-keystone-listener-7b4d58f5d-kkj8n"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.685659 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5fd59b56d-fwr5x"]
Feb 02 10:56:26 crc kubenswrapper[4901]: W0202 10:56:26.692200 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46bccada_d49d_4bed_a7c7_e3ea2a64414a.slice/crio-cd83b9c155cc2efa77b419230cf3db17fa7ffe3db4d8ee0298a64656096e11ae WatchSource:0}: Error finding container cd83b9c155cc2efa77b419230cf3db17fa7ffe3db4d8ee0298a64656096e11ae: Status 404 returned error can't find the container with id cd83b9c155cc2efa77b419230cf3db17fa7ffe3db4d8ee0298a64656096e11ae
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.726312 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-vk7th"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.741330 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-558d599c79-dct6d"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.762166 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7b4d58f5d-kkj8n"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.768125 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-b6f6785c4-59xw6"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.843729 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7ff445bb86-z9hcr"]
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.845547 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7ff445bb86-z9hcr"
Feb 02 10:56:26 crc kubenswrapper[4901]: I0202 10:56:26.858846 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7ff445bb86-z9hcr"]
Feb 02 10:56:27 crc kubenswrapper[4901]: I0202 10:56:27.008170 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37a5ab35-b3f0-4065-bcbe-5c784dd6d02c-combined-ca-bundle\") pod \"placement-7ff445bb86-z9hcr\" (UID: \"37a5ab35-b3f0-4065-bcbe-5c784dd6d02c\") " pod="openstack/placement-7ff445bb86-z9hcr"
Feb 02 10:56:27 crc kubenswrapper[4901]: I0202 10:56:27.008869 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37a5ab35-b3f0-4065-bcbe-5c784dd6d02c-public-tls-certs\") pod \"placement-7ff445bb86-z9hcr\" (UID: \"37a5ab35-b3f0-4065-bcbe-5c784dd6d02c\") " pod="openstack/placement-7ff445bb86-z9hcr"
Feb 02 10:56:27 crc kubenswrapper[4901]: I0202 10:56:27.008907 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4brcx\" (UniqueName: \"kubernetes.io/projected/37a5ab35-b3f0-4065-bcbe-5c784dd6d02c-kube-api-access-4brcx\") pod \"placement-7ff445bb86-z9hcr\" (UID: \"37a5ab35-b3f0-4065-bcbe-5c784dd6d02c\") " pod="openstack/placement-7ff445bb86-z9hcr"
Feb 02 10:56:27 crc kubenswrapper[4901]: I0202 10:56:27.008965 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37a5ab35-b3f0-4065-bcbe-5c784dd6d02c-logs\") pod \"placement-7ff445bb86-z9hcr\" (UID: \"37a5ab35-b3f0-4065-bcbe-5c784dd6d02c\") " pod="openstack/placement-7ff445bb86-z9hcr"
Feb 02 10:56:27 crc kubenswrapper[4901]: I0202 10:56:27.009009 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37a5ab35-b3f0-4065-bcbe-5c784dd6d02c-config-data\")
pod \"placement-7ff445bb86-z9hcr\" (UID: \"37a5ab35-b3f0-4065-bcbe-5c784dd6d02c\") " pod="openstack/placement-7ff445bb86-z9hcr" Feb 02 10:56:27 crc kubenswrapper[4901]: I0202 10:56:27.009056 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37a5ab35-b3f0-4065-bcbe-5c784dd6d02c-scripts\") pod \"placement-7ff445bb86-z9hcr\" (UID: \"37a5ab35-b3f0-4065-bcbe-5c784dd6d02c\") " pod="openstack/placement-7ff445bb86-z9hcr" Feb 02 10:56:27 crc kubenswrapper[4901]: I0202 10:56:27.009085 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37a5ab35-b3f0-4065-bcbe-5c784dd6d02c-internal-tls-certs\") pod \"placement-7ff445bb86-z9hcr\" (UID: \"37a5ab35-b3f0-4065-bcbe-5c784dd6d02c\") " pod="openstack/placement-7ff445bb86-z9hcr" Feb 02 10:56:27 crc kubenswrapper[4901]: I0202 10:56:27.111211 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37a5ab35-b3f0-4065-bcbe-5c784dd6d02c-combined-ca-bundle\") pod \"placement-7ff445bb86-z9hcr\" (UID: \"37a5ab35-b3f0-4065-bcbe-5c784dd6d02c\") " pod="openstack/placement-7ff445bb86-z9hcr" Feb 02 10:56:27 crc kubenswrapper[4901]: I0202 10:56:27.111263 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37a5ab35-b3f0-4065-bcbe-5c784dd6d02c-public-tls-certs\") pod \"placement-7ff445bb86-z9hcr\" (UID: \"37a5ab35-b3f0-4065-bcbe-5c784dd6d02c\") " pod="openstack/placement-7ff445bb86-z9hcr" Feb 02 10:56:27 crc kubenswrapper[4901]: I0202 10:56:27.111288 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4brcx\" (UniqueName: \"kubernetes.io/projected/37a5ab35-b3f0-4065-bcbe-5c784dd6d02c-kube-api-access-4brcx\") pod \"placement-7ff445bb86-z9hcr\" (UID: \"37a5ab35-b3f0-4065-bcbe-5c784dd6d02c\") " pod="openstack/placement-7ff445bb86-z9hcr" Feb 02 10:56:27 crc kubenswrapper[4901]: I0202 10:56:27.111327 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37a5ab35-b3f0-4065-bcbe-5c784dd6d02c-logs\") pod \"placement-7ff445bb86-z9hcr\" (UID: \"37a5ab35-b3f0-4065-bcbe-5c784dd6d02c\") " pod="openstack/placement-7ff445bb86-z9hcr" Feb 02 10:56:27 crc kubenswrapper[4901]: I0202 10:56:27.111371 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37a5ab35-b3f0-4065-bcbe-5c784dd6d02c-config-data\") pod \"placement-7ff445bb86-z9hcr\" (UID: \"37a5ab35-b3f0-4065-bcbe-5c784dd6d02c\") " pod="openstack/placement-7ff445bb86-z9hcr" Feb 02 10:56:27 crc kubenswrapper[4901]: I0202 10:56:27.111399 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37a5ab35-b3f0-4065-bcbe-5c784dd6d02c-scripts\") pod \"placement-7ff445bb86-z9hcr\" (UID: \"37a5ab35-b3f0-4065-bcbe-5c784dd6d02c\") " pod="openstack/placement-7ff445bb86-z9hcr" Feb 02 10:56:27 crc kubenswrapper[4901]: I0202 10:56:27.111510 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37a5ab35-b3f0-4065-bcbe-5c784dd6d02c-internal-tls-certs\") pod \"placement-7ff445bb86-z9hcr\" (UID: \"37a5ab35-b3f0-4065-bcbe-5c784dd6d02c\") " 
pod="openstack/placement-7ff445bb86-z9hcr" Feb 02 10:56:27 crc kubenswrapper[4901]: I0202 10:56:27.115261 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37a5ab35-b3f0-4065-bcbe-5c784dd6d02c-logs\") pod \"placement-7ff445bb86-z9hcr\" (UID: \"37a5ab35-b3f0-4065-bcbe-5c784dd6d02c\") " pod="openstack/placement-7ff445bb86-z9hcr" Feb 02 10:56:27 crc kubenswrapper[4901]: I0202 10:56:27.138690 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37a5ab35-b3f0-4065-bcbe-5c784dd6d02c-public-tls-certs\") pod \"placement-7ff445bb86-z9hcr\" (UID: \"37a5ab35-b3f0-4065-bcbe-5c784dd6d02c\") " pod="openstack/placement-7ff445bb86-z9hcr" Feb 02 10:56:27 crc kubenswrapper[4901]: I0202 10:56:27.141541 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37a5ab35-b3f0-4065-bcbe-5c784dd6d02c-internal-tls-certs\") pod \"placement-7ff445bb86-z9hcr\" (UID: \"37a5ab35-b3f0-4065-bcbe-5c784dd6d02c\") " pod="openstack/placement-7ff445bb86-z9hcr" Feb 02 10:56:27 crc kubenswrapper[4901]: I0202 10:56:27.143363 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37a5ab35-b3f0-4065-bcbe-5c784dd6d02c-config-data\") pod \"placement-7ff445bb86-z9hcr\" (UID: \"37a5ab35-b3f0-4065-bcbe-5c784dd6d02c\") " pod="openstack/placement-7ff445bb86-z9hcr" Feb 02 10:56:27 crc kubenswrapper[4901]: I0202 10:56:27.144077 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4brcx\" (UniqueName: \"kubernetes.io/projected/37a5ab35-b3f0-4065-bcbe-5c784dd6d02c-kube-api-access-4brcx\") pod \"placement-7ff445bb86-z9hcr\" (UID: \"37a5ab35-b3f0-4065-bcbe-5c784dd6d02c\") " pod="openstack/placement-7ff445bb86-z9hcr" Feb 02 10:56:27 crc kubenswrapper[4901]: I0202 10:56:27.144891 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37a5ab35-b3f0-4065-bcbe-5c784dd6d02c-scripts\") pod \"placement-7ff445bb86-z9hcr\" (UID: \"37a5ab35-b3f0-4065-bcbe-5c784dd6d02c\") " pod="openstack/placement-7ff445bb86-z9hcr" Feb 02 10:56:27 crc kubenswrapper[4901]: I0202 10:56:27.145268 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37a5ab35-b3f0-4065-bcbe-5c784dd6d02c-combined-ca-bundle\") pod \"placement-7ff445bb86-z9hcr\" (UID: \"37a5ab35-b3f0-4065-bcbe-5c784dd6d02c\") " pod="openstack/placement-7ff445bb86-z9hcr" Feb 02 10:56:27 crc kubenswrapper[4901]: I0202 10:56:27.271612 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7ff445bb86-z9hcr" Feb 02 10:56:27 crc kubenswrapper[4901]: I0202 10:56:27.429211 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-hcpj4" event={"ID":"cccec66b-6bb1-4799-9385-73a33d1cacec","Type":"ContainerStarted","Data":"ed61e656b80a17f0b5b5f0d5c3a178ca9ce7d41172db50d2e265e1c1b96489f2"} Feb 02 10:56:27 crc kubenswrapper[4901]: I0202 10:56:27.437811 4901 generic.go:334] "Generic (PLEG): container finished" podID="f8829fb9-be39-448c-9f96-cfc98534248a" containerID="1c8fd39bee50f2445e992bbd4a36665ce4181637e5c8385b6dea1225f6abc6a7" exitCode=0 Feb 02 10:56:27 crc kubenswrapper[4901]: I0202 10:56:27.437913 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-ccq56" event={"ID":"f8829fb9-be39-448c-9f96-cfc98534248a","Type":"ContainerDied","Data":"1c8fd39bee50f2445e992bbd4a36665ce4181637e5c8385b6dea1225f6abc6a7"} Feb 02 10:56:27 crc kubenswrapper[4901]: I0202 10:56:27.457408 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5fd59b56d-fwr5x" event={"ID":"46bccada-d49d-4bed-a7c7-e3ea2a64414a","Type":"ContainerStarted","Data":"92f4b64beca26fb4cbf6f403b155919544c80c90fc783ff5fda303528cd05689"} Feb 02 10:56:27 crc kubenswrapper[4901]: I0202 10:56:27.457897 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5fd59b56d-fwr5x" event={"ID":"46bccada-d49d-4bed-a7c7-e3ea2a64414a","Type":"ContainerStarted","Data":"cd83b9c155cc2efa77b419230cf3db17fa7ffe3db4d8ee0298a64656096e11ae"} Feb 02 10:56:27 crc kubenswrapper[4901]: I0202 10:56:27.468322 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-hcpj4" podStartSLOduration=5.666291099 podStartE2EDuration="47.468304621s" podCreationTimestamp="2026-02-02 10:55:40 +0000 UTC" firstStartedPulling="2026-02-02 10:55:44.863810784 +0000 UTC m=+1031.882150880" lastFinishedPulling="2026-02-02 10:56:26.665824306 +0000 UTC m=+1073.684164402" observedRunningTime="2026-02-02 10:56:27.462088685 +0000 UTC m=+1074.480428791" watchObservedRunningTime="2026-02-02 10:56:27.468304621 +0000 UTC m=+1074.486644717" Feb 02 10:56:27 crc kubenswrapper[4901]: I0202 10:56:27.622667 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5db554c9b9-tv2zt"] Feb 02 10:56:27 crc kubenswrapper[4901]: I0202 10:56:27.658173 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-ccq56" Feb 02 10:56:27 crc kubenswrapper[4901]: I0202 10:56:27.744740 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f8829fb9-be39-448c-9f96-cfc98534248a-ovsdbserver-nb\") pod \"f8829fb9-be39-448c-9f96-cfc98534248a\" (UID: \"f8829fb9-be39-448c-9f96-cfc98534248a\") " Feb 02 10:56:27 crc kubenswrapper[4901]: I0202 10:56:27.745130 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f8829fb9-be39-448c-9f96-cfc98534248a-ovsdbserver-sb\") pod \"f8829fb9-be39-448c-9f96-cfc98534248a\" (UID: \"f8829fb9-be39-448c-9f96-cfc98534248a\") " Feb 02 10:56:27 crc kubenswrapper[4901]: I0202 10:56:27.745186 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlcnx\" (UniqueName: \"kubernetes.io/projected/f8829fb9-be39-448c-9f96-cfc98534248a-kube-api-access-rlcnx\") pod \"f8829fb9-be39-448c-9f96-cfc98534248a\" (UID: \"f8829fb9-be39-448c-9f96-cfc98534248a\") " Feb 02 10:56:27 crc kubenswrapper[4901]: I0202 10:56:27.745209 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8829fb9-be39-448c-9f96-cfc98534248a-dns-svc\") pod \"f8829fb9-be39-448c-9f96-cfc98534248a\" (UID: \"f8829fb9-be39-448c-9f96-cfc98534248a\") " Feb 02 10:56:27 crc kubenswrapper[4901]: I0202 10:56:27.745268 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f8829fb9-be39-448c-9f96-cfc98534248a-dns-swift-storage-0\") pod \"f8829fb9-be39-448c-9f96-cfc98534248a\" (UID: \"f8829fb9-be39-448c-9f96-cfc98534248a\") " Feb 02 10:56:27 crc kubenswrapper[4901]: I0202 10:56:27.745341 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8829fb9-be39-448c-9f96-cfc98534248a-config\") pod \"f8829fb9-be39-448c-9f96-cfc98534248a\" (UID: \"f8829fb9-be39-448c-9f96-cfc98534248a\") " Feb 02 10:56:27 crc kubenswrapper[4901]: I0202 10:56:27.776851 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8829fb9-be39-448c-9f96-cfc98534248a-kube-api-access-rlcnx" (OuterVolumeSpecName: "kube-api-access-rlcnx") pod "f8829fb9-be39-448c-9f96-cfc98534248a" (UID: "f8829fb9-be39-448c-9f96-cfc98534248a"). InnerVolumeSpecName "kube-api-access-rlcnx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:56:27 crc kubenswrapper[4901]: I0202 10:56:27.786616 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-8474dc9c5d-ld7wj"] Feb 02 10:56:27 crc kubenswrapper[4901]: I0202 10:56:27.863291 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7d7d985874-pzxvf"] Feb 02 10:56:27 crc kubenswrapper[4901]: I0202 10:56:27.873350 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlcnx\" (UniqueName: \"kubernetes.io/projected/f8829fb9-be39-448c-9f96-cfc98534248a-kube-api-access-rlcnx\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:28 crc kubenswrapper[4901]: I0202 10:56:28.166409 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8829fb9-be39-448c-9f96-cfc98534248a-config" (OuterVolumeSpecName: "config") pod "f8829fb9-be39-448c-9f96-cfc98534248a" (UID: "f8829fb9-be39-448c-9f96-cfc98534248a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:56:28 crc kubenswrapper[4901]: I0202 10:56:28.183002 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8829fb9-be39-448c-9f96-cfc98534248a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f8829fb9-be39-448c-9f96-cfc98534248a" (UID: "f8829fb9-be39-448c-9f96-cfc98534248a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:56:28 crc kubenswrapper[4901]: I0202 10:56:28.185297 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-vk7th"] Feb 02 10:56:28 crc kubenswrapper[4901]: I0202 10:56:28.185372 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f8829fb9-be39-448c-9f96-cfc98534248a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:28 crc kubenswrapper[4901]: I0202 10:56:28.185396 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8829fb9-be39-448c-9f96-cfc98534248a-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:28 crc kubenswrapper[4901]: I0202 10:56:28.198653 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-b6f6785c4-59xw6"] Feb 02 10:56:28 crc kubenswrapper[4901]: I0202 10:56:28.200295 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8829fb9-be39-448c-9f96-cfc98534248a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f8829fb9-be39-448c-9f96-cfc98534248a" (UID: "f8829fb9-be39-448c-9f96-cfc98534248a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:56:28 crc kubenswrapper[4901]: I0202 10:56:28.213701 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7b4d58f5d-kkj8n"] Feb 02 10:56:28 crc kubenswrapper[4901]: I0202 10:56:28.224716 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-558d599c79-dct6d"] Feb 02 10:56:28 crc kubenswrapper[4901]: I0202 10:56:28.225966 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8829fb9-be39-448c-9f96-cfc98534248a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f8829fb9-be39-448c-9f96-cfc98534248a" (UID: "f8829fb9-be39-448c-9f96-cfc98534248a"). 
InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:56:28 crc kubenswrapper[4901]: I0202 10:56:28.228222 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8829fb9-be39-448c-9f96-cfc98534248a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f8829fb9-be39-448c-9f96-cfc98534248a" (UID: "f8829fb9-be39-448c-9f96-cfc98534248a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:56:28 crc kubenswrapper[4901]: I0202 10:56:28.229235 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7ff445bb86-z9hcr"] Feb 02 10:56:28 crc kubenswrapper[4901]: W0202 10:56:28.229309 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06feef03_ccd0_4fa8_8939_852df217d3a1.slice/crio-4dbb609ebaa299a4e43916e94be6fb6bd63d03b345698adc9669f7067825ce47 WatchSource:0}: Error finding container 4dbb609ebaa299a4e43916e94be6fb6bd63d03b345698adc9669f7067825ce47: Status 404 returned error can't find the container with id 4dbb609ebaa299a4e43916e94be6fb6bd63d03b345698adc9669f7067825ce47 Feb 02 10:56:28 crc kubenswrapper[4901]: W0202 10:56:28.239121 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1644a512_eb6b_4023_9961_f42ff4bdbbe6.slice/crio-5fb8ac04162696926e6bc7d5808b8a2ee8f90b563758c1ff2e6b1c357491234e WatchSource:0}: Error finding container 5fb8ac04162696926e6bc7d5808b8a2ee8f90b563758c1ff2e6b1c357491234e: Status 404 returned error can't find the container with id 5fb8ac04162696926e6bc7d5808b8a2ee8f90b563758c1ff2e6b1c357491234e Feb 02 10:56:28 crc kubenswrapper[4901]: W0202 10:56:28.245736 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c1d42a9_290c_4bff_b0cb_89507651dae4.slice/crio-8de865779c0cbf0f76b9aa15503bb0f8db59f29b1eb7624277601e378d1519b4 WatchSource:0}: Error finding container 8de865779c0cbf0f76b9aa15503bb0f8db59f29b1eb7624277601e378d1519b4: Status 404 returned error can't find the container with id 8de865779c0cbf0f76b9aa15503bb0f8db59f29b1eb7624277601e378d1519b4 Feb 02 10:56:28 crc kubenswrapper[4901]: I0202 10:56:28.287311 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f8829fb9-be39-448c-9f96-cfc98534248a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:28 crc kubenswrapper[4901]: I0202 10:56:28.287356 4901 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8829fb9-be39-448c-9f96-cfc98534248a-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:28 crc kubenswrapper[4901]: I0202 10:56:28.287373 4901 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f8829fb9-be39-448c-9f96-cfc98534248a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:28 crc kubenswrapper[4901]: I0202 10:56:28.509722 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7ff445bb86-z9hcr" event={"ID":"37a5ab35-b3f0-4065-bcbe-5c784dd6d02c","Type":"ContainerStarted","Data":"9aaf29b40fc4dd2aae10ed78633a613fafb9a6091fb1eba271b888f36f786205"} Feb 02 10:56:28 crc kubenswrapper[4901]: I0202 10:56:28.566014 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-6b7b667979-ccq56" event={"ID":"f8829fb9-be39-448c-9f96-cfc98534248a","Type":"ContainerDied","Data":"a4a2c6c99afc5c27df2d2575151d3e6ab9f6fe620c539f8931f69c934dd97e63"} Feb 02 10:56:28 crc kubenswrapper[4901]: I0202 10:56:28.566486 4901 scope.go:117] "RemoveContainer" containerID="1c8fd39bee50f2445e992bbd4a36665ce4181637e5c8385b6dea1225f6abc6a7" Feb 02 10:56:28 crc kubenswrapper[4901]: I0202 10:56:28.566106 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-ccq56" Feb 02 10:56:28 crc kubenswrapper[4901]: I0202 10:56:28.619314 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5fd59b56d-fwr5x" event={"ID":"46bccada-d49d-4bed-a7c7-e3ea2a64414a","Type":"ContainerStarted","Data":"7e48bf1328dc8a5967b5be08ecf501d44f4690f79365d610dff488421d28d45c"} Feb 02 10:56:28 crc kubenswrapper[4901]: I0202 10:56:28.619611 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5fd59b56d-fwr5x" Feb 02 10:56:28 crc kubenswrapper[4901]: I0202 10:56:28.619847 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5fd59b56d-fwr5x" Feb 02 10:56:28 crc kubenswrapper[4901]: I0202 10:56:28.630488 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-558d599c79-dct6d" event={"ID":"1644a512-eb6b-4023-9961-f42ff4bdbbe6","Type":"ContainerStarted","Data":"5fb8ac04162696926e6bc7d5808b8a2ee8f90b563758c1ff2e6b1c357491234e"} Feb 02 10:56:28 crc kubenswrapper[4901]: I0202 10:56:28.638466 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5db554c9b9-tv2zt" event={"ID":"32596207-e054-472e-bd8b-9d4bdc150142","Type":"ContainerStarted","Data":"9bbdffbb00bfcdcb96c52ae8eb1129593f7c80e78e2d635b9af82caa867fb803"} Feb 02 10:56:28 crc kubenswrapper[4901]: I0202 10:56:28.648879 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5fd59b56d-fwr5x" podStartSLOduration=3.648862825 podStartE2EDuration="3.648862825s" podCreationTimestamp="2026-02-02 10:56:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:56:28.646001243 +0000 UTC m=+1075.664341349" watchObservedRunningTime="2026-02-02 10:56:28.648862825 +0000 UTC m=+1075.667202921" Feb 02 10:56:28 crc kubenswrapper[4901]: I0202 10:56:28.660971 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8474dc9c5d-ld7wj" event={"ID":"1f63f610-92a5-478b-af86-f103565142f0","Type":"ContainerStarted","Data":"dd31a321686489458a68e134f1e485225e3d1ed31b20d173cc78521cb64e0c81"} Feb 02 10:56:28 crc kubenswrapper[4901]: I0202 10:56:28.678452 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b6f6785c4-59xw6" event={"ID":"aa18cb0b-f1e0-4a57-90e0-daeede8334de","Type":"ContainerStarted","Data":"be7217ed0843fe1b0dd834487935a7849f3f9277fa0672cce1b79674f0f7b710"} Feb 02 10:56:28 crc kubenswrapper[4901]: I0202 10:56:28.686947 4901 scope.go:117] "RemoveContainer" containerID="16ff4ede99b1b57c3e05140e58964789873adf3691a6320ab695e43ac3b48ae1" Feb 02 10:56:28 crc kubenswrapper[4901]: I0202 10:56:28.691397 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7b4d58f5d-kkj8n" 
event={"ID":"7c1d42a9-290c-4bff-b0cb-89507651dae4","Type":"ContainerStarted","Data":"8de865779c0cbf0f76b9aa15503bb0f8db59f29b1eb7624277601e378d1519b4"} Feb 02 10:56:28 crc kubenswrapper[4901]: I0202 10:56:28.692113 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-ccq56"] Feb 02 10:56:28 crc kubenswrapper[4901]: I0202 10:56:28.703775 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-vk7th" event={"ID":"06feef03-ccd0-4fa8-8939-852df217d3a1","Type":"ContainerStarted","Data":"4dbb609ebaa299a4e43916e94be6fb6bd63d03b345698adc9669f7067825ce47"} Feb 02 10:56:28 crc kubenswrapper[4901]: I0202 10:56:28.717464 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-ccq56"] Feb 02 10:56:28 crc kubenswrapper[4901]: I0202 10:56:28.732821 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7d7d985874-pzxvf" event={"ID":"9c918cc4-c647-4e53-8800-27ea182ea861","Type":"ContainerStarted","Data":"22b2bc9b46325d0546d37ce5a5253b10bb05c2dd2b12977ee2ebb541da036c92"} Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.395000 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.395415 4901 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.581423 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5964cf7bcd-rmmzk"] Feb 02 10:56:29 crc kubenswrapper[4901]: E0202 10:56:29.583025 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8829fb9-be39-448c-9f96-cfc98534248a" containerName="dnsmasq-dns" Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.583116 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8829fb9-be39-448c-9f96-cfc98534248a" containerName="dnsmasq-dns" Feb 02 10:56:29 crc kubenswrapper[4901]: E0202 10:56:29.583212 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8829fb9-be39-448c-9f96-cfc98534248a" containerName="init" Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.583265 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8829fb9-be39-448c-9f96-cfc98534248a" containerName="init" Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.587462 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8829fb9-be39-448c-9f96-cfc98534248a" containerName="dnsmasq-dns" Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.588817 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5964cf7bcd-rmmzk" Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.600347 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.600767 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.630259 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5964cf7bcd-rmmzk"] Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.744776 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8829fb9-be39-448c-9f96-cfc98534248a" path="/var/lib/kubelet/pods/f8829fb9-be39-448c-9f96-cfc98534248a/volumes" Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.744777 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb942871-a2e5-420e-a570-962659f75886-public-tls-certs\") pod \"barbican-api-5964cf7bcd-rmmzk\" (UID: \"fb942871-a2e5-420e-a570-962659f75886\") " pod="openstack/barbican-api-5964cf7bcd-rmmzk" Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.744837 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb942871-a2e5-420e-a570-962659f75886-combined-ca-bundle\") pod \"barbican-api-5964cf7bcd-rmmzk\" (UID: \"fb942871-a2e5-420e-a570-962659f75886\") " pod="openstack/barbican-api-5964cf7bcd-rmmzk" Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.744860 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb942871-a2e5-420e-a570-962659f75886-config-data\") pod \"barbican-api-5964cf7bcd-rmmzk\" (UID: \"fb942871-a2e5-420e-a570-962659f75886\") " pod="openstack/barbican-api-5964cf7bcd-rmmzk" Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.744907 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb942871-a2e5-420e-a570-962659f75886-internal-tls-certs\") pod \"barbican-api-5964cf7bcd-rmmzk\" (UID: \"fb942871-a2e5-420e-a570-962659f75886\") " pod="openstack/barbican-api-5964cf7bcd-rmmzk" Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.744968 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb942871-a2e5-420e-a570-962659f75886-logs\") pod \"barbican-api-5964cf7bcd-rmmzk\" (UID: \"fb942871-a2e5-420e-a570-962659f75886\") " pod="openstack/barbican-api-5964cf7bcd-rmmzk" Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.744993 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrfdb\" (UniqueName: \"kubernetes.io/projected/fb942871-a2e5-420e-a570-962659f75886-kube-api-access-nrfdb\") pod \"barbican-api-5964cf7bcd-rmmzk\" (UID: \"fb942871-a2e5-420e-a570-962659f75886\") " pod="openstack/barbican-api-5964cf7bcd-rmmzk" Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.745039 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/fb942871-a2e5-420e-a570-962659f75886-config-data-custom\") pod \"barbican-api-5964cf7bcd-rmmzk\" (UID: \"fb942871-a2e5-420e-a570-962659f75886\") " pod="openstack/barbican-api-5964cf7bcd-rmmzk" Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.747372 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.835354 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7ff445bb86-z9hcr" event={"ID":"37a5ab35-b3f0-4065-bcbe-5c784dd6d02c","Type":"ContainerStarted","Data":"53588e9ca14050605bf7617dcc15cf9d9fb18fa5bee3587940df101dee1d6dc3"} Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.835436 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7ff445bb86-z9hcr" event={"ID":"37a5ab35-b3f0-4065-bcbe-5c784dd6d02c","Type":"ContainerStarted","Data":"05ce9ca5af5190b4d3e5ec2707d582863ea997e070a605b30aaa4a93e00b2d68"} Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.835548 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7ff445bb86-z9hcr" Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.849221 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb942871-a2e5-420e-a570-962659f75886-logs\") pod \"barbican-api-5964cf7bcd-rmmzk\" (UID: \"fb942871-a2e5-420e-a570-962659f75886\") " pod="openstack/barbican-api-5964cf7bcd-rmmzk" Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.849341 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrfdb\" (UniqueName: \"kubernetes.io/projected/fb942871-a2e5-420e-a570-962659f75886-kube-api-access-nrfdb\") pod \"barbican-api-5964cf7bcd-rmmzk\" (UID: \"fb942871-a2e5-420e-a570-962659f75886\") " pod="openstack/barbican-api-5964cf7bcd-rmmzk" Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.849437 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fb942871-a2e5-420e-a570-962659f75886-config-data-custom\") pod \"barbican-api-5964cf7bcd-rmmzk\" (UID: \"fb942871-a2e5-420e-a570-962659f75886\") " pod="openstack/barbican-api-5964cf7bcd-rmmzk" Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.849499 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb942871-a2e5-420e-a570-962659f75886-public-tls-certs\") pod \"barbican-api-5964cf7bcd-rmmzk\" (UID: \"fb942871-a2e5-420e-a570-962659f75886\") " pod="openstack/barbican-api-5964cf7bcd-rmmzk" Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.849540 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb942871-a2e5-420e-a570-962659f75886-combined-ca-bundle\") pod \"barbican-api-5964cf7bcd-rmmzk\" (UID: \"fb942871-a2e5-420e-a570-962659f75886\") " pod="openstack/barbican-api-5964cf7bcd-rmmzk" Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.849595 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb942871-a2e5-420e-a570-962659f75886-config-data\") pod \"barbican-api-5964cf7bcd-rmmzk\" (UID: \"fb942871-a2e5-420e-a570-962659f75886\") " pod="openstack/barbican-api-5964cf7bcd-rmmzk" Feb 02 
10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.849713 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb942871-a2e5-420e-a570-962659f75886-internal-tls-certs\") pod \"barbican-api-5964cf7bcd-rmmzk\" (UID: \"fb942871-a2e5-420e-a570-962659f75886\") " pod="openstack/barbican-api-5964cf7bcd-rmmzk" Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.853397 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb942871-a2e5-420e-a570-962659f75886-logs\") pod \"barbican-api-5964cf7bcd-rmmzk\" (UID: \"fb942871-a2e5-420e-a570-962659f75886\") " pod="openstack/barbican-api-5964cf7bcd-rmmzk" Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.857115 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb942871-a2e5-420e-a570-962659f75886-combined-ca-bundle\") pod \"barbican-api-5964cf7bcd-rmmzk\" (UID: \"fb942871-a2e5-420e-a570-962659f75886\") " pod="openstack/barbican-api-5964cf7bcd-rmmzk" Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.858262 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fb942871-a2e5-420e-a570-962659f75886-config-data-custom\") pod \"barbican-api-5964cf7bcd-rmmzk\" (UID: \"fb942871-a2e5-420e-a570-962659f75886\") " pod="openstack/barbican-api-5964cf7bcd-rmmzk" Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.858464 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb942871-a2e5-420e-a570-962659f75886-config-data\") pod \"barbican-api-5964cf7bcd-rmmzk\" (UID: \"fb942871-a2e5-420e-a570-962659f75886\") " pod="openstack/barbican-api-5964cf7bcd-rmmzk" Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.859276 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb942871-a2e5-420e-a570-962659f75886-internal-tls-certs\") pod \"barbican-api-5964cf7bcd-rmmzk\" (UID: \"fb942871-a2e5-420e-a570-962659f75886\") " pod="openstack/barbican-api-5964cf7bcd-rmmzk" Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.866937 4901 generic.go:334] "Generic (PLEG): container finished" podID="06feef03-ccd0-4fa8-8939-852df217d3a1" containerID="550b2b7a89c4b023682eb4dc2ca89c79224934f33d161e90f0ecb685dc7b303f" exitCode=0 Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.867045 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-vk7th" event={"ID":"06feef03-ccd0-4fa8-8939-852df217d3a1","Type":"ContainerDied","Data":"550b2b7a89c4b023682eb4dc2ca89c79224934f33d161e90f0ecb685dc7b303f"} Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.881110 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrfdb\" (UniqueName: \"kubernetes.io/projected/fb942871-a2e5-420e-a570-962659f75886-kube-api-access-nrfdb\") pod \"barbican-api-5964cf7bcd-rmmzk\" (UID: \"fb942871-a2e5-420e-a570-962659f75886\") " pod="openstack/barbican-api-5964cf7bcd-rmmzk" Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.882298 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb942871-a2e5-420e-a570-962659f75886-public-tls-certs\") pod \"barbican-api-5964cf7bcd-rmmzk\" (UID: 
\"fb942871-a2e5-420e-a570-962659f75886\") " pod="openstack/barbican-api-5964cf7bcd-rmmzk" Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.891106 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7d7d985874-pzxvf" event={"ID":"9c918cc4-c647-4e53-8800-27ea182ea861","Type":"ContainerStarted","Data":"1d6a3d48b187356770a7dc42d21d979ba33886ac8702d459b4db8269da7a1698"} Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.891751 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7d7d985874-pzxvf" Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.918046 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7ff445bb86-z9hcr" podStartSLOduration=3.918024622 podStartE2EDuration="3.918024622s" podCreationTimestamp="2026-02-02 10:56:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:56:29.883316436 +0000 UTC m=+1076.901656552" watchObservedRunningTime="2026-02-02 10:56:29.918024622 +0000 UTC m=+1076.936364718" Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.935264 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5964cf7bcd-rmmzk" Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.939339 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b6f6785c4-59xw6" event={"ID":"aa18cb0b-f1e0-4a57-90e0-daeede8334de","Type":"ContainerStarted","Data":"545ec88fcdea2c4cf0f01ec1c6abe42c1b30161f98a6a90046a26b78b265743c"} Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.939368 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b6f6785c4-59xw6" event={"ID":"aa18cb0b-f1e0-4a57-90e0-daeede8334de","Type":"ContainerStarted","Data":"c992c90156378e0a9891a669fd7126376d08a9b1c60d0da42356ead9797d5b49"} Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.940836 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-b6f6785c4-59xw6" Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.940863 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-b6f6785c4-59xw6" Feb 02 10:56:29 crc kubenswrapper[4901]: I0202 10:56:29.976258 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7d7d985874-pzxvf" podStartSLOduration=4.976236568 podStartE2EDuration="4.976236568s" podCreationTimestamp="2026-02-02 10:56:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:56:29.934221388 +0000 UTC m=+1076.952561484" watchObservedRunningTime="2026-02-02 10:56:29.976236568 +0000 UTC m=+1076.994576654" Feb 02 10:56:30 crc kubenswrapper[4901]: I0202 10:56:30.001816 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-b6f6785c4-59xw6" podStartSLOduration=4.001792196 podStartE2EDuration="4.001792196s" podCreationTimestamp="2026-02-02 10:56:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:56:29.962296799 +0000 UTC m=+1076.980636895" watchObservedRunningTime="2026-02-02 10:56:30.001792196 +0000 UTC m=+1077.020132302" Feb 02 10:56:30 crc kubenswrapper[4901]: I0202 10:56:30.953616 4901 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/cinder-db-sync-2hz6j" event={"ID":"9dc5e40d-ef90-4040-a247-114b55e0efa1","Type":"ContainerStarted","Data":"0d4b36deec608eaf5895e2cca6089f88861ab4ce5d89ffe6b8e3debc06e6b9c8"} Feb 02 10:56:30 crc kubenswrapper[4901]: I0202 10:56:30.954160 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7ff445bb86-z9hcr" Feb 02 10:56:31 crc kubenswrapper[4901]: I0202 10:56:31.466657 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-2hz6j" podStartSLOduration=9.456260635 podStartE2EDuration="51.466417399s" podCreationTimestamp="2026-02-02 10:55:40 +0000 UTC" firstStartedPulling="2026-02-02 10:55:46.270014517 +0000 UTC m=+1033.288354613" lastFinishedPulling="2026-02-02 10:56:28.280171281 +0000 UTC m=+1075.298511377" observedRunningTime="2026-02-02 10:56:30.980828793 +0000 UTC m=+1077.999168959" watchObservedRunningTime="2026-02-02 10:56:31.466417399 +0000 UTC m=+1078.484757495" Feb 02 10:56:31 crc kubenswrapper[4901]: I0202 10:56:31.472174 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5964cf7bcd-rmmzk"] Feb 02 10:56:32 crc kubenswrapper[4901]: I0202 10:56:32.005508 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-558d599c79-dct6d" event={"ID":"1644a512-eb6b-4023-9961-f42ff4bdbbe6","Type":"ContainerStarted","Data":"1ec4d99af932f41271f28449e8f74f30cfd49506b55c74381ccedd475ea9b95f"} Feb 02 10:56:32 crc kubenswrapper[4901]: I0202 10:56:32.005980 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-558d599c79-dct6d" event={"ID":"1644a512-eb6b-4023-9961-f42ff4bdbbe6","Type":"ContainerStarted","Data":"a6a062d6b26d2202463051c2e70e02bd8e06b8ce3484f0cd748abe3553cf02b4"} Feb 02 10:56:32 crc kubenswrapper[4901]: I0202 10:56:32.010274 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7b4d58f5d-kkj8n" event={"ID":"7c1d42a9-290c-4bff-b0cb-89507651dae4","Type":"ContainerStarted","Data":"1b7597f105d538ae7a3df2ca012a533b82dab03fbd1f01f9664bae2e06ffb3b3"} Feb 02 10:56:32 crc kubenswrapper[4901]: I0202 10:56:32.010362 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7b4d58f5d-kkj8n" event={"ID":"7c1d42a9-290c-4bff-b0cb-89507651dae4","Type":"ContainerStarted","Data":"740d191211ab63c5eb305b00e671a23996e8a73e8dc67e25428b18a5c5339ca7"} Feb 02 10:56:32 crc kubenswrapper[4901]: I0202 10:56:32.032206 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-vk7th" event={"ID":"06feef03-ccd0-4fa8-8939-852df217d3a1","Type":"ContainerStarted","Data":"1abb2ef89c29c1834851bbf1fda361b639650d73cf9a4bab2f82d365be27be05"} Feb 02 10:56:32 crc kubenswrapper[4901]: I0202 10:56:32.033467 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-848cf88cfc-vk7th" Feb 02 10:56:32 crc kubenswrapper[4901]: I0202 10:56:32.040347 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5964cf7bcd-rmmzk" event={"ID":"fb942871-a2e5-420e-a570-962659f75886","Type":"ContainerStarted","Data":"742e08c2c8669f23257f2fef49fb28b122391383c08e6c6f497adb621cde03cd"} Feb 02 10:56:32 crc kubenswrapper[4901]: I0202 10:56:32.040412 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5964cf7bcd-rmmzk" 
event={"ID":"fb942871-a2e5-420e-a570-962659f75886","Type":"ContainerStarted","Data":"dc827d7e30eb923d14d9721a8ea722679f13bc738eeb5bf8b7251ef0e3d6e3b7"} Feb 02 10:56:32 crc kubenswrapper[4901]: I0202 10:56:32.052008 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5db554c9b9-tv2zt" event={"ID":"32596207-e054-472e-bd8b-9d4bdc150142","Type":"ContainerStarted","Data":"4cb5be31f0c092b4b97d463491c1887f651c85e8737b944ba784cb93e0c10f1b"} Feb 02 10:56:32 crc kubenswrapper[4901]: I0202 10:56:32.052066 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5db554c9b9-tv2zt" event={"ID":"32596207-e054-472e-bd8b-9d4bdc150142","Type":"ContainerStarted","Data":"f40c657600f9277f3ead415cce7750856c2985a14e85fdfd014d59e173f909ae"} Feb 02 10:56:32 crc kubenswrapper[4901]: I0202 10:56:32.064737 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-558d599c79-dct6d" podStartSLOduration=3.232956193 podStartE2EDuration="6.064698052s" podCreationTimestamp="2026-02-02 10:56:26 +0000 UTC" firstStartedPulling="2026-02-02 10:56:28.247035552 +0000 UTC m=+1075.265375648" lastFinishedPulling="2026-02-02 10:56:31.078777411 +0000 UTC m=+1078.097117507" observedRunningTime="2026-02-02 10:56:32.02743619 +0000 UTC m=+1079.045776286" watchObservedRunningTime="2026-02-02 10:56:32.064698052 +0000 UTC m=+1079.083038148" Feb 02 10:56:32 crc kubenswrapper[4901]: I0202 10:56:32.065703 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8474dc9c5d-ld7wj" event={"ID":"1f63f610-92a5-478b-af86-f103565142f0","Type":"ContainerStarted","Data":"20a8de42e391b2860c8c909855c7e8622bb94fda8c83981aee8c4b325806548b"} Feb 02 10:56:32 crc kubenswrapper[4901]: I0202 10:56:32.065747 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8474dc9c5d-ld7wj" event={"ID":"1f63f610-92a5-478b-af86-f103565142f0","Type":"ContainerStarted","Data":"7a5d1da6085ee4fc545644c663fe3ced96de950e29b578064bb1d6eb7975e48f"} Feb 02 10:56:32 crc kubenswrapper[4901]: I0202 10:56:32.103119 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7b4d58f5d-kkj8n" podStartSLOduration=3.31648578 podStartE2EDuration="6.103093501s" podCreationTimestamp="2026-02-02 10:56:26 +0000 UTC" firstStartedPulling="2026-02-02 10:56:28.251374991 +0000 UTC m=+1075.269715097" lastFinishedPulling="2026-02-02 10:56:31.037982722 +0000 UTC m=+1078.056322818" observedRunningTime="2026-02-02 10:56:32.054175548 +0000 UTC m=+1079.072515644" watchObservedRunningTime="2026-02-02 10:56:32.103093501 +0000 UTC m=+1079.121433597" Feb 02 10:56:32 crc kubenswrapper[4901]: I0202 10:56:32.120983 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-5db554c9b9-tv2zt"] Feb 02 10:56:32 crc kubenswrapper[4901]: I0202 10:56:32.128533 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5db554c9b9-tv2zt" podStartSLOduration=3.800840052 podStartE2EDuration="7.128504516s" podCreationTimestamp="2026-02-02 10:56:25 +0000 UTC" firstStartedPulling="2026-02-02 10:56:27.640992466 +0000 UTC m=+1074.659332562" lastFinishedPulling="2026-02-02 10:56:30.96865693 +0000 UTC m=+1077.986997026" observedRunningTime="2026-02-02 10:56:32.082929627 +0000 UTC m=+1079.101269723" watchObservedRunningTime="2026-02-02 10:56:32.128504516 +0000 UTC m=+1079.146844612" Feb 02 10:56:32 crc 
kubenswrapper[4901]: I0202 10:56:32.145104 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-848cf88cfc-vk7th" podStartSLOduration=7.14507955 podStartE2EDuration="7.14507955s" podCreationTimestamp="2026-02-02 10:56:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:56:32.10264755 +0000 UTC m=+1079.120987646" watchObservedRunningTime="2026-02-02 10:56:32.14507955 +0000 UTC m=+1079.163419646"
Feb 02 10:56:32 crc kubenswrapper[4901]: I0202 10:56:32.160309 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-8474dc9c5d-ld7wj"]
Feb 02 10:56:32 crc kubenswrapper[4901]: I0202 10:56:32.166418 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-8474dc9c5d-ld7wj" podStartSLOduration=3.981946838 podStartE2EDuration="7.166391663s" podCreationTimestamp="2026-02-02 10:56:25 +0000 UTC" firstStartedPulling="2026-02-02 10:56:27.868320367 +0000 UTC m=+1074.886660463" lastFinishedPulling="2026-02-02 10:56:31.052765192 +0000 UTC m=+1078.071105288" observedRunningTime="2026-02-02 10:56:32.135588433 +0000 UTC m=+1079.153928529" watchObservedRunningTime="2026-02-02 10:56:32.166391663 +0000 UTC m=+1079.184731759"
Feb 02 10:56:33 crc kubenswrapper[4901]: I0202 10:56:33.082894 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5964cf7bcd-rmmzk" event={"ID":"fb942871-a2e5-420e-a570-962659f75886","Type":"ContainerStarted","Data":"5dfbea57d57f211642054989795572d656caea81d464422ea204d5cee4decda6"}
Feb 02 10:56:33 crc kubenswrapper[4901]: I0202 10:56:33.125535 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5964cf7bcd-rmmzk" podStartSLOduration=4.125510772 podStartE2EDuration="4.125510772s" podCreationTimestamp="2026-02-02 10:56:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:56:33.118450656 +0000 UTC m=+1080.136790782" watchObservedRunningTime="2026-02-02 10:56:33.125510772 +0000 UTC m=+1080.143850888"
Feb 02 10:56:34 crc kubenswrapper[4901]: I0202 10:56:34.095960 4901 generic.go:334] "Generic (PLEG): container finished" podID="cccec66b-6bb1-4799-9385-73a33d1cacec" containerID="ed61e656b80a17f0b5b5f0d5c3a178ca9ce7d41172db50d2e265e1c1b96489f2" exitCode=0
Feb 02 10:56:34 crc kubenswrapper[4901]: I0202 10:56:34.096038 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-hcpj4" event={"ID":"cccec66b-6bb1-4799-9385-73a33d1cacec","Type":"ContainerDied","Data":"ed61e656b80a17f0b5b5f0d5c3a178ca9ce7d41172db50d2e265e1c1b96489f2"}
Feb 02 10:56:34 crc kubenswrapper[4901]: I0202 10:56:34.097028 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-5db554c9b9-tv2zt" podUID="32596207-e054-472e-bd8b-9d4bdc150142" containerName="barbican-worker-log" containerID="cri-o://f40c657600f9277f3ead415cce7750856c2985a14e85fdfd014d59e173f909ae" gracePeriod=30
Feb 02 10:56:34 crc kubenswrapper[4901]: I0202 10:56:34.097133 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-5db554c9b9-tv2zt" podUID="32596207-e054-472e-bd8b-9d4bdc150142" containerName="barbican-worker" containerID="cri-o://4cb5be31f0c092b4b97d463491c1887f651c85e8737b944ba784cb93e0c10f1b" gracePeriod=30
Feb 02 10:56:34 crc kubenswrapper[4901]: I0202 10:56:34.098054 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5964cf7bcd-rmmzk"
Feb 02 10:56:34 crc kubenswrapper[4901]: I0202 10:56:34.098075 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5964cf7bcd-rmmzk"
Feb 02 10:56:34 crc kubenswrapper[4901]: I0202 10:56:34.098167 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-8474dc9c5d-ld7wj" podUID="1f63f610-92a5-478b-af86-f103565142f0" containerName="barbican-keystone-listener-log" containerID="cri-o://7a5d1da6085ee4fc545644c663fe3ced96de950e29b578064bb1d6eb7975e48f" gracePeriod=30
Feb 02 10:56:34 crc kubenswrapper[4901]: I0202 10:56:34.098230 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-8474dc9c5d-ld7wj" podUID="1f63f610-92a5-478b-af86-f103565142f0" containerName="barbican-keystone-listener" containerID="cri-o://20a8de42e391b2860c8c909855c7e8622bb94fda8c83981aee8c4b325806548b" gracePeriod=30
Feb 02 10:56:35 crc kubenswrapper[4901]: I0202 10:56:35.112431 4901 generic.go:334] "Generic (PLEG): container finished" podID="32596207-e054-472e-bd8b-9d4bdc150142" containerID="4cb5be31f0c092b4b97d463491c1887f651c85e8737b944ba784cb93e0c10f1b" exitCode=0
Feb 02 10:56:35 crc kubenswrapper[4901]: I0202 10:56:35.112493 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5db554c9b9-tv2zt" event={"ID":"32596207-e054-472e-bd8b-9d4bdc150142","Type":"ContainerDied","Data":"4cb5be31f0c092b4b97d463491c1887f651c85e8737b944ba784cb93e0c10f1b"}
Feb 02 10:56:35 crc kubenswrapper[4901]: I0202 10:56:35.112517 4901 generic.go:334] "Generic (PLEG): container finished" podID="32596207-e054-472e-bd8b-9d4bdc150142" containerID="f40c657600f9277f3ead415cce7750856c2985a14e85fdfd014d59e173f909ae" exitCode=143
Feb 02 10:56:35 crc kubenswrapper[4901]: I0202 10:56:35.112544 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5db554c9b9-tv2zt" event={"ID":"32596207-e054-472e-bd8b-9d4bdc150142","Type":"ContainerDied","Data":"f40c657600f9277f3ead415cce7750856c2985a14e85fdfd014d59e173f909ae"}
Feb 02 10:56:35 crc kubenswrapper[4901]: I0202 10:56:35.114775 4901 generic.go:334] "Generic (PLEG): container finished" podID="9dc5e40d-ef90-4040-a247-114b55e0efa1" containerID="0d4b36deec608eaf5895e2cca6089f88861ab4ce5d89ffe6b8e3debc06e6b9c8" exitCode=0
Feb 02 10:56:35 crc kubenswrapper[4901]: I0202 10:56:35.114834 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-2hz6j" event={"ID":"9dc5e40d-ef90-4040-a247-114b55e0efa1","Type":"ContainerDied","Data":"0d4b36deec608eaf5895e2cca6089f88861ab4ce5d89ffe6b8e3debc06e6b9c8"}
Feb 02 10:56:35 crc kubenswrapper[4901]: I0202 10:56:35.117757 4901 generic.go:334] "Generic (PLEG): container finished" podID="1f63f610-92a5-478b-af86-f103565142f0" containerID="20a8de42e391b2860c8c909855c7e8622bb94fda8c83981aee8c4b325806548b" exitCode=0
Feb 02 10:56:35 crc kubenswrapper[4901]: I0202 10:56:35.117797 4901 generic.go:334] "Generic (PLEG): container finished" podID="1f63f610-92a5-478b-af86-f103565142f0" containerID="7a5d1da6085ee4fc545644c663fe3ced96de950e29b578064bb1d6eb7975e48f" exitCode=143
Feb 02 10:56:35 crc kubenswrapper[4901]: I0202 10:56:35.117857 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8474dc9c5d-ld7wj" event={"ID":"1f63f610-92a5-478b-af86-f103565142f0","Type":"ContainerDied","Data":"20a8de42e391b2860c8c909855c7e8622bb94fda8c83981aee8c4b325806548b"}
Feb 02 10:56:35 crc kubenswrapper[4901]: I0202 10:56:35.117937 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8474dc9c5d-ld7wj" event={"ID":"1f63f610-92a5-478b-af86-f103565142f0","Type":"ContainerDied","Data":"7a5d1da6085ee4fc545644c663fe3ced96de950e29b578064bb1d6eb7975e48f"}
Feb 02 10:56:36 crc kubenswrapper[4901]: I0202 10:56:36.728773 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-848cf88cfc-vk7th"
Feb 02 10:56:36 crc kubenswrapper[4901]: I0202 10:56:36.733291 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-hcpj4"
Feb 02 10:56:36 crc kubenswrapper[4901]: I0202 10:56:36.771836 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-2hz6j"
Feb 02 10:56:36 crc kubenswrapper[4901]: I0202 10:56:36.890012 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dc5e40d-ef90-4040-a247-114b55e0efa1-combined-ca-bundle\") pod \"9dc5e40d-ef90-4040-a247-114b55e0efa1\" (UID: \"9dc5e40d-ef90-4040-a247-114b55e0efa1\") "
Feb 02 10:56:36 crc kubenswrapper[4901]: I0202 10:56:36.890105 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cccec66b-6bb1-4799-9385-73a33d1cacec-combined-ca-bundle\") pod \"cccec66b-6bb1-4799-9385-73a33d1cacec\" (UID: \"cccec66b-6bb1-4799-9385-73a33d1cacec\") "
Feb 02 10:56:36 crc kubenswrapper[4901]: I0202 10:56:36.890138 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9dc5e40d-ef90-4040-a247-114b55e0efa1-db-sync-config-data\") pod \"9dc5e40d-ef90-4040-a247-114b55e0efa1\" (UID: \"9dc5e40d-ef90-4040-a247-114b55e0efa1\") "
Feb 02 10:56:36 crc kubenswrapper[4901]: I0202 10:56:36.890244 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9dc5e40d-ef90-4040-a247-114b55e0efa1-etc-machine-id\") pod \"9dc5e40d-ef90-4040-a247-114b55e0efa1\" (UID: \"9dc5e40d-ef90-4040-a247-114b55e0efa1\") "
Feb 02 10:56:36 crc kubenswrapper[4901]: I0202 10:56:36.890279 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dc5e40d-ef90-4040-a247-114b55e0efa1-scripts\") pod \"9dc5e40d-ef90-4040-a247-114b55e0efa1\" (UID: \"9dc5e40d-ef90-4040-a247-114b55e0efa1\") "
Feb 02 10:56:36 crc kubenswrapper[4901]: I0202 10:56:36.890365 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbkdq\" (UniqueName: \"kubernetes.io/projected/9dc5e40d-ef90-4040-a247-114b55e0efa1-kube-api-access-gbkdq\") pod \"9dc5e40d-ef90-4040-a247-114b55e0efa1\" (UID: \"9dc5e40d-ef90-4040-a247-114b55e0efa1\") "
Feb 02 10:56:36 crc kubenswrapper[4901]: I0202 10:56:36.890636 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cccec66b-6bb1-4799-9385-73a33d1cacec-config-data\") pod \"cccec66b-6bb1-4799-9385-73a33d1cacec\" (UID: \"cccec66b-6bb1-4799-9385-73a33d1cacec\") "
Feb 02 10:56:36 crc kubenswrapper[4901]: I0202 10:56:36.890684 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dc5e40d-ef90-4040-a247-114b55e0efa1-config-data\") pod \"9dc5e40d-ef90-4040-a247-114b55e0efa1\" (UID: \"9dc5e40d-ef90-4040-a247-114b55e0efa1\") "
Feb 02 10:56:36 crc kubenswrapper[4901]: I0202 10:56:36.890708 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vs7wj\" (UniqueName: \"kubernetes.io/projected/cccec66b-6bb1-4799-9385-73a33d1cacec-kube-api-access-vs7wj\") pod \"cccec66b-6bb1-4799-9385-73a33d1cacec\" (UID: \"cccec66b-6bb1-4799-9385-73a33d1cacec\") "
Feb 02 10:56:36 crc kubenswrapper[4901]: I0202 10:56:36.896150 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9dc5e40d-ef90-4040-a247-114b55e0efa1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9dc5e40d-ef90-4040-a247-114b55e0efa1" (UID: "9dc5e40d-ef90-4040-a247-114b55e0efa1"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 02 10:56:36 crc kubenswrapper[4901]: I0202 10:56:36.913052 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dc5e40d-ef90-4040-a247-114b55e0efa1-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "9dc5e40d-ef90-4040-a247-114b55e0efa1" (UID: "9dc5e40d-ef90-4040-a247-114b55e0efa1"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:56:36 crc kubenswrapper[4901]: I0202 10:56:36.914841 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cccec66b-6bb1-4799-9385-73a33d1cacec-kube-api-access-vs7wj" (OuterVolumeSpecName: "kube-api-access-vs7wj") pod "cccec66b-6bb1-4799-9385-73a33d1cacec" (UID: "cccec66b-6bb1-4799-9385-73a33d1cacec"). InnerVolumeSpecName "kube-api-access-vs7wj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:56:36 crc kubenswrapper[4901]: I0202 10:56:36.914997 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dc5e40d-ef90-4040-a247-114b55e0efa1-scripts" (OuterVolumeSpecName: "scripts") pod "9dc5e40d-ef90-4040-a247-114b55e0efa1" (UID: "9dc5e40d-ef90-4040-a247-114b55e0efa1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:56:36 crc kubenswrapper[4901]: I0202 10:56:36.919808 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dc5e40d-ef90-4040-a247-114b55e0efa1-kube-api-access-gbkdq" (OuterVolumeSpecName: "kube-api-access-gbkdq") pod "9dc5e40d-ef90-4040-a247-114b55e0efa1" (UID: "9dc5e40d-ef90-4040-a247-114b55e0efa1"). InnerVolumeSpecName "kube-api-access-gbkdq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:56:36 crc kubenswrapper[4901]: I0202 10:56:36.928782 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-xnq2r"]
Feb 02 10:56:36 crc kubenswrapper[4901]: I0202 10:56:36.929132 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56df8fb6b7-xnq2r" podUID="92d9658c-9dc4-466c-b261-dba41f7418ae" containerName="dnsmasq-dns" containerID="cri-o://6b3b89ac056ad9f26282eb444590aecfab59a1f9dda66e66ec6ca767e24e2584" gracePeriod=10
Feb 02 10:56:36 crc kubenswrapper[4901]: I0202 10:56:36.975547 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dc5e40d-ef90-4040-a247-114b55e0efa1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9dc5e40d-ef90-4040-a247-114b55e0efa1" (UID: "9dc5e40d-ef90-4040-a247-114b55e0efa1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:56:36 crc kubenswrapper[4901]: I0202 10:56:36.984852 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cccec66b-6bb1-4799-9385-73a33d1cacec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cccec66b-6bb1-4799-9385-73a33d1cacec" (UID: "cccec66b-6bb1-4799-9385-73a33d1cacec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:56:36 crc kubenswrapper[4901]: I0202 10:56:36.995965 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vs7wj\" (UniqueName: \"kubernetes.io/projected/cccec66b-6bb1-4799-9385-73a33d1cacec-kube-api-access-vs7wj\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:36 crc kubenswrapper[4901]: I0202 10:56:36.996011 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dc5e40d-ef90-4040-a247-114b55e0efa1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:36 crc kubenswrapper[4901]: I0202 10:56:36.996022 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cccec66b-6bb1-4799-9385-73a33d1cacec-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:36 crc kubenswrapper[4901]: I0202 10:56:36.996031 4901 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9dc5e40d-ef90-4040-a247-114b55e0efa1-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:36 crc kubenswrapper[4901]: I0202 10:56:36.996040 4901 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9dc5e40d-ef90-4040-a247-114b55e0efa1-etc-machine-id\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:36 crc kubenswrapper[4901]: I0202 10:56:36.996052 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dc5e40d-ef90-4040-a247-114b55e0efa1-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:36 crc kubenswrapper[4901]: I0202 10:56:36.996063 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbkdq\" (UniqueName: \"kubernetes.io/projected/9dc5e40d-ef90-4040-a247-114b55e0efa1-kube-api-access-gbkdq\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.000842 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dc5e40d-ef90-4040-a247-114b55e0efa1-config-data" (OuterVolumeSpecName: "config-data") pod "9dc5e40d-ef90-4040-a247-114b55e0efa1" (UID: "9dc5e40d-ef90-4040-a247-114b55e0efa1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.029707 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cccec66b-6bb1-4799-9385-73a33d1cacec-config-data" (OuterVolumeSpecName: "config-data") pod "cccec66b-6bb1-4799-9385-73a33d1cacec" (UID: "cccec66b-6bb1-4799-9385-73a33d1cacec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.098698 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cccec66b-6bb1-4799-9385-73a33d1cacec-config-data\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.099246 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dc5e40d-ef90-4040-a247-114b55e0efa1-config-data\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.162594 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-hcpj4" event={"ID":"cccec66b-6bb1-4799-9385-73a33d1cacec","Type":"ContainerDied","Data":"a0011b3a125318df64e2ede5681a7b6ea74e74dd0b5f0ace3d187e256ace5241"}
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.162670 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0011b3a125318df64e2ede5681a7b6ea74e74dd0b5f0ace3d187e256ace5241"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.162763 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-hcpj4"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.166641 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-2hz6j" event={"ID":"9dc5e40d-ef90-4040-a247-114b55e0efa1","Type":"ContainerDied","Data":"b27295f11cc7d6ec09fe98abae9dc2a51d3ac0bc2653a0fd1919338470956ab8"}
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.166787 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b27295f11cc7d6ec09fe98abae9dc2a51d3ac0bc2653a0fd1919338470956ab8"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.166945 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-2hz6j"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.354935 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-56df8fb6b7-xnq2r" podUID="92d9658c-9dc4-466c-b261-dba41f7418ae" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.145:5353: connect: connection refused"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.455873 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 02 10:56:37 crc kubenswrapper[4901]: E0202 10:56:37.456762 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cccec66b-6bb1-4799-9385-73a33d1cacec" containerName="heat-db-sync"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.456780 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="cccec66b-6bb1-4799-9385-73a33d1cacec" containerName="heat-db-sync"
Feb 02 10:56:37 crc kubenswrapper[4901]: E0202 10:56:37.456809 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dc5e40d-ef90-4040-a247-114b55e0efa1" containerName="cinder-db-sync"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.456817 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dc5e40d-ef90-4040-a247-114b55e0efa1" containerName="cinder-db-sync"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.457040 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dc5e40d-ef90-4040-a247-114b55e0efa1" containerName="cinder-db-sync"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.457062 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="cccec66b-6bb1-4799-9385-73a33d1cacec" containerName="heat-db-sync"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.459227 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.470027 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.470318 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.470473 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-mzgkh"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.470779 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.511072 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.557117 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-pt7c2"]
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.559145 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-pt7c2"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.598803 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-pt7c2"]
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.611352 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e62ea67-c139-4682-8563-441d8b7aeae6-dns-svc\") pod \"dnsmasq-dns-6578955fd5-pt7c2\" (UID: \"5e62ea67-c139-4682-8563-441d8b7aeae6\") " pod="openstack/dnsmasq-dns-6578955fd5-pt7c2"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.611412 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e62ea67-c139-4682-8563-441d8b7aeae6-config\") pod \"dnsmasq-dns-6578955fd5-pt7c2\" (UID: \"5e62ea67-c139-4682-8563-441d8b7aeae6\") " pod="openstack/dnsmasq-dns-6578955fd5-pt7c2"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.611456 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e62ea67-c139-4682-8563-441d8b7aeae6-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-pt7c2\" (UID: \"5e62ea67-c139-4682-8563-441d8b7aeae6\") " pod="openstack/dnsmasq-dns-6578955fd5-pt7c2"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.611508 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5e62ea67-c139-4682-8563-441d8b7aeae6-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-pt7c2\" (UID: \"5e62ea67-c139-4682-8563-441d8b7aeae6\") " pod="openstack/dnsmasq-dns-6578955fd5-pt7c2"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.611524 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e62ea67-c139-4682-8563-441d8b7aeae6-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-pt7c2\" (UID: \"5e62ea67-c139-4682-8563-441d8b7aeae6\") " pod="openstack/dnsmasq-dns-6578955fd5-pt7c2"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.611545 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpmh4\" (UniqueName: \"kubernetes.io/projected/5e62ea67-c139-4682-8563-441d8b7aeae6-kube-api-access-lpmh4\") pod \"dnsmasq-dns-6578955fd5-pt7c2\" (UID: \"5e62ea67-c139-4682-8563-441d8b7aeae6\") " pod="openstack/dnsmasq-dns-6578955fd5-pt7c2"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.611578 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ba1dd4f-2e2c-4055-a605-35de3b73bdb4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3ba1dd4f-2e2c-4055-a605-35de3b73bdb4\") " pod="openstack/cinder-scheduler-0"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.611601 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqvc6\" (UniqueName: \"kubernetes.io/projected/3ba1dd4f-2e2c-4055-a605-35de3b73bdb4-kube-api-access-xqvc6\") pod \"cinder-scheduler-0\" (UID: \"3ba1dd4f-2e2c-4055-a605-35de3b73bdb4\") " pod="openstack/cinder-scheduler-0"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.611629 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ba1dd4f-2e2c-4055-a605-35de3b73bdb4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3ba1dd4f-2e2c-4055-a605-35de3b73bdb4\") " pod="openstack/cinder-scheduler-0"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.611658 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ba1dd4f-2e2c-4055-a605-35de3b73bdb4-config-data\") pod \"cinder-scheduler-0\" (UID: \"3ba1dd4f-2e2c-4055-a605-35de3b73bdb4\") " pod="openstack/cinder-scheduler-0"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.611681 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3ba1dd4f-2e2c-4055-a605-35de3b73bdb4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3ba1dd4f-2e2c-4055-a605-35de3b73bdb4\") " pod="openstack/cinder-scheduler-0"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.611713 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ba1dd4f-2e2c-4055-a605-35de3b73bdb4-scripts\") pod \"cinder-scheduler-0\" (UID: \"3ba1dd4f-2e2c-4055-a605-35de3b73bdb4\") " pod="openstack/cinder-scheduler-0"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.713248 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5e62ea67-c139-4682-8563-441d8b7aeae6-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-pt7c2\" (UID: \"5e62ea67-c139-4682-8563-441d8b7aeae6\") " pod="openstack/dnsmasq-dns-6578955fd5-pt7c2"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.713296 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e62ea67-c139-4682-8563-441d8b7aeae6-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-pt7c2\" (UID: \"5e62ea67-c139-4682-8563-441d8b7aeae6\") " pod="openstack/dnsmasq-dns-6578955fd5-pt7c2"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.713328 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpmh4\" (UniqueName: \"kubernetes.io/projected/5e62ea67-c139-4682-8563-441d8b7aeae6-kube-api-access-lpmh4\") pod \"dnsmasq-dns-6578955fd5-pt7c2\" (UID: \"5e62ea67-c139-4682-8563-441d8b7aeae6\") " pod="openstack/dnsmasq-dns-6578955fd5-pt7c2"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.713345 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ba1dd4f-2e2c-4055-a605-35de3b73bdb4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3ba1dd4f-2e2c-4055-a605-35de3b73bdb4\") " pod="openstack/cinder-scheduler-0"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.713368 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqvc6\" (UniqueName: \"kubernetes.io/projected/3ba1dd4f-2e2c-4055-a605-35de3b73bdb4-kube-api-access-xqvc6\") pod \"cinder-scheduler-0\" (UID: \"3ba1dd4f-2e2c-4055-a605-35de3b73bdb4\") " pod="openstack/cinder-scheduler-0"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.713398 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ba1dd4f-2e2c-4055-a605-35de3b73bdb4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3ba1dd4f-2e2c-4055-a605-35de3b73bdb4\") " pod="openstack/cinder-scheduler-0"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.713429 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ba1dd4f-2e2c-4055-a605-35de3b73bdb4-config-data\") pod \"cinder-scheduler-0\" (UID: \"3ba1dd4f-2e2c-4055-a605-35de3b73bdb4\") " pod="openstack/cinder-scheduler-0"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.713458 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3ba1dd4f-2e2c-4055-a605-35de3b73bdb4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3ba1dd4f-2e2c-4055-a605-35de3b73bdb4\") " pod="openstack/cinder-scheduler-0"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.713496 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ba1dd4f-2e2c-4055-a605-35de3b73bdb4-scripts\") pod \"cinder-scheduler-0\" (UID: \"3ba1dd4f-2e2c-4055-a605-35de3b73bdb4\") " pod="openstack/cinder-scheduler-0"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.713518 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e62ea67-c139-4682-8563-441d8b7aeae6-dns-svc\") pod \"dnsmasq-dns-6578955fd5-pt7c2\" (UID: \"5e62ea67-c139-4682-8563-441d8b7aeae6\") " pod="openstack/dnsmasq-dns-6578955fd5-pt7c2"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.713542 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e62ea67-c139-4682-8563-441d8b7aeae6-config\") pod \"dnsmasq-dns-6578955fd5-pt7c2\" (UID: \"5e62ea67-c139-4682-8563-441d8b7aeae6\") " pod="openstack/dnsmasq-dns-6578955fd5-pt7c2"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.713606 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e62ea67-c139-4682-8563-441d8b7aeae6-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-pt7c2\" (UID: \"5e62ea67-c139-4682-8563-441d8b7aeae6\") " pod="openstack/dnsmasq-dns-6578955fd5-pt7c2"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.714050 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3ba1dd4f-2e2c-4055-a605-35de3b73bdb4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3ba1dd4f-2e2c-4055-a605-35de3b73bdb4\") " pod="openstack/cinder-scheduler-0"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.714520 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e62ea67-c139-4682-8563-441d8b7aeae6-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-pt7c2\" (UID: \"5e62ea67-c139-4682-8563-441d8b7aeae6\") " pod="openstack/dnsmasq-dns-6578955fd5-pt7c2"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.717477 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e62ea67-c139-4682-8563-441d8b7aeae6-dns-svc\") pod \"dnsmasq-dns-6578955fd5-pt7c2\" (UID: \"5e62ea67-c139-4682-8563-441d8b7aeae6\") " pod="openstack/dnsmasq-dns-6578955fd5-pt7c2"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.718380 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5e62ea67-c139-4682-8563-441d8b7aeae6-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-pt7c2\" (UID: \"5e62ea67-c139-4682-8563-441d8b7aeae6\") " pod="openstack/dnsmasq-dns-6578955fd5-pt7c2"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.719048 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e62ea67-c139-4682-8563-441d8b7aeae6-config\") pod \"dnsmasq-dns-6578955fd5-pt7c2\" (UID: \"5e62ea67-c139-4682-8563-441d8b7aeae6\") " pod="openstack/dnsmasq-dns-6578955fd5-pt7c2"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.719472 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e62ea67-c139-4682-8563-441d8b7aeae6-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-pt7c2\" (UID: \"5e62ea67-c139-4682-8563-441d8b7aeae6\") " pod="openstack/dnsmasq-dns-6578955fd5-pt7c2"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.738921 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ba1dd4f-2e2c-4055-a605-35de3b73bdb4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3ba1dd4f-2e2c-4055-a605-35de3b73bdb4\") " pod="openstack/cinder-scheduler-0"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.745367 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ba1dd4f-2e2c-4055-a605-35de3b73bdb4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3ba1dd4f-2e2c-4055-a605-35de3b73bdb4\") " pod="openstack/cinder-scheduler-0"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.752439 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ba1dd4f-2e2c-4055-a605-35de3b73bdb4-scripts\") pod \"cinder-scheduler-0\" (UID: \"3ba1dd4f-2e2c-4055-a605-35de3b73bdb4\") " pod="openstack/cinder-scheduler-0"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.753414 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpmh4\" (UniqueName: \"kubernetes.io/projected/5e62ea67-c139-4682-8563-441d8b7aeae6-kube-api-access-lpmh4\") pod \"dnsmasq-dns-6578955fd5-pt7c2\" (UID: \"5e62ea67-c139-4682-8563-441d8b7aeae6\") " pod="openstack/dnsmasq-dns-6578955fd5-pt7c2"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.755823 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqvc6\" (UniqueName: \"kubernetes.io/projected/3ba1dd4f-2e2c-4055-a605-35de3b73bdb4-kube-api-access-xqvc6\") pod \"cinder-scheduler-0\" (UID: \"3ba1dd4f-2e2c-4055-a605-35de3b73bdb4\") " pod="openstack/cinder-scheduler-0"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.760751 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.762643 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.781499 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.783281 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ba1dd4f-2e2c-4055-a605-35de3b73bdb4-config-data\") pod \"cinder-scheduler-0\" (UID: \"3ba1dd4f-2e2c-4055-a605-35de3b73bdb4\") " pod="openstack/cinder-scheduler-0"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.784091 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.828857 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.896131 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-pt7c2"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.919702 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/314f510e-6a17-412d-9a1d-7108e7f6d9c6-config-data-custom\") pod \"cinder-api-0\" (UID: \"314f510e-6a17-412d-9a1d-7108e7f6d9c6\") " pod="openstack/cinder-api-0"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.919795 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/314f510e-6a17-412d-9a1d-7108e7f6d9c6-etc-machine-id\") pod \"cinder-api-0\" (UID: \"314f510e-6a17-412d-9a1d-7108e7f6d9c6\") " pod="openstack/cinder-api-0"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.919963 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67d79\" (UniqueName: \"kubernetes.io/projected/314f510e-6a17-412d-9a1d-7108e7f6d9c6-kube-api-access-67d79\") pod \"cinder-api-0\" (UID: \"314f510e-6a17-412d-9a1d-7108e7f6d9c6\") " pod="openstack/cinder-api-0"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.920010 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/314f510e-6a17-412d-9a1d-7108e7f6d9c6-logs\") pod \"cinder-api-0\" (UID: \"314f510e-6a17-412d-9a1d-7108e7f6d9c6\") " pod="openstack/cinder-api-0"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.920047 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314f510e-6a17-412d-9a1d-7108e7f6d9c6-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"314f510e-6a17-412d-9a1d-7108e7f6d9c6\") " pod="openstack/cinder-api-0"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.920121 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/314f510e-6a17-412d-9a1d-7108e7f6d9c6-config-data\") pod \"cinder-api-0\" (UID: \"314f510e-6a17-412d-9a1d-7108e7f6d9c6\") " pod="openstack/cinder-api-0"
Feb 02 10:56:37 crc kubenswrapper[4901]: I0202 10:56:37.920159 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/314f510e-6a17-412d-9a1d-7108e7f6d9c6-scripts\") pod \"cinder-api-0\" (UID: \"314f510e-6a17-412d-9a1d-7108e7f6d9c6\") " pod="openstack/cinder-api-0"
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.022223 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67d79\" (UniqueName: \"kubernetes.io/projected/314f510e-6a17-412d-9a1d-7108e7f6d9c6-kube-api-access-67d79\") pod \"cinder-api-0\" (UID: \"314f510e-6a17-412d-9a1d-7108e7f6d9c6\") " pod="openstack/cinder-api-0"
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.022276 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/314f510e-6a17-412d-9a1d-7108e7f6d9c6-logs\") pod \"cinder-api-0\" (UID: \"314f510e-6a17-412d-9a1d-7108e7f6d9c6\") " pod="openstack/cinder-api-0"
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.022312 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314f510e-6a17-412d-9a1d-7108e7f6d9c6-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"314f510e-6a17-412d-9a1d-7108e7f6d9c6\") " pod="openstack/cinder-api-0"
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.022355 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/314f510e-6a17-412d-9a1d-7108e7f6d9c6-config-data\") pod \"cinder-api-0\" (UID: \"314f510e-6a17-412d-9a1d-7108e7f6d9c6\") " pod="openstack/cinder-api-0"
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.022391 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/314f510e-6a17-412d-9a1d-7108e7f6d9c6-scripts\") pod \"cinder-api-0\" (UID: \"314f510e-6a17-412d-9a1d-7108e7f6d9c6\") " pod="openstack/cinder-api-0"
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.022504 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/314f510e-6a17-412d-9a1d-7108e7f6d9c6-config-data-custom\") pod \"cinder-api-0\" (UID: \"314f510e-6a17-412d-9a1d-7108e7f6d9c6\") " pod="openstack/cinder-api-0"
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.022527 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/314f510e-6a17-412d-9a1d-7108e7f6d9c6-etc-machine-id\") pod \"cinder-api-0\" (UID: \"314f510e-6a17-412d-9a1d-7108e7f6d9c6\") " pod="openstack/cinder-api-0"
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.022657 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/314f510e-6a17-412d-9a1d-7108e7f6d9c6-etc-machine-id\") pod \"cinder-api-0\" (UID: \"314f510e-6a17-412d-9a1d-7108e7f6d9c6\") " pod="openstack/cinder-api-0"
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.023055 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/314f510e-6a17-412d-9a1d-7108e7f6d9c6-logs\") pod \"cinder-api-0\" (UID: \"314f510e-6a17-412d-9a1d-7108e7f6d9c6\") " pod="openstack/cinder-api-0"
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.026824 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314f510e-6a17-412d-9a1d-7108e7f6d9c6-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"314f510e-6a17-412d-9a1d-7108e7f6d9c6\") " pod="openstack/cinder-api-0"
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.028815 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/314f510e-6a17-412d-9a1d-7108e7f6d9c6-config-data\") pod \"cinder-api-0\" (UID: \"314f510e-6a17-412d-9a1d-7108e7f6d9c6\") " pod="openstack/cinder-api-0"
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.030067 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/314f510e-6a17-412d-9a1d-7108e7f6d9c6-config-data-custom\") pod \"cinder-api-0\" (UID: \"314f510e-6a17-412d-9a1d-7108e7f6d9c6\") " pod="openstack/cinder-api-0"
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.037066 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/314f510e-6a17-412d-9a1d-7108e7f6d9c6-scripts\") pod \"cinder-api-0\" (UID: \"314f510e-6a17-412d-9a1d-7108e7f6d9c6\") " pod="openstack/cinder-api-0"
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.044081 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67d79\" (UniqueName: \"kubernetes.io/projected/314f510e-6a17-412d-9a1d-7108e7f6d9c6-kube-api-access-67d79\") pod \"cinder-api-0\" (UID: \"314f510e-6a17-412d-9a1d-7108e7f6d9c6\") " pod="openstack/cinder-api-0"
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.173523 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.190133 4901 generic.go:334] "Generic (PLEG): container finished" podID="92d9658c-9dc4-466c-b261-dba41f7418ae" containerID="6b3b89ac056ad9f26282eb444590aecfab59a1f9dda66e66ec6ca767e24e2584" exitCode=0
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.190203 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-xnq2r" event={"ID":"92d9658c-9dc4-466c-b261-dba41f7418ae","Type":"ContainerDied","Data":"6b3b89ac056ad9f26282eb444590aecfab59a1f9dda66e66ec6ca767e24e2584"}
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.382424 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5db554c9b9-tv2zt"
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.390745 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-8474dc9c5d-ld7wj"
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.532212 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f63f610-92a5-478b-af86-f103565142f0-logs\") pod \"1f63f610-92a5-478b-af86-f103565142f0\" (UID: \"1f63f610-92a5-478b-af86-f103565142f0\") "
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.532338 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfjl5\" (UniqueName: \"kubernetes.io/projected/32596207-e054-472e-bd8b-9d4bdc150142-kube-api-access-dfjl5\") pod \"32596207-e054-472e-bd8b-9d4bdc150142\" (UID: \"32596207-e054-472e-bd8b-9d4bdc150142\") "
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.532381 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32596207-e054-472e-bd8b-9d4bdc150142-logs\") pod \"32596207-e054-472e-bd8b-9d4bdc150142\" (UID: \"32596207-e054-472e-bd8b-9d4bdc150142\") "
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.532410 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/32596207-e054-472e-bd8b-9d4bdc150142-config-data-custom\") pod \"32596207-e054-472e-bd8b-9d4bdc150142\" (UID: \"32596207-e054-472e-bd8b-9d4bdc150142\") "
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.532449 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdxhl\" (UniqueName: \"kubernetes.io/projected/1f63f610-92a5-478b-af86-f103565142f0-kube-api-access-gdxhl\") pod \"1f63f610-92a5-478b-af86-f103565142f0\" (UID: \"1f63f610-92a5-478b-af86-f103565142f0\") "
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.532592 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32596207-e054-472e-bd8b-9d4bdc150142-config-data\") pod \"32596207-e054-472e-bd8b-9d4bdc150142\" (UID: \"32596207-e054-472e-bd8b-9d4bdc150142\") "
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.532634 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f63f610-92a5-478b-af86-f103565142f0-combined-ca-bundle\") pod \"1f63f610-92a5-478b-af86-f103565142f0\" (UID: \"1f63f610-92a5-478b-af86-f103565142f0\") "
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.532671 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f63f610-92a5-478b-af86-f103565142f0-config-data\") pod \"1f63f610-92a5-478b-af86-f103565142f0\" (UID: \"1f63f610-92a5-478b-af86-f103565142f0\") "
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.532747 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f63f610-92a5-478b-af86-f103565142f0-config-data-custom\") pod \"1f63f610-92a5-478b-af86-f103565142f0\" (UID: \"1f63f610-92a5-478b-af86-f103565142f0\") "
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.532800 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32596207-e054-472e-bd8b-9d4bdc150142-combined-ca-bundle\") pod \"32596207-e054-472e-bd8b-9d4bdc150142\" (UID: \"32596207-e054-472e-bd8b-9d4bdc150142\") "
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.532992 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f63f610-92a5-478b-af86-f103565142f0-logs" (OuterVolumeSpecName: "logs") pod "1f63f610-92a5-478b-af86-f103565142f0" (UID: "1f63f610-92a5-478b-af86-f103565142f0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.533820 4901 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f63f610-92a5-478b-af86-f103565142f0-logs\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.533858 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32596207-e054-472e-bd8b-9d4bdc150142-logs" (OuterVolumeSpecName: "logs") pod "32596207-e054-472e-bd8b-9d4bdc150142" (UID: "32596207-e054-472e-bd8b-9d4bdc150142"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.541191 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32596207-e054-472e-bd8b-9d4bdc150142-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "32596207-e054-472e-bd8b-9d4bdc150142" (UID: "32596207-e054-472e-bd8b-9d4bdc150142"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.541621 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f63f610-92a5-478b-af86-f103565142f0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1f63f610-92a5-478b-af86-f103565142f0" (UID: "1f63f610-92a5-478b-af86-f103565142f0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.542431 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32596207-e054-472e-bd8b-9d4bdc150142-kube-api-access-dfjl5" (OuterVolumeSpecName: "kube-api-access-dfjl5") pod "32596207-e054-472e-bd8b-9d4bdc150142" (UID: "32596207-e054-472e-bd8b-9d4bdc150142"). InnerVolumeSpecName "kube-api-access-dfjl5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.545368 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f63f610-92a5-478b-af86-f103565142f0-kube-api-access-gdxhl" (OuterVolumeSpecName: "kube-api-access-gdxhl") pod "1f63f610-92a5-478b-af86-f103565142f0" (UID: "1f63f610-92a5-478b-af86-f103565142f0"). InnerVolumeSpecName "kube-api-access-gdxhl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.569340 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32596207-e054-472e-bd8b-9d4bdc150142-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32596207-e054-472e-bd8b-9d4bdc150142" (UID: "32596207-e054-472e-bd8b-9d4bdc150142"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.585771 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f63f610-92a5-478b-af86-f103565142f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f63f610-92a5-478b-af86-f103565142f0" (UID: "1f63f610-92a5-478b-af86-f103565142f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.631706 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f63f610-92a5-478b-af86-f103565142f0-config-data" (OuterVolumeSpecName: "config-data") pod "1f63f610-92a5-478b-af86-f103565142f0" (UID: "1f63f610-92a5-478b-af86-f103565142f0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.635422 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f63f610-92a5-478b-af86-f103565142f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.635519 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f63f610-92a5-478b-af86-f103565142f0-config-data\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.635607 4901 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f63f610-92a5-478b-af86-f103565142f0-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.635673 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32596207-e054-472e-bd8b-9d4bdc150142-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.635725 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfjl5\" (UniqueName: \"kubernetes.io/projected/32596207-e054-472e-bd8b-9d4bdc150142-kube-api-access-dfjl5\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.635786 4901 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32596207-e054-472e-bd8b-9d4bdc150142-logs\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.635843 4901 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/32596207-e054-472e-bd8b-9d4bdc150142-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.635896 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdxhl\" (UniqueName: \"kubernetes.io/projected/1f63f610-92a5-478b-af86-f103565142f0-kube-api-access-gdxhl\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.638756 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32596207-e054-472e-bd8b-9d4bdc150142-config-data" (OuterVolumeSpecName: "config-data") pod "32596207-e054-472e-bd8b-9d4bdc150142" (UID: "32596207-e054-472e-bd8b-9d4bdc150142"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.740223 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32596207-e054-472e-bd8b-9d4bdc150142-config-data\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.788422 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-b6f6785c4-59xw6"
Feb 02 10:56:38 crc kubenswrapper[4901]: I0202 10:56:38.907864 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-b6f6785c4-59xw6"
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.013391 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-xnq2r"
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.163310 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92d9658c-9dc4-466c-b261-dba41f7418ae-dns-svc\") pod \"92d9658c-9dc4-466c-b261-dba41f7418ae\" (UID: \"92d9658c-9dc4-466c-b261-dba41f7418ae\") "
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.163424 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/92d9658c-9dc4-466c-b261-dba41f7418ae-dns-swift-storage-0\") pod \"92d9658c-9dc4-466c-b261-dba41f7418ae\" (UID: \"92d9658c-9dc4-466c-b261-dba41f7418ae\") "
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.163466 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92d9658c-9dc4-466c-b261-dba41f7418ae-config\") pod \"92d9658c-9dc4-466c-b261-dba41f7418ae\" (UID: \"92d9658c-9dc4-466c-b261-dba41f7418ae\") "
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.163556 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92d9658c-9dc4-466c-b261-dba41f7418ae-ovsdbserver-sb\") pod \"92d9658c-9dc4-466c-b261-dba41f7418ae\" (UID: \"92d9658c-9dc4-466c-b261-dba41f7418ae\") "
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.163616 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7gf2\" (UniqueName: \"kubernetes.io/projected/92d9658c-9dc4-466c-b261-dba41f7418ae-kube-api-access-f7gf2\") pod \"92d9658c-9dc4-466c-b261-dba41f7418ae\" (UID: \"92d9658c-9dc4-466c-b261-dba41f7418ae\") "
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.163651 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92d9658c-9dc4-466c-b261-dba41f7418ae-ovsdbserver-nb\") pod \"92d9658c-9dc4-466c-b261-dba41f7418ae\" (UID: \"92d9658c-9dc4-466c-b261-dba41f7418ae\") "
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.181647 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92d9658c-9dc4-466c-b261-dba41f7418ae-kube-api-access-f7gf2" (OuterVolumeSpecName: "kube-api-access-f7gf2") pod "92d9658c-9dc4-466c-b261-dba41f7418ae" (UID: "92d9658c-9dc4-466c-b261-dba41f7418ae"). InnerVolumeSpecName "kube-api-access-f7gf2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.233105 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-xnq2r" event={"ID":"92d9658c-9dc4-466c-b261-dba41f7418ae","Type":"ContainerDied","Data":"40b9a40e4e0e6ad711bbb16a145fbdc74b7f835531d5b49d63da4122bd39b81f"}
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.233169 4901 scope.go:117] "RemoveContainer" containerID="6b3b89ac056ad9f26282eb444590aecfab59a1f9dda66e66ec6ca767e24e2584"
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.233310 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-xnq2r"
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.260328 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5db554c9b9-tv2zt" event={"ID":"32596207-e054-472e-bd8b-9d4bdc150142","Type":"ContainerDied","Data":"9bbdffbb00bfcdcb96c52ae8eb1129593f7c80e78e2d635b9af82caa867fb803"}
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.260928 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5db554c9b9-tv2zt"
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.262551 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92d9658c-9dc4-466c-b261-dba41f7418ae-config" (OuterVolumeSpecName: "config") pod "92d9658c-9dc4-466c-b261-dba41f7418ae" (UID: "92d9658c-9dc4-466c-b261-dba41f7418ae"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.266706 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92d9658c-9dc4-466c-b261-dba41f7418ae-config\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.267618 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7gf2\" (UniqueName: \"kubernetes.io/projected/92d9658c-9dc4-466c-b261-dba41f7418ae-kube-api-access-f7gf2\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.293163 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8474dc9c5d-ld7wj" event={"ID":"1f63f610-92a5-478b-af86-f103565142f0","Type":"ContainerDied","Data":"dd31a321686489458a68e134f1e485225e3d1ed31b20d173cc78521cb64e0c81"}
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.293312 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-8474dc9c5d-ld7wj"
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.300666 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74392bcb-ba65-45cd-8f9b-32894528aca3","Type":"ContainerStarted","Data":"392366ceb54650a9f1c4bec69f5a834d6e7e35451d08fc6cc96eb8c0b069e7f0"}
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.300798 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="74392bcb-ba65-45cd-8f9b-32894528aca3" containerName="ceilometer-central-agent" containerID="cri-o://21d420e67e03baa42b605453c70ede0e38fa51c6be0019b4aa5c08b0672700e1" gracePeriod=30
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.300839 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.300963 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="74392bcb-ba65-45cd-8f9b-32894528aca3" containerName="proxy-httpd" containerID="cri-o://392366ceb54650a9f1c4bec69f5a834d6e7e35451d08fc6cc96eb8c0b069e7f0" gracePeriod=30
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.301016 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="74392bcb-ba65-45cd-8f9b-32894528aca3" containerName="sg-core" containerID="cri-o://cf3026123265208ba8e79765049e8a67efd3f711fdfd72fd0114774ac279846a" gracePeriod=30
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.301051 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="74392bcb-ba65-45cd-8f9b-32894528aca3" containerName="ceilometer-notification-agent" containerID="cri-o://bbfe6147231cfcbd29282e9a4f6ed42803490aa7ffeb09eb13e27fe0e10eb483" gracePeriod=30
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.303526 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.304311 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92d9658c-9dc4-466c-b261-dba41f7418ae-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "92d9658c-9dc4-466c-b261-dba41f7418ae" (UID: "92d9658c-9dc4-466c-b261-dba41f7418ae"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.321422 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92d9658c-9dc4-466c-b261-dba41f7418ae-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "92d9658c-9dc4-466c-b261-dba41f7418ae" (UID: "92d9658c-9dc4-466c-b261-dba41f7418ae"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.354987 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92d9658c-9dc4-466c-b261-dba41f7418ae-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "92d9658c-9dc4-466c-b261-dba41f7418ae" (UID: "92d9658c-9dc4-466c-b261-dba41f7418ae"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.364505 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92d9658c-9dc4-466c-b261-dba41f7418ae-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "92d9658c-9dc4-466c-b261-dba41f7418ae" (UID: "92d9658c-9dc4-466c-b261-dba41f7418ae"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.368943 4901 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92d9658c-9dc4-466c-b261-dba41f7418ae-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.368971 4901 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/92d9658c-9dc4-466c-b261-dba41f7418ae-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.368983 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92d9658c-9dc4-466c-b261-dba41f7418ae-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.368992 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92d9658c-9dc4-466c-b261-dba41f7418ae-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.425261 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=6.3880406579999995 podStartE2EDuration="58.425240342s" podCreationTimestamp="2026-02-02 10:55:41 +0000 UTC" firstStartedPulling="2026-02-02 10:55:46.655178253 +0000 UTC m=+1033.673518349" lastFinishedPulling="2026-02-02 10:56:38.692377937 +0000 UTC m=+1085.710718033" observedRunningTime="2026-02-02 10:56:39.340026213 +0000 UTC m=+1086.358366309" watchObservedRunningTime="2026-02-02 10:56:39.425240342 +0000 UTC m=+1086.443580438"
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.435912 4901 scope.go:117] "RemoveContainer" containerID="3ecdb331a985384caae0b085ed3d3fd1d860b4bca182252d53e5dbd54e420509"
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.442595 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-5db554c9b9-tv2zt"]
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.460180 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-5db554c9b9-tv2zt"]
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.483697 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-pt7c2"]
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.513299 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-8474dc9c5d-ld7wj"]
Feb 02 10:56:39 crc kubenswrapper[4901]: W0202 10:56:39.518351 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ba1dd4f_2e2c_4055_a605_35de3b73bdb4.slice/crio-29ba11d93f1b0fed419c069079f1ddc84687c5cfc7ad03cc1a9207dd77a0d7b3 WatchSource:0}: Error finding container 29ba11d93f1b0fed419c069079f1ddc84687c5cfc7ad03cc1a9207dd77a0d7b3: Status 404 returned error can't find the container with id 29ba11d93f1b0fed419c069079f1ddc84687c5cfc7ad03cc1a9207dd77a0d7b3
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.528661 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-8474dc9c5d-ld7wj"]
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.559462 4901 scope.go:117] "RemoveContainer" containerID="4cb5be31f0c092b4b97d463491c1887f651c85e8737b944ba784cb93e0c10f1b"
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.562321 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.622786 4901 scope.go:117] "RemoveContainer" containerID="f40c657600f9277f3ead415cce7750856c2985a14e85fdfd014d59e173f909ae"
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.637508 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-xnq2r"]
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.650645 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-xnq2r"]
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.665909 4901 scope.go:117] "RemoveContainer" containerID="20a8de42e391b2860c8c909855c7e8622bb94fda8c83981aee8c4b325806548b"
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.691864 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f63f610-92a5-478b-af86-f103565142f0" path="/var/lib/kubelet/pods/1f63f610-92a5-478b-af86-f103565142f0/volumes"
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.692507 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32596207-e054-472e-bd8b-9d4bdc150142" path="/var/lib/kubelet/pods/32596207-e054-472e-bd8b-9d4bdc150142/volumes"
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.693116 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92d9658c-9dc4-466c-b261-dba41f7418ae" path="/var/lib/kubelet/pods/92d9658c-9dc4-466c-b261-dba41f7418ae/volumes"
Feb 02 10:56:39 crc kubenswrapper[4901]: I0202 10:56:39.695850 4901 scope.go:117] "RemoveContainer" containerID="7a5d1da6085ee4fc545644c663fe3ced96de950e29b578064bb1d6eb7975e48f"
Feb 02 10:56:40 crc kubenswrapper[4901]: I0202 10:56:40.167369 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Feb 02 10:56:40 crc kubenswrapper[4901]: I0202 10:56:40.399915 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"314f510e-6a17-412d-9a1d-7108e7f6d9c6","Type":"ContainerStarted","Data":"f07a44037ba0d536225377df668b181ef4a5c9ef3184036fe7cb711d29f5c77f"}
Feb 02 10:56:40 crc kubenswrapper[4901]: I0202 10:56:40.415939 4901 generic.go:334] "Generic (PLEG): container finished" podID="5e62ea67-c139-4682-8563-441d8b7aeae6" containerID="ae1c08757c419f8ea11e1ae886b36f800e4b8399f50119859178549866040f24" exitCode=0
Feb 02 10:56:40 crc kubenswrapper[4901]: I0202 10:56:40.416013 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-pt7c2" event={"ID":"5e62ea67-c139-4682-8563-441d8b7aeae6","Type":"ContainerDied","Data":"ae1c08757c419f8ea11e1ae886b36f800e4b8399f50119859178549866040f24"}
Feb 02 10:56:40 crc kubenswrapper[4901]: I0202 10:56:40.416042 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-pt7c2" event={"ID":"5e62ea67-c139-4682-8563-441d8b7aeae6","Type":"ContainerStarted","Data":"de170f5b032031e776d168887f1e1aca9cebb587a962e7c88596ea7f12ba23b6"}
Feb 02 10:56:40 crc
kubenswrapper[4901]: I0202 10:56:40.430890 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3ba1dd4f-2e2c-4055-a605-35de3b73bdb4","Type":"ContainerStarted","Data":"29ba11d93f1b0fed419c069079f1ddc84687c5cfc7ad03cc1a9207dd77a0d7b3"} Feb 02 10:56:40 crc kubenswrapper[4901]: I0202 10:56:40.472945 4901 generic.go:334] "Generic (PLEG): container finished" podID="74392bcb-ba65-45cd-8f9b-32894528aca3" containerID="392366ceb54650a9f1c4bec69f5a834d6e7e35451d08fc6cc96eb8c0b069e7f0" exitCode=0 Feb 02 10:56:40 crc kubenswrapper[4901]: I0202 10:56:40.473008 4901 generic.go:334] "Generic (PLEG): container finished" podID="74392bcb-ba65-45cd-8f9b-32894528aca3" containerID="cf3026123265208ba8e79765049e8a67efd3f711fdfd72fd0114774ac279846a" exitCode=2 Feb 02 10:56:40 crc kubenswrapper[4901]: I0202 10:56:40.473017 4901 generic.go:334] "Generic (PLEG): container finished" podID="74392bcb-ba65-45cd-8f9b-32894528aca3" containerID="21d420e67e03baa42b605453c70ede0e38fa51c6be0019b4aa5c08b0672700e1" exitCode=0 Feb 02 10:56:40 crc kubenswrapper[4901]: I0202 10:56:40.473040 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74392bcb-ba65-45cd-8f9b-32894528aca3","Type":"ContainerDied","Data":"392366ceb54650a9f1c4bec69f5a834d6e7e35451d08fc6cc96eb8c0b069e7f0"} Feb 02 10:56:40 crc kubenswrapper[4901]: I0202 10:56:40.473071 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74392bcb-ba65-45cd-8f9b-32894528aca3","Type":"ContainerDied","Data":"cf3026123265208ba8e79765049e8a67efd3f711fdfd72fd0114774ac279846a"} Feb 02 10:56:40 crc kubenswrapper[4901]: I0202 10:56:40.473105 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74392bcb-ba65-45cd-8f9b-32894528aca3","Type":"ContainerDied","Data":"21d420e67e03baa42b605453c70ede0e38fa51c6be0019b4aa5c08b0672700e1"} Feb 02 10:56:41 crc kubenswrapper[4901]: I0202 10:56:41.504509 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"314f510e-6a17-412d-9a1d-7108e7f6d9c6","Type":"ContainerStarted","Data":"44b854e09d91b335a68209e19419540b17e1a05c6ad25ebc1e93d8bfc5df3d71"} Feb 02 10:56:41 crc kubenswrapper[4901]: I0202 10:56:41.508933 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-pt7c2" event={"ID":"5e62ea67-c139-4682-8563-441d8b7aeae6","Type":"ContainerStarted","Data":"2bfe02d678cf7cf16745542b57b43a79e00948d372eece84ee5ee46ad2cb9eba"} Feb 02 10:56:41 crc kubenswrapper[4901]: I0202 10:56:41.509171 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-pt7c2" Feb 02 10:56:41 crc kubenswrapper[4901]: I0202 10:56:41.518779 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3ba1dd4f-2e2c-4055-a605-35de3b73bdb4","Type":"ContainerStarted","Data":"c5cd2d85c9b795fc3d1265b4a4a0d1d7088338f2bb34830af938f5add3eeefd9"} Feb 02 10:56:41 crc kubenswrapper[4901]: I0202 10:56:41.541088 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6578955fd5-pt7c2" podStartSLOduration=4.54106796 podStartE2EDuration="4.54106796s" podCreationTimestamp="2026-02-02 10:56:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:56:41.527234464 +0000 UTC m=+1088.545574560" 
watchObservedRunningTime="2026-02-02 10:56:41.54106796 +0000 UTC m=+1088.559408056" Feb 02 10:56:41 crc kubenswrapper[4901]: I0202 10:56:41.957206 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5964cf7bcd-rmmzk" Feb 02 10:56:42 crc kubenswrapper[4901]: I0202 10:56:42.303113 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5964cf7bcd-rmmzk" Feb 02 10:56:42 crc kubenswrapper[4901]: I0202 10:56:42.395827 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-b6f6785c4-59xw6"] Feb 02 10:56:42 crc kubenswrapper[4901]: I0202 10:56:42.396070 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-b6f6785c4-59xw6" podUID="aa18cb0b-f1e0-4a57-90e0-daeede8334de" containerName="barbican-api-log" containerID="cri-o://c992c90156378e0a9891a669fd7126376d08a9b1c60d0da42356ead9797d5b49" gracePeriod=30 Feb 02 10:56:42 crc kubenswrapper[4901]: I0202 10:56:42.396480 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-b6f6785c4-59xw6" podUID="aa18cb0b-f1e0-4a57-90e0-daeede8334de" containerName="barbican-api" containerID="cri-o://545ec88fcdea2c4cf0f01ec1c6abe42c1b30161f98a6a90046a26b78b265743c" gracePeriod=30 Feb 02 10:56:42 crc kubenswrapper[4901]: I0202 10:56:42.410253 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-b6f6785c4-59xw6" podUID="aa18cb0b-f1e0-4a57-90e0-daeede8334de" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": EOF" Feb 02 10:56:42 crc kubenswrapper[4901]: I0202 10:56:42.559917 4901 generic.go:334] "Generic (PLEG): container finished" podID="74392bcb-ba65-45cd-8f9b-32894528aca3" containerID="bbfe6147231cfcbd29282e9a4f6ed42803490aa7ffeb09eb13e27fe0e10eb483" exitCode=0 Feb 02 10:56:42 crc kubenswrapper[4901]: I0202 10:56:42.559988 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74392bcb-ba65-45cd-8f9b-32894528aca3","Type":"ContainerDied","Data":"bbfe6147231cfcbd29282e9a4f6ed42803490aa7ffeb09eb13e27fe0e10eb483"} Feb 02 10:56:42 crc kubenswrapper[4901]: I0202 10:56:42.561631 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"314f510e-6a17-412d-9a1d-7108e7f6d9c6","Type":"ContainerStarted","Data":"2162fd2411a4e190e1cf7740d8fa7de5b26aa916c609b1c39139ae7d1ced08ed"} Feb 02 10:56:42 crc kubenswrapper[4901]: I0202 10:56:42.561887 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="314f510e-6a17-412d-9a1d-7108e7f6d9c6" containerName="cinder-api-log" containerID="cri-o://44b854e09d91b335a68209e19419540b17e1a05c6ad25ebc1e93d8bfc5df3d71" gracePeriod=30 Feb 02 10:56:42 crc kubenswrapper[4901]: I0202 10:56:42.562086 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="314f510e-6a17-412d-9a1d-7108e7f6d9c6" containerName="cinder-api" containerID="cri-o://2162fd2411a4e190e1cf7740d8fa7de5b26aa916c609b1c39139ae7d1ced08ed" gracePeriod=30 Feb 02 10:56:42 crc kubenswrapper[4901]: I0202 10:56:42.562091 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 02 10:56:42 crc kubenswrapper[4901]: I0202 10:56:42.574009 4901 generic.go:334] "Generic (PLEG): container finished" podID="aa18cb0b-f1e0-4a57-90e0-daeede8334de" 
containerID="c992c90156378e0a9891a669fd7126376d08a9b1c60d0da42356ead9797d5b49" exitCode=143 Feb 02 10:56:42 crc kubenswrapper[4901]: I0202 10:56:42.574091 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b6f6785c4-59xw6" event={"ID":"aa18cb0b-f1e0-4a57-90e0-daeede8334de","Type":"ContainerDied","Data":"c992c90156378e0a9891a669fd7126376d08a9b1c60d0da42356ead9797d5b49"} Feb 02 10:56:42 crc kubenswrapper[4901]: I0202 10:56:42.580908 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3ba1dd4f-2e2c-4055-a605-35de3b73bdb4","Type":"ContainerStarted","Data":"66d06e57344ff5010ff9cfe4e8fe0ca5a65984af2e4f8844d83ca0d6ed9bac20"} Feb 02 10:56:42 crc kubenswrapper[4901]: I0202 10:56:42.587750 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.587732787 podStartE2EDuration="5.587732787s" podCreationTimestamp="2026-02-02 10:56:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:56:42.585173553 +0000 UTC m=+1089.603513649" watchObservedRunningTime="2026-02-02 10:56:42.587732787 +0000 UTC m=+1089.606072883" Feb 02 10:56:42 crc kubenswrapper[4901]: I0202 10:56:42.616136 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.60545862 podStartE2EDuration="5.616119147s" podCreationTimestamp="2026-02-02 10:56:37 +0000 UTC" firstStartedPulling="2026-02-02 10:56:39.559746944 +0000 UTC m=+1086.578087040" lastFinishedPulling="2026-02-02 10:56:40.570407471 +0000 UTC m=+1087.588747567" observedRunningTime="2026-02-02 10:56:42.61343869 +0000 UTC m=+1089.631778786" watchObservedRunningTime="2026-02-02 10:56:42.616119147 +0000 UTC m=+1089.634459373" Feb 02 10:56:42 crc kubenswrapper[4901]: I0202 10:56:42.785473 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 02 10:56:42 crc kubenswrapper[4901]: I0202 10:56:42.946407 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.061022 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fj54w\" (UniqueName: \"kubernetes.io/projected/74392bcb-ba65-45cd-8f9b-32894528aca3-kube-api-access-fj54w\") pod \"74392bcb-ba65-45cd-8f9b-32894528aca3\" (UID: \"74392bcb-ba65-45cd-8f9b-32894528aca3\") " Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.061067 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/74392bcb-ba65-45cd-8f9b-32894528aca3-sg-core-conf-yaml\") pod \"74392bcb-ba65-45cd-8f9b-32894528aca3\" (UID: \"74392bcb-ba65-45cd-8f9b-32894528aca3\") " Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.061162 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74392bcb-ba65-45cd-8f9b-32894528aca3-run-httpd\") pod \"74392bcb-ba65-45cd-8f9b-32894528aca3\" (UID: \"74392bcb-ba65-45cd-8f9b-32894528aca3\") " Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.061443 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74392bcb-ba65-45cd-8f9b-32894528aca3-config-data\") pod \"74392bcb-ba65-45cd-8f9b-32894528aca3\" (UID: \"74392bcb-ba65-45cd-8f9b-32894528aca3\") " Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.061466 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74392bcb-ba65-45cd-8f9b-32894528aca3-combined-ca-bundle\") pod \"74392bcb-ba65-45cd-8f9b-32894528aca3\" (UID: \"74392bcb-ba65-45cd-8f9b-32894528aca3\") " Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.061611 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74392bcb-ba65-45cd-8f9b-32894528aca3-scripts\") pod \"74392bcb-ba65-45cd-8f9b-32894528aca3\" (UID: \"74392bcb-ba65-45cd-8f9b-32894528aca3\") " Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.061637 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74392bcb-ba65-45cd-8f9b-32894528aca3-log-httpd\") pod \"74392bcb-ba65-45cd-8f9b-32894528aca3\" (UID: \"74392bcb-ba65-45cd-8f9b-32894528aca3\") " Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.063395 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74392bcb-ba65-45cd-8f9b-32894528aca3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "74392bcb-ba65-45cd-8f9b-32894528aca3" (UID: "74392bcb-ba65-45cd-8f9b-32894528aca3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.063867 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74392bcb-ba65-45cd-8f9b-32894528aca3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "74392bcb-ba65-45cd-8f9b-32894528aca3" (UID: "74392bcb-ba65-45cd-8f9b-32894528aca3"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.073695 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74392bcb-ba65-45cd-8f9b-32894528aca3-scripts" (OuterVolumeSpecName: "scripts") pod "74392bcb-ba65-45cd-8f9b-32894528aca3" (UID: "74392bcb-ba65-45cd-8f9b-32894528aca3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.074005 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74392bcb-ba65-45cd-8f9b-32894528aca3-kube-api-access-fj54w" (OuterVolumeSpecName: "kube-api-access-fj54w") pod "74392bcb-ba65-45cd-8f9b-32894528aca3" (UID: "74392bcb-ba65-45cd-8f9b-32894528aca3"). InnerVolumeSpecName "kube-api-access-fj54w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.120696 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74392bcb-ba65-45cd-8f9b-32894528aca3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "74392bcb-ba65-45cd-8f9b-32894528aca3" (UID: "74392bcb-ba65-45cd-8f9b-32894528aca3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.166407 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fj54w\" (UniqueName: \"kubernetes.io/projected/74392bcb-ba65-45cd-8f9b-32894528aca3-kube-api-access-fj54w\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.166495 4901 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/74392bcb-ba65-45cd-8f9b-32894528aca3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.166510 4901 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74392bcb-ba65-45cd-8f9b-32894528aca3-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.166583 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74392bcb-ba65-45cd-8f9b-32894528aca3-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.166594 4901 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74392bcb-ba65-45cd-8f9b-32894528aca3-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.185429 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74392bcb-ba65-45cd-8f9b-32894528aca3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74392bcb-ba65-45cd-8f9b-32894528aca3" (UID: "74392bcb-ba65-45cd-8f9b-32894528aca3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.265691 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74392bcb-ba65-45cd-8f9b-32894528aca3-config-data" (OuterVolumeSpecName: "config-data") pod "74392bcb-ba65-45cd-8f9b-32894528aca3" (UID: "74392bcb-ba65-45cd-8f9b-32894528aca3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.272362 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74392bcb-ba65-45cd-8f9b-32894528aca3-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.272413 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74392bcb-ba65-45cd-8f9b-32894528aca3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.335476 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.373849 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/314f510e-6a17-412d-9a1d-7108e7f6d9c6-config-data\") pod \"314f510e-6a17-412d-9a1d-7108e7f6d9c6\" (UID: \"314f510e-6a17-412d-9a1d-7108e7f6d9c6\") " Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.374172 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/314f510e-6a17-412d-9a1d-7108e7f6d9c6-config-data-custom\") pod \"314f510e-6a17-412d-9a1d-7108e7f6d9c6\" (UID: \"314f510e-6a17-412d-9a1d-7108e7f6d9c6\") " Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.374322 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/314f510e-6a17-412d-9a1d-7108e7f6d9c6-logs\") pod \"314f510e-6a17-412d-9a1d-7108e7f6d9c6\" (UID: \"314f510e-6a17-412d-9a1d-7108e7f6d9c6\") " Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.374619 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/314f510e-6a17-412d-9a1d-7108e7f6d9c6-scripts\") pod \"314f510e-6a17-412d-9a1d-7108e7f6d9c6\" (UID: \"314f510e-6a17-412d-9a1d-7108e7f6d9c6\") " Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.374778 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/314f510e-6a17-412d-9a1d-7108e7f6d9c6-etc-machine-id\") pod \"314f510e-6a17-412d-9a1d-7108e7f6d9c6\" (UID: \"314f510e-6a17-412d-9a1d-7108e7f6d9c6\") " Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.374992 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314f510e-6a17-412d-9a1d-7108e7f6d9c6-combined-ca-bundle\") pod \"314f510e-6a17-412d-9a1d-7108e7f6d9c6\" (UID: \"314f510e-6a17-412d-9a1d-7108e7f6d9c6\") " Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.375440 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67d79\" (UniqueName: \"kubernetes.io/projected/314f510e-6a17-412d-9a1d-7108e7f6d9c6-kube-api-access-67d79\") pod \"314f510e-6a17-412d-9a1d-7108e7f6d9c6\" (UID: \"314f510e-6a17-412d-9a1d-7108e7f6d9c6\") " Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.374781 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/314f510e-6a17-412d-9a1d-7108e7f6d9c6-logs" (OuterVolumeSpecName: "logs") pod "314f510e-6a17-412d-9a1d-7108e7f6d9c6" (UID: "314f510e-6a17-412d-9a1d-7108e7f6d9c6"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.374927 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/314f510e-6a17-412d-9a1d-7108e7f6d9c6-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "314f510e-6a17-412d-9a1d-7108e7f6d9c6" (UID: "314f510e-6a17-412d-9a1d-7108e7f6d9c6"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.379055 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/314f510e-6a17-412d-9a1d-7108e7f6d9c6-scripts" (OuterVolumeSpecName: "scripts") pod "314f510e-6a17-412d-9a1d-7108e7f6d9c6" (UID: "314f510e-6a17-412d-9a1d-7108e7f6d9c6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.386285 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/314f510e-6a17-412d-9a1d-7108e7f6d9c6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "314f510e-6a17-412d-9a1d-7108e7f6d9c6" (UID: "314f510e-6a17-412d-9a1d-7108e7f6d9c6"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.386636 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/314f510e-6a17-412d-9a1d-7108e7f6d9c6-kube-api-access-67d79" (OuterVolumeSpecName: "kube-api-access-67d79") pod "314f510e-6a17-412d-9a1d-7108e7f6d9c6" (UID: "314f510e-6a17-412d-9a1d-7108e7f6d9c6"). InnerVolumeSpecName "kube-api-access-67d79". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.413410 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/314f510e-6a17-412d-9a1d-7108e7f6d9c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "314f510e-6a17-412d-9a1d-7108e7f6d9c6" (UID: "314f510e-6a17-412d-9a1d-7108e7f6d9c6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.479455 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/314f510e-6a17-412d-9a1d-7108e7f6d9c6-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.479818 4901 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/314f510e-6a17-412d-9a1d-7108e7f6d9c6-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.479944 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314f510e-6a17-412d-9a1d-7108e7f6d9c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.480028 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67d79\" (UniqueName: \"kubernetes.io/projected/314f510e-6a17-412d-9a1d-7108e7f6d9c6-kube-api-access-67d79\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.480099 4901 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/314f510e-6a17-412d-9a1d-7108e7f6d9c6-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.480173 4901 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/314f510e-6a17-412d-9a1d-7108e7f6d9c6-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.498676 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/314f510e-6a17-412d-9a1d-7108e7f6d9c6-config-data" (OuterVolumeSpecName: "config-data") pod "314f510e-6a17-412d-9a1d-7108e7f6d9c6" (UID: "314f510e-6a17-412d-9a1d-7108e7f6d9c6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.582016 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/314f510e-6a17-412d-9a1d-7108e7f6d9c6-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.594277 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74392bcb-ba65-45cd-8f9b-32894528aca3","Type":"ContainerDied","Data":"006d319c4cf4896605ad00f215c0639675357bb2439be9cb6294d7308674c8a9"} Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.595164 4901 scope.go:117] "RemoveContainer" containerID="392366ceb54650a9f1c4bec69f5a834d6e7e35451d08fc6cc96eb8c0b069e7f0" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.594602 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.598981 4901 generic.go:334] "Generic (PLEG): container finished" podID="314f510e-6a17-412d-9a1d-7108e7f6d9c6" containerID="2162fd2411a4e190e1cf7740d8fa7de5b26aa916c609b1c39139ae7d1ced08ed" exitCode=0 Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.599026 4901 generic.go:334] "Generic (PLEG): container finished" podID="314f510e-6a17-412d-9a1d-7108e7f6d9c6" containerID="44b854e09d91b335a68209e19419540b17e1a05c6ad25ebc1e93d8bfc5df3d71" exitCode=143 Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.599203 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"314f510e-6a17-412d-9a1d-7108e7f6d9c6","Type":"ContainerDied","Data":"2162fd2411a4e190e1cf7740d8fa7de5b26aa916c609b1c39139ae7d1ced08ed"} Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.599322 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"314f510e-6a17-412d-9a1d-7108e7f6d9c6","Type":"ContainerDied","Data":"44b854e09d91b335a68209e19419540b17e1a05c6ad25ebc1e93d8bfc5df3d71"} Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.599408 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"314f510e-6a17-412d-9a1d-7108e7f6d9c6","Type":"ContainerDied","Data":"f07a44037ba0d536225377df668b181ef4a5c9ef3184036fe7cb711d29f5c77f"} Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.600586 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.629754 4901 scope.go:117] "RemoveContainer" containerID="cf3026123265208ba8e79765049e8a67efd3f711fdfd72fd0114774ac279846a" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.667270 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.672374 4901 scope.go:117] "RemoveContainer" containerID="bbfe6147231cfcbd29282e9a4f6ed42803490aa7ffeb09eb13e27fe0e10eb483" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.770838 4901 scope.go:117] "RemoveContainer" containerID="21d420e67e03baa42b605453c70ede0e38fa51c6be0019b4aa5c08b0672700e1" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.787332 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.800687 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.807587 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.814884 4901 scope.go:117] "RemoveContainer" containerID="2162fd2411a4e190e1cf7740d8fa7de5b26aa916c609b1c39139ae7d1ced08ed" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.818522 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:56:43 crc kubenswrapper[4901]: E0202 10:56:43.819335 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74392bcb-ba65-45cd-8f9b-32894528aca3" containerName="sg-core" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.819367 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="74392bcb-ba65-45cd-8f9b-32894528aca3" containerName="sg-core" Feb 02 10:56:43 crc kubenswrapper[4901]: E0202 10:56:43.819388 4901 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="32596207-e054-472e-bd8b-9d4bdc150142" containerName="barbican-worker-log" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.819398 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="32596207-e054-472e-bd8b-9d4bdc150142" containerName="barbican-worker-log" Feb 02 10:56:43 crc kubenswrapper[4901]: E0202 10:56:43.819413 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="314f510e-6a17-412d-9a1d-7108e7f6d9c6" containerName="cinder-api-log" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.819420 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="314f510e-6a17-412d-9a1d-7108e7f6d9c6" containerName="cinder-api-log" Feb 02 10:56:43 crc kubenswrapper[4901]: E0202 10:56:43.819433 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f63f610-92a5-478b-af86-f103565142f0" containerName="barbican-keystone-listener" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.819439 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f63f610-92a5-478b-af86-f103565142f0" containerName="barbican-keystone-listener" Feb 02 10:56:43 crc kubenswrapper[4901]: E0202 10:56:43.819450 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32596207-e054-472e-bd8b-9d4bdc150142" containerName="barbican-worker" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.819457 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="32596207-e054-472e-bd8b-9d4bdc150142" containerName="barbican-worker" Feb 02 10:56:43 crc kubenswrapper[4901]: E0202 10:56:43.819469 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f63f610-92a5-478b-af86-f103565142f0" containerName="barbican-keystone-listener-log" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.819477 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f63f610-92a5-478b-af86-f103565142f0" containerName="barbican-keystone-listener-log" Feb 02 10:56:43 crc kubenswrapper[4901]: E0202 10:56:43.819489 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92d9658c-9dc4-466c-b261-dba41f7418ae" containerName="dnsmasq-dns" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.819495 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="92d9658c-9dc4-466c-b261-dba41f7418ae" containerName="dnsmasq-dns" Feb 02 10:56:43 crc kubenswrapper[4901]: E0202 10:56:43.819505 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74392bcb-ba65-45cd-8f9b-32894528aca3" containerName="ceilometer-central-agent" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.819511 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="74392bcb-ba65-45cd-8f9b-32894528aca3" containerName="ceilometer-central-agent" Feb 02 10:56:43 crc kubenswrapper[4901]: E0202 10:56:43.819521 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92d9658c-9dc4-466c-b261-dba41f7418ae" containerName="init" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.819527 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="92d9658c-9dc4-466c-b261-dba41f7418ae" containerName="init" Feb 02 10:56:43 crc kubenswrapper[4901]: E0202 10:56:43.819539 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="314f510e-6a17-412d-9a1d-7108e7f6d9c6" containerName="cinder-api" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.819545 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="314f510e-6a17-412d-9a1d-7108e7f6d9c6" containerName="cinder-api" Feb 02 10:56:43 crc 
kubenswrapper[4901]: E0202 10:56:43.819626 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74392bcb-ba65-45cd-8f9b-32894528aca3" containerName="proxy-httpd" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.819635 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="74392bcb-ba65-45cd-8f9b-32894528aca3" containerName="proxy-httpd" Feb 02 10:56:43 crc kubenswrapper[4901]: E0202 10:56:43.819644 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74392bcb-ba65-45cd-8f9b-32894528aca3" containerName="ceilometer-notification-agent" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.819650 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="74392bcb-ba65-45cd-8f9b-32894528aca3" containerName="ceilometer-notification-agent" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.819864 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="74392bcb-ba65-45cd-8f9b-32894528aca3" containerName="ceilometer-central-agent" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.819873 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="92d9658c-9dc4-466c-b261-dba41f7418ae" containerName="dnsmasq-dns" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.819889 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f63f610-92a5-478b-af86-f103565142f0" containerName="barbican-keystone-listener" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.819901 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="32596207-e054-472e-bd8b-9d4bdc150142" containerName="barbican-worker-log" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.819910 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="314f510e-6a17-412d-9a1d-7108e7f6d9c6" containerName="cinder-api-log" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.819919 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="314f510e-6a17-412d-9a1d-7108e7f6d9c6" containerName="cinder-api" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.819928 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f63f610-92a5-478b-af86-f103565142f0" containerName="barbican-keystone-listener-log" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.819937 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="74392bcb-ba65-45cd-8f9b-32894528aca3" containerName="proxy-httpd" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.819951 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="74392bcb-ba65-45cd-8f9b-32894528aca3" containerName="sg-core" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.819964 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="74392bcb-ba65-45cd-8f9b-32894528aca3" containerName="ceilometer-notification-agent" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.819973 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="32596207-e054-472e-bd8b-9d4bdc150142" containerName="barbican-worker" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.822240 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.825261 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.825589 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.835947 4901 scope.go:117] "RemoveContainer" containerID="44b854e09d91b335a68209e19419540b17e1a05c6ad25ebc1e93d8bfc5df3d71" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.843921 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.845833 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.850755 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.850996 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.851508 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.862201 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.879404 4901 scope.go:117] "RemoveContainer" containerID="2162fd2411a4e190e1cf7740d8fa7de5b26aa916c609b1c39139ae7d1ced08ed" Feb 02 10:56:43 crc kubenswrapper[4901]: E0202 10:56:43.880073 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2162fd2411a4e190e1cf7740d8fa7de5b26aa916c609b1c39139ae7d1ced08ed\": container with ID starting with 2162fd2411a4e190e1cf7740d8fa7de5b26aa916c609b1c39139ae7d1ced08ed not found: ID does not exist" containerID="2162fd2411a4e190e1cf7740d8fa7de5b26aa916c609b1c39139ae7d1ced08ed" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.880134 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2162fd2411a4e190e1cf7740d8fa7de5b26aa916c609b1c39139ae7d1ced08ed"} err="failed to get container status \"2162fd2411a4e190e1cf7740d8fa7de5b26aa916c609b1c39139ae7d1ced08ed\": rpc error: code = NotFound desc = could not find container \"2162fd2411a4e190e1cf7740d8fa7de5b26aa916c609b1c39139ae7d1ced08ed\": container with ID starting with 2162fd2411a4e190e1cf7740d8fa7de5b26aa916c609b1c39139ae7d1ced08ed not found: ID does not exist" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.880171 4901 scope.go:117] "RemoveContainer" containerID="44b854e09d91b335a68209e19419540b17e1a05c6ad25ebc1e93d8bfc5df3d71" Feb 02 10:56:43 crc kubenswrapper[4901]: E0202 10:56:43.880587 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44b854e09d91b335a68209e19419540b17e1a05c6ad25ebc1e93d8bfc5df3d71\": container with ID starting with 44b854e09d91b335a68209e19419540b17e1a05c6ad25ebc1e93d8bfc5df3d71 not found: ID does not exist" containerID="44b854e09d91b335a68209e19419540b17e1a05c6ad25ebc1e93d8bfc5df3d71" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.880628 4901 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"44b854e09d91b335a68209e19419540b17e1a05c6ad25ebc1e93d8bfc5df3d71"} err="failed to get container status \"44b854e09d91b335a68209e19419540b17e1a05c6ad25ebc1e93d8bfc5df3d71\": rpc error: code = NotFound desc = could not find container \"44b854e09d91b335a68209e19419540b17e1a05c6ad25ebc1e93d8bfc5df3d71\": container with ID starting with 44b854e09d91b335a68209e19419540b17e1a05c6ad25ebc1e93d8bfc5df3d71 not found: ID does not exist" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.880659 4901 scope.go:117] "RemoveContainer" containerID="2162fd2411a4e190e1cf7740d8fa7de5b26aa916c609b1c39139ae7d1ced08ed" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.880935 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2162fd2411a4e190e1cf7740d8fa7de5b26aa916c609b1c39139ae7d1ced08ed"} err="failed to get container status \"2162fd2411a4e190e1cf7740d8fa7de5b26aa916c609b1c39139ae7d1ced08ed\": rpc error: code = NotFound desc = could not find container \"2162fd2411a4e190e1cf7740d8fa7de5b26aa916c609b1c39139ae7d1ced08ed\": container with ID starting with 2162fd2411a4e190e1cf7740d8fa7de5b26aa916c609b1c39139ae7d1ced08ed not found: ID does not exist" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.880960 4901 scope.go:117] "RemoveContainer" containerID="44b854e09d91b335a68209e19419540b17e1a05c6ad25ebc1e93d8bfc5df3d71" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.881225 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44b854e09d91b335a68209e19419540b17e1a05c6ad25ebc1e93d8bfc5df3d71"} err="failed to get container status \"44b854e09d91b335a68209e19419540b17e1a05c6ad25ebc1e93d8bfc5df3d71\": rpc error: code = NotFound desc = could not find container \"44b854e09d91b335a68209e19419540b17e1a05c6ad25ebc1e93d8bfc5df3d71\": container with ID starting with 44b854e09d91b335a68209e19419540b17e1a05c6ad25ebc1e93d8bfc5df3d71 not found: ID does not exist" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.887852 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.907086 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ed6baff-0886-4acb-b063-5807ec75d169-scripts\") pod \"ceilometer-0\" (UID: \"1ed6baff-0886-4acb-b063-5807ec75d169\") " pod="openstack/ceilometer-0" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.907157 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6c58\" (UniqueName: \"kubernetes.io/projected/1ed6baff-0886-4acb-b063-5807ec75d169-kube-api-access-b6c58\") pod \"ceilometer-0\" (UID: \"1ed6baff-0886-4acb-b063-5807ec75d169\") " pod="openstack/ceilometer-0" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.907184 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce854019-a345-4a85-9211-eedd7e33dff3-config-data-custom\") pod \"cinder-api-0\" (UID: \"ce854019-a345-4a85-9211-eedd7e33dff3\") " pod="openstack/cinder-api-0" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.907228 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ce854019-a345-4a85-9211-eedd7e33dff3-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ce854019-a345-4a85-9211-eedd7e33dff3\") " pod="openstack/cinder-api-0" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.907259 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce854019-a345-4a85-9211-eedd7e33dff3-logs\") pod \"cinder-api-0\" (UID: \"ce854019-a345-4a85-9211-eedd7e33dff3\") " pod="openstack/cinder-api-0" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.907299 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ed6baff-0886-4acb-b063-5807ec75d169-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1ed6baff-0886-4acb-b063-5807ec75d169\") " pod="openstack/ceilometer-0" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.907329 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ed6baff-0886-4acb-b063-5807ec75d169-log-httpd\") pod \"ceilometer-0\" (UID: \"1ed6baff-0886-4acb-b063-5807ec75d169\") " pod="openstack/ceilometer-0" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.907351 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ed6baff-0886-4acb-b063-5807ec75d169-run-httpd\") pod \"ceilometer-0\" (UID: \"1ed6baff-0886-4acb-b063-5807ec75d169\") " pod="openstack/ceilometer-0" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.907532 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce854019-a345-4a85-9211-eedd7e33dff3-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ce854019-a345-4a85-9211-eedd7e33dff3\") " pod="openstack/cinder-api-0" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.907701 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ce854019-a345-4a85-9211-eedd7e33dff3-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ce854019-a345-4a85-9211-eedd7e33dff3\") " pod="openstack/cinder-api-0" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.907731 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce854019-a345-4a85-9211-eedd7e33dff3-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ce854019-a345-4a85-9211-eedd7e33dff3\") " pod="openstack/cinder-api-0" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.907774 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ed6baff-0886-4acb-b063-5807ec75d169-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1ed6baff-0886-4acb-b063-5807ec75d169\") " pod="openstack/ceilometer-0" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.907862 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce854019-a345-4a85-9211-eedd7e33dff3-scripts\") pod \"cinder-api-0\" (UID: \"ce854019-a345-4a85-9211-eedd7e33dff3\") " pod="openstack/cinder-api-0" Feb 02 10:56:43 crc 
kubenswrapper[4901]: I0202 10:56:43.908184 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvshk\" (UniqueName: \"kubernetes.io/projected/ce854019-a345-4a85-9211-eedd7e33dff3-kube-api-access-jvshk\") pod \"cinder-api-0\" (UID: \"ce854019-a345-4a85-9211-eedd7e33dff3\") " pod="openstack/cinder-api-0" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.908347 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce854019-a345-4a85-9211-eedd7e33dff3-config-data\") pod \"cinder-api-0\" (UID: \"ce854019-a345-4a85-9211-eedd7e33dff3\") " pod="openstack/cinder-api-0" Feb 02 10:56:43 crc kubenswrapper[4901]: I0202 10:56:43.908396 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ed6baff-0886-4acb-b063-5807ec75d169-config-data\") pod \"ceilometer-0\" (UID: \"1ed6baff-0886-4acb-b063-5807ec75d169\") " pod="openstack/ceilometer-0" Feb 02 10:56:44 crc kubenswrapper[4901]: I0202 10:56:44.012075 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce854019-a345-4a85-9211-eedd7e33dff3-config-data\") pod \"cinder-api-0\" (UID: \"ce854019-a345-4a85-9211-eedd7e33dff3\") " pod="openstack/cinder-api-0" Feb 02 10:56:44 crc kubenswrapper[4901]: I0202 10:56:44.012191 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ed6baff-0886-4acb-b063-5807ec75d169-config-data\") pod \"ceilometer-0\" (UID: \"1ed6baff-0886-4acb-b063-5807ec75d169\") " pod="openstack/ceilometer-0" Feb 02 10:56:44 crc kubenswrapper[4901]: I0202 10:56:44.012296 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ed6baff-0886-4acb-b063-5807ec75d169-scripts\") pod \"ceilometer-0\" (UID: \"1ed6baff-0886-4acb-b063-5807ec75d169\") " pod="openstack/ceilometer-0" Feb 02 10:56:44 crc kubenswrapper[4901]: I0202 10:56:44.012336 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6c58\" (UniqueName: \"kubernetes.io/projected/1ed6baff-0886-4acb-b063-5807ec75d169-kube-api-access-b6c58\") pod \"ceilometer-0\" (UID: \"1ed6baff-0886-4acb-b063-5807ec75d169\") " pod="openstack/ceilometer-0" Feb 02 10:56:44 crc kubenswrapper[4901]: I0202 10:56:44.012376 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce854019-a345-4a85-9211-eedd7e33dff3-config-data-custom\") pod \"cinder-api-0\" (UID: \"ce854019-a345-4a85-9211-eedd7e33dff3\") " pod="openstack/cinder-api-0" Feb 02 10:56:44 crc kubenswrapper[4901]: I0202 10:56:44.012412 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce854019-a345-4a85-9211-eedd7e33dff3-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ce854019-a345-4a85-9211-eedd7e33dff3\") " pod="openstack/cinder-api-0" Feb 02 10:56:44 crc kubenswrapper[4901]: I0202 10:56:44.012460 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce854019-a345-4a85-9211-eedd7e33dff3-logs\") pod \"cinder-api-0\" (UID: \"ce854019-a345-4a85-9211-eedd7e33dff3\") " 
pod="openstack/cinder-api-0" Feb 02 10:56:44 crc kubenswrapper[4901]: I0202 10:56:44.012492 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ed6baff-0886-4acb-b063-5807ec75d169-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1ed6baff-0886-4acb-b063-5807ec75d169\") " pod="openstack/ceilometer-0" Feb 02 10:56:44 crc kubenswrapper[4901]: I0202 10:56:44.012539 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ed6baff-0886-4acb-b063-5807ec75d169-log-httpd\") pod \"ceilometer-0\" (UID: \"1ed6baff-0886-4acb-b063-5807ec75d169\") " pod="openstack/ceilometer-0" Feb 02 10:56:44 crc kubenswrapper[4901]: I0202 10:56:44.012609 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ed6baff-0886-4acb-b063-5807ec75d169-run-httpd\") pod \"ceilometer-0\" (UID: \"1ed6baff-0886-4acb-b063-5807ec75d169\") " pod="openstack/ceilometer-0" Feb 02 10:56:44 crc kubenswrapper[4901]: I0202 10:56:44.012676 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce854019-a345-4a85-9211-eedd7e33dff3-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ce854019-a345-4a85-9211-eedd7e33dff3\") " pod="openstack/cinder-api-0" Feb 02 10:56:44 crc kubenswrapper[4901]: I0202 10:56:44.012833 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ce854019-a345-4a85-9211-eedd7e33dff3-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ce854019-a345-4a85-9211-eedd7e33dff3\") " pod="openstack/cinder-api-0" Feb 02 10:56:44 crc kubenswrapper[4901]: I0202 10:56:44.012908 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce854019-a345-4a85-9211-eedd7e33dff3-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ce854019-a345-4a85-9211-eedd7e33dff3\") " pod="openstack/cinder-api-0" Feb 02 10:56:44 crc kubenswrapper[4901]: I0202 10:56:44.012986 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ed6baff-0886-4acb-b063-5807ec75d169-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1ed6baff-0886-4acb-b063-5807ec75d169\") " pod="openstack/ceilometer-0" Feb 02 10:56:44 crc kubenswrapper[4901]: I0202 10:56:44.013103 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce854019-a345-4a85-9211-eedd7e33dff3-scripts\") pod \"cinder-api-0\" (UID: \"ce854019-a345-4a85-9211-eedd7e33dff3\") " pod="openstack/cinder-api-0" Feb 02 10:56:44 crc kubenswrapper[4901]: I0202 10:56:44.013149 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvshk\" (UniqueName: \"kubernetes.io/projected/ce854019-a345-4a85-9211-eedd7e33dff3-kube-api-access-jvshk\") pod \"cinder-api-0\" (UID: \"ce854019-a345-4a85-9211-eedd7e33dff3\") " pod="openstack/cinder-api-0" Feb 02 10:56:44 crc kubenswrapper[4901]: I0202 10:56:44.013474 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce854019-a345-4a85-9211-eedd7e33dff3-logs\") pod \"cinder-api-0\" (UID: \"ce854019-a345-4a85-9211-eedd7e33dff3\") " 
pod="openstack/cinder-api-0" Feb 02 10:56:44 crc kubenswrapper[4901]: I0202 10:56:44.014707 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ce854019-a345-4a85-9211-eedd7e33dff3-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ce854019-a345-4a85-9211-eedd7e33dff3\") " pod="openstack/cinder-api-0" Feb 02 10:56:44 crc kubenswrapper[4901]: I0202 10:56:44.015628 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ed6baff-0886-4acb-b063-5807ec75d169-log-httpd\") pod \"ceilometer-0\" (UID: \"1ed6baff-0886-4acb-b063-5807ec75d169\") " pod="openstack/ceilometer-0" Feb 02 10:56:44 crc kubenswrapper[4901]: I0202 10:56:44.015764 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ed6baff-0886-4acb-b063-5807ec75d169-run-httpd\") pod \"ceilometer-0\" (UID: \"1ed6baff-0886-4acb-b063-5807ec75d169\") " pod="openstack/ceilometer-0" Feb 02 10:56:44 crc kubenswrapper[4901]: I0202 10:56:44.020304 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce854019-a345-4a85-9211-eedd7e33dff3-config-data-custom\") pod \"cinder-api-0\" (UID: \"ce854019-a345-4a85-9211-eedd7e33dff3\") " pod="openstack/cinder-api-0" Feb 02 10:56:44 crc kubenswrapper[4901]: I0202 10:56:44.021965 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce854019-a345-4a85-9211-eedd7e33dff3-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ce854019-a345-4a85-9211-eedd7e33dff3\") " pod="openstack/cinder-api-0" Feb 02 10:56:44 crc kubenswrapper[4901]: I0202 10:56:44.022686 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce854019-a345-4a85-9211-eedd7e33dff3-config-data\") pod \"cinder-api-0\" (UID: \"ce854019-a345-4a85-9211-eedd7e33dff3\") " pod="openstack/cinder-api-0" Feb 02 10:56:44 crc kubenswrapper[4901]: I0202 10:56:44.023470 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ed6baff-0886-4acb-b063-5807ec75d169-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1ed6baff-0886-4acb-b063-5807ec75d169\") " pod="openstack/ceilometer-0" Feb 02 10:56:44 crc kubenswrapper[4901]: I0202 10:56:44.023645 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ed6baff-0886-4acb-b063-5807ec75d169-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1ed6baff-0886-4acb-b063-5807ec75d169\") " pod="openstack/ceilometer-0" Feb 02 10:56:44 crc kubenswrapper[4901]: I0202 10:56:44.024210 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ed6baff-0886-4acb-b063-5807ec75d169-scripts\") pod \"ceilometer-0\" (UID: \"1ed6baff-0886-4acb-b063-5807ec75d169\") " pod="openstack/ceilometer-0" Feb 02 10:56:44 crc kubenswrapper[4901]: I0202 10:56:44.024468 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce854019-a345-4a85-9211-eedd7e33dff3-scripts\") pod \"cinder-api-0\" (UID: \"ce854019-a345-4a85-9211-eedd7e33dff3\") " pod="openstack/cinder-api-0" Feb 02 10:56:44 crc kubenswrapper[4901]: I0202 10:56:44.024483 4901 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ed6baff-0886-4acb-b063-5807ec75d169-config-data\") pod \"ceilometer-0\" (UID: \"1ed6baff-0886-4acb-b063-5807ec75d169\") " pod="openstack/ceilometer-0" Feb 02 10:56:44 crc kubenswrapper[4901]: I0202 10:56:44.025699 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce854019-a345-4a85-9211-eedd7e33dff3-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ce854019-a345-4a85-9211-eedd7e33dff3\") " pod="openstack/cinder-api-0" Feb 02 10:56:44 crc kubenswrapper[4901]: I0202 10:56:44.032246 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce854019-a345-4a85-9211-eedd7e33dff3-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ce854019-a345-4a85-9211-eedd7e33dff3\") " pod="openstack/cinder-api-0" Feb 02 10:56:44 crc kubenswrapper[4901]: I0202 10:56:44.036509 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvshk\" (UniqueName: \"kubernetes.io/projected/ce854019-a345-4a85-9211-eedd7e33dff3-kube-api-access-jvshk\") pod \"cinder-api-0\" (UID: \"ce854019-a345-4a85-9211-eedd7e33dff3\") " pod="openstack/cinder-api-0" Feb 02 10:56:44 crc kubenswrapper[4901]: I0202 10:56:44.038733 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6c58\" (UniqueName: \"kubernetes.io/projected/1ed6baff-0886-4acb-b063-5807ec75d169-kube-api-access-b6c58\") pod \"ceilometer-0\" (UID: \"1ed6baff-0886-4acb-b063-5807ec75d169\") " pod="openstack/ceilometer-0" Feb 02 10:56:44 crc kubenswrapper[4901]: I0202 10:56:44.144436 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:56:44 crc kubenswrapper[4901]: I0202 10:56:44.171636 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 02 10:56:44 crc kubenswrapper[4901]: I0202 10:56:44.703832 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 02 10:56:44 crc kubenswrapper[4901]: I0202 10:56:44.711545 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:56:44 crc kubenswrapper[4901]: W0202 10:56:44.725849 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce854019_a345_4a85_9211_eedd7e33dff3.slice/crio-c2550b52fecbb820edc89b7881235795b6cfb4f300eddca8ec90325775d7893f WatchSource:0}: Error finding container c2550b52fecbb820edc89b7881235795b6cfb4f300eddca8ec90325775d7893f: Status 404 returned error can't find the container with id c2550b52fecbb820edc89b7881235795b6cfb4f300eddca8ec90325775d7893f Feb 02 10:56:45 crc kubenswrapper[4901]: I0202 10:56:45.624977 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ed6baff-0886-4acb-b063-5807ec75d169","Type":"ContainerStarted","Data":"4375ab0577a3d8d217d6433ead090f3707ba287695e865de12354c4ea198567f"} Feb 02 10:56:45 crc kubenswrapper[4901]: I0202 10:56:45.628468 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ce854019-a345-4a85-9211-eedd7e33dff3","Type":"ContainerStarted","Data":"1f114e08e336ab6f126ffd7aff85d62262b18d0acda2571f455fc4b785252b5a"} Feb 02 10:56:45 crc kubenswrapper[4901]: I0202 10:56:45.628506 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ce854019-a345-4a85-9211-eedd7e33dff3","Type":"ContainerStarted","Data":"c2550b52fecbb820edc89b7881235795b6cfb4f300eddca8ec90325775d7893f"} Feb 02 10:56:45 crc kubenswrapper[4901]: I0202 10:56:45.692094 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="314f510e-6a17-412d-9a1d-7108e7f6d9c6" path="/var/lib/kubelet/pods/314f510e-6a17-412d-9a1d-7108e7f6d9c6/volumes" Feb 02 10:56:45 crc kubenswrapper[4901]: I0202 10:56:45.693644 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74392bcb-ba65-45cd-8f9b-32894528aca3" path="/var/lib/kubelet/pods/74392bcb-ba65-45cd-8f9b-32894528aca3/volumes" Feb 02 10:56:46 crc kubenswrapper[4901]: I0202 10:56:46.180816 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7f784b5584-t7x4s" Feb 02 10:56:46 crc kubenswrapper[4901]: I0202 10:56:46.453851 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6f4d4785b9-9m2tl"] Feb 02 10:56:46 crc kubenswrapper[4901]: I0202 10:56:46.454313 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6f4d4785b9-9m2tl" podUID="b97ea9e5-3049-4ba7-9cc1-2165d15a3746" containerName="neutron-api" containerID="cri-o://c370f81bb014213fa8f0ca4747e7a27481b3e911d5436628a3ac6d65de5cab15" gracePeriod=30 Feb 02 10:56:46 crc kubenswrapper[4901]: I0202 10:56:46.466272 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6f4d4785b9-9m2tl" podUID="b97ea9e5-3049-4ba7-9cc1-2165d15a3746" containerName="neutron-httpd" containerID="cri-o://a49c3f0e054068f2a9fb809af959494d324bc38d409d0642f47ec06aafdac2ad" gracePeriod=30 Feb 02 10:56:46 crc kubenswrapper[4901]: I0202 10:56:46.516271 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7d89c9dff9-fzvln"] Feb 02 10:56:46 crc kubenswrapper[4901]: I0202 
10:56:46.523756 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7d89c9dff9-fzvln" Feb 02 10:56:46 crc kubenswrapper[4901]: I0202 10:56:46.536291 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7d89c9dff9-fzvln"] Feb 02 10:56:46 crc kubenswrapper[4901]: I0202 10:56:46.573103 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-6f4d4785b9-9m2tl" podUID="b97ea9e5-3049-4ba7-9cc1-2165d15a3746" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.153:9696/\": read tcp 10.217.0.2:52984->10.217.0.153:9696: read: connection reset by peer" Feb 02 10:56:46 crc kubenswrapper[4901]: I0202 10:56:46.596931 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/172bde5c-4b76-4f03-b899-4a395581a9f5-httpd-config\") pod \"neutron-7d89c9dff9-fzvln\" (UID: \"172bde5c-4b76-4f03-b899-4a395581a9f5\") " pod="openstack/neutron-7d89c9dff9-fzvln" Feb 02 10:56:46 crc kubenswrapper[4901]: I0202 10:56:46.596980 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/172bde5c-4b76-4f03-b899-4a395581a9f5-ovndb-tls-certs\") pod \"neutron-7d89c9dff9-fzvln\" (UID: \"172bde5c-4b76-4f03-b899-4a395581a9f5\") " pod="openstack/neutron-7d89c9dff9-fzvln" Feb 02 10:56:46 crc kubenswrapper[4901]: I0202 10:56:46.597081 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/172bde5c-4b76-4f03-b899-4a395581a9f5-combined-ca-bundle\") pod \"neutron-7d89c9dff9-fzvln\" (UID: \"172bde5c-4b76-4f03-b899-4a395581a9f5\") " pod="openstack/neutron-7d89c9dff9-fzvln" Feb 02 10:56:46 crc kubenswrapper[4901]: I0202 10:56:46.597136 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/172bde5c-4b76-4f03-b899-4a395581a9f5-public-tls-certs\") pod \"neutron-7d89c9dff9-fzvln\" (UID: \"172bde5c-4b76-4f03-b899-4a395581a9f5\") " pod="openstack/neutron-7d89c9dff9-fzvln" Feb 02 10:56:46 crc kubenswrapper[4901]: I0202 10:56:46.597161 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96fdb\" (UniqueName: \"kubernetes.io/projected/172bde5c-4b76-4f03-b899-4a395581a9f5-kube-api-access-96fdb\") pod \"neutron-7d89c9dff9-fzvln\" (UID: \"172bde5c-4b76-4f03-b899-4a395581a9f5\") " pod="openstack/neutron-7d89c9dff9-fzvln" Feb 02 10:56:46 crc kubenswrapper[4901]: I0202 10:56:46.597249 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/172bde5c-4b76-4f03-b899-4a395581a9f5-internal-tls-certs\") pod \"neutron-7d89c9dff9-fzvln\" (UID: \"172bde5c-4b76-4f03-b899-4a395581a9f5\") " pod="openstack/neutron-7d89c9dff9-fzvln" Feb 02 10:56:46 crc kubenswrapper[4901]: I0202 10:56:46.597311 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/172bde5c-4b76-4f03-b899-4a395581a9f5-config\") pod \"neutron-7d89c9dff9-fzvln\" (UID: \"172bde5c-4b76-4f03-b899-4a395581a9f5\") " pod="openstack/neutron-7d89c9dff9-fzvln" Feb 02 10:56:46 crc kubenswrapper[4901]: I0202 10:56:46.640114 4901 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ce854019-a345-4a85-9211-eedd7e33dff3","Type":"ContainerStarted","Data":"07b16d6d725e7e7df504c73b8faff99b4482338b6f164c467a8ddaf0dcfd6c0a"} Feb 02 10:56:46 crc kubenswrapper[4901]: I0202 10:56:46.640749 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 02 10:56:46 crc kubenswrapper[4901]: I0202 10:56:46.643556 4901 generic.go:334] "Generic (PLEG): container finished" podID="b97ea9e5-3049-4ba7-9cc1-2165d15a3746" containerID="a49c3f0e054068f2a9fb809af959494d324bc38d409d0642f47ec06aafdac2ad" exitCode=0 Feb 02 10:56:46 crc kubenswrapper[4901]: I0202 10:56:46.643606 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f4d4785b9-9m2tl" event={"ID":"b97ea9e5-3049-4ba7-9cc1-2165d15a3746","Type":"ContainerDied","Data":"a49c3f0e054068f2a9fb809af959494d324bc38d409d0642f47ec06aafdac2ad"} Feb 02 10:56:46 crc kubenswrapper[4901]: I0202 10:56:46.645377 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ed6baff-0886-4acb-b063-5807ec75d169","Type":"ContainerStarted","Data":"31145ddf5c2b8a55b22db930db9ae275fd40ebe87f6da75bfe738943ad0660a7"} Feb 02 10:56:46 crc kubenswrapper[4901]: I0202 10:56:46.645417 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ed6baff-0886-4acb-b063-5807ec75d169","Type":"ContainerStarted","Data":"007aa571a93db60a7baf10afed2a0b4aba84468020f47a78b334dba3f481bcaa"} Feb 02 10:56:46 crc kubenswrapper[4901]: I0202 10:56:46.660173 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.6601567839999998 podStartE2EDuration="3.660156784s" podCreationTimestamp="2026-02-02 10:56:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:56:46.657848356 +0000 UTC m=+1093.676188452" watchObservedRunningTime="2026-02-02 10:56:46.660156784 +0000 UTC m=+1093.678496880" Feb 02 10:56:46 crc kubenswrapper[4901]: I0202 10:56:46.699394 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/172bde5c-4b76-4f03-b899-4a395581a9f5-httpd-config\") pod \"neutron-7d89c9dff9-fzvln\" (UID: \"172bde5c-4b76-4f03-b899-4a395581a9f5\") " pod="openstack/neutron-7d89c9dff9-fzvln" Feb 02 10:56:46 crc kubenswrapper[4901]: I0202 10:56:46.699445 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/172bde5c-4b76-4f03-b899-4a395581a9f5-ovndb-tls-certs\") pod \"neutron-7d89c9dff9-fzvln\" (UID: \"172bde5c-4b76-4f03-b899-4a395581a9f5\") " pod="openstack/neutron-7d89c9dff9-fzvln" Feb 02 10:56:46 crc kubenswrapper[4901]: I0202 10:56:46.699578 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/172bde5c-4b76-4f03-b899-4a395581a9f5-combined-ca-bundle\") pod \"neutron-7d89c9dff9-fzvln\" (UID: \"172bde5c-4b76-4f03-b899-4a395581a9f5\") " pod="openstack/neutron-7d89c9dff9-fzvln" Feb 02 10:56:46 crc kubenswrapper[4901]: I0202 10:56:46.699667 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/172bde5c-4b76-4f03-b899-4a395581a9f5-public-tls-certs\") pod \"neutron-7d89c9dff9-fzvln\" (UID: 
\"172bde5c-4b76-4f03-b899-4a395581a9f5\") " pod="openstack/neutron-7d89c9dff9-fzvln" Feb 02 10:56:46 crc kubenswrapper[4901]: I0202 10:56:46.699689 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96fdb\" (UniqueName: \"kubernetes.io/projected/172bde5c-4b76-4f03-b899-4a395581a9f5-kube-api-access-96fdb\") pod \"neutron-7d89c9dff9-fzvln\" (UID: \"172bde5c-4b76-4f03-b899-4a395581a9f5\") " pod="openstack/neutron-7d89c9dff9-fzvln" Feb 02 10:56:46 crc kubenswrapper[4901]: I0202 10:56:46.699720 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/172bde5c-4b76-4f03-b899-4a395581a9f5-internal-tls-certs\") pod \"neutron-7d89c9dff9-fzvln\" (UID: \"172bde5c-4b76-4f03-b899-4a395581a9f5\") " pod="openstack/neutron-7d89c9dff9-fzvln" Feb 02 10:56:46 crc kubenswrapper[4901]: I0202 10:56:46.699759 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/172bde5c-4b76-4f03-b899-4a395581a9f5-config\") pod \"neutron-7d89c9dff9-fzvln\" (UID: \"172bde5c-4b76-4f03-b899-4a395581a9f5\") " pod="openstack/neutron-7d89c9dff9-fzvln" Feb 02 10:56:46 crc kubenswrapper[4901]: I0202 10:56:46.705497 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/172bde5c-4b76-4f03-b899-4a395581a9f5-config\") pod \"neutron-7d89c9dff9-fzvln\" (UID: \"172bde5c-4b76-4f03-b899-4a395581a9f5\") " pod="openstack/neutron-7d89c9dff9-fzvln" Feb 02 10:56:46 crc kubenswrapper[4901]: I0202 10:56:46.705965 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/172bde5c-4b76-4f03-b899-4a395581a9f5-httpd-config\") pod \"neutron-7d89c9dff9-fzvln\" (UID: \"172bde5c-4b76-4f03-b899-4a395581a9f5\") " pod="openstack/neutron-7d89c9dff9-fzvln" Feb 02 10:56:46 crc kubenswrapper[4901]: I0202 10:56:46.712394 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/172bde5c-4b76-4f03-b899-4a395581a9f5-internal-tls-certs\") pod \"neutron-7d89c9dff9-fzvln\" (UID: \"172bde5c-4b76-4f03-b899-4a395581a9f5\") " pod="openstack/neutron-7d89c9dff9-fzvln" Feb 02 10:56:46 crc kubenswrapper[4901]: I0202 10:56:46.712785 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/172bde5c-4b76-4f03-b899-4a395581a9f5-combined-ca-bundle\") pod \"neutron-7d89c9dff9-fzvln\" (UID: \"172bde5c-4b76-4f03-b899-4a395581a9f5\") " pod="openstack/neutron-7d89c9dff9-fzvln" Feb 02 10:56:46 crc kubenswrapper[4901]: I0202 10:56:46.712922 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/172bde5c-4b76-4f03-b899-4a395581a9f5-ovndb-tls-certs\") pod \"neutron-7d89c9dff9-fzvln\" (UID: \"172bde5c-4b76-4f03-b899-4a395581a9f5\") " pod="openstack/neutron-7d89c9dff9-fzvln" Feb 02 10:56:46 crc kubenswrapper[4901]: I0202 10:56:46.713215 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/172bde5c-4b76-4f03-b899-4a395581a9f5-public-tls-certs\") pod \"neutron-7d89c9dff9-fzvln\" (UID: \"172bde5c-4b76-4f03-b899-4a395581a9f5\") " pod="openstack/neutron-7d89c9dff9-fzvln" Feb 02 10:56:46 crc kubenswrapper[4901]: I0202 10:56:46.718694 4901 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-96fdb\" (UniqueName: \"kubernetes.io/projected/172bde5c-4b76-4f03-b899-4a395581a9f5-kube-api-access-96fdb\") pod \"neutron-7d89c9dff9-fzvln\" (UID: \"172bde5c-4b76-4f03-b899-4a395581a9f5\") " pod="openstack/neutron-7d89c9dff9-fzvln" Feb 02 10:56:46 crc kubenswrapper[4901]: I0202 10:56:46.834921 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-b6f6785c4-59xw6" podUID="aa18cb0b-f1e0-4a57-90e0-daeede8334de" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": read tcp 10.217.0.2:47052->10.217.0.161:9311: read: connection reset by peer" Feb 02 10:56:46 crc kubenswrapper[4901]: I0202 10:56:46.835399 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-b6f6785c4-59xw6" podUID="aa18cb0b-f1e0-4a57-90e0-daeede8334de" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": read tcp 10.217.0.2:47064->10.217.0.161:9311: read: connection reset by peer" Feb 02 10:56:46 crc kubenswrapper[4901]: I0202 10:56:46.857732 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7d89c9dff9-fzvln" Feb 02 10:56:47 crc kubenswrapper[4901]: I0202 10:56:47.472167 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-b6f6785c4-59xw6" Feb 02 10:56:47 crc kubenswrapper[4901]: I0202 10:56:47.528233 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nlwn\" (UniqueName: \"kubernetes.io/projected/aa18cb0b-f1e0-4a57-90e0-daeede8334de-kube-api-access-5nlwn\") pod \"aa18cb0b-f1e0-4a57-90e0-daeede8334de\" (UID: \"aa18cb0b-f1e0-4a57-90e0-daeede8334de\") " Feb 02 10:56:47 crc kubenswrapper[4901]: I0202 10:56:47.528362 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa18cb0b-f1e0-4a57-90e0-daeede8334de-config-data\") pod \"aa18cb0b-f1e0-4a57-90e0-daeede8334de\" (UID: \"aa18cb0b-f1e0-4a57-90e0-daeede8334de\") " Feb 02 10:56:47 crc kubenswrapper[4901]: I0202 10:56:47.528440 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa18cb0b-f1e0-4a57-90e0-daeede8334de-combined-ca-bundle\") pod \"aa18cb0b-f1e0-4a57-90e0-daeede8334de\" (UID: \"aa18cb0b-f1e0-4a57-90e0-daeede8334de\") " Feb 02 10:56:47 crc kubenswrapper[4901]: I0202 10:56:47.528511 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa18cb0b-f1e0-4a57-90e0-daeede8334de-logs\") pod \"aa18cb0b-f1e0-4a57-90e0-daeede8334de\" (UID: \"aa18cb0b-f1e0-4a57-90e0-daeede8334de\") " Feb 02 10:56:47 crc kubenswrapper[4901]: I0202 10:56:47.529033 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa18cb0b-f1e0-4a57-90e0-daeede8334de-config-data-custom\") pod \"aa18cb0b-f1e0-4a57-90e0-daeede8334de\" (UID: \"aa18cb0b-f1e0-4a57-90e0-daeede8334de\") " Feb 02 10:56:47 crc kubenswrapper[4901]: I0202 10:56:47.529339 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa18cb0b-f1e0-4a57-90e0-daeede8334de-logs" (OuterVolumeSpecName: "logs") pod "aa18cb0b-f1e0-4a57-90e0-daeede8334de" (UID: "aa18cb0b-f1e0-4a57-90e0-daeede8334de"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:56:47 crc kubenswrapper[4901]: I0202 10:56:47.529571 4901 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa18cb0b-f1e0-4a57-90e0-daeede8334de-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:47 crc kubenswrapper[4901]: I0202 10:56:47.538175 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa18cb0b-f1e0-4a57-90e0-daeede8334de-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "aa18cb0b-f1e0-4a57-90e0-daeede8334de" (UID: "aa18cb0b-f1e0-4a57-90e0-daeede8334de"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:47 crc kubenswrapper[4901]: I0202 10:56:47.547948 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa18cb0b-f1e0-4a57-90e0-daeede8334de-kube-api-access-5nlwn" (OuterVolumeSpecName: "kube-api-access-5nlwn") pod "aa18cb0b-f1e0-4a57-90e0-daeede8334de" (UID: "aa18cb0b-f1e0-4a57-90e0-daeede8334de"). InnerVolumeSpecName "kube-api-access-5nlwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:56:47 crc kubenswrapper[4901]: I0202 10:56:47.572753 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa18cb0b-f1e0-4a57-90e0-daeede8334de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa18cb0b-f1e0-4a57-90e0-daeede8334de" (UID: "aa18cb0b-f1e0-4a57-90e0-daeede8334de"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:47 crc kubenswrapper[4901]: I0202 10:56:47.599404 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa18cb0b-f1e0-4a57-90e0-daeede8334de-config-data" (OuterVolumeSpecName: "config-data") pod "aa18cb0b-f1e0-4a57-90e0-daeede8334de" (UID: "aa18cb0b-f1e0-4a57-90e0-daeede8334de"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:47 crc kubenswrapper[4901]: I0202 10:56:47.631466 4901 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa18cb0b-f1e0-4a57-90e0-daeede8334de-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:47 crc kubenswrapper[4901]: I0202 10:56:47.631500 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nlwn\" (UniqueName: \"kubernetes.io/projected/aa18cb0b-f1e0-4a57-90e0-daeede8334de-kube-api-access-5nlwn\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:47 crc kubenswrapper[4901]: I0202 10:56:47.631511 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa18cb0b-f1e0-4a57-90e0-daeede8334de-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:47 crc kubenswrapper[4901]: I0202 10:56:47.631522 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa18cb0b-f1e0-4a57-90e0-daeede8334de-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:47 crc kubenswrapper[4901]: I0202 10:56:47.660900 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ed6baff-0886-4acb-b063-5807ec75d169","Type":"ContainerStarted","Data":"137cd9c17ee4ac57c1c302e50c86191b1f409a6540089c72fce7948326623312"} Feb 02 10:56:47 crc kubenswrapper[4901]: I0202 10:56:47.662550 4901 generic.go:334] "Generic (PLEG): container finished" podID="aa18cb0b-f1e0-4a57-90e0-daeede8334de" containerID="545ec88fcdea2c4cf0f01ec1c6abe42c1b30161f98a6a90046a26b78b265743c" exitCode=0 Feb 02 10:56:47 crc kubenswrapper[4901]: I0202 10:56:47.663717 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-b6f6785c4-59xw6" Feb 02 10:56:47 crc kubenswrapper[4901]: I0202 10:56:47.664794 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b6f6785c4-59xw6" event={"ID":"aa18cb0b-f1e0-4a57-90e0-daeede8334de","Type":"ContainerDied","Data":"545ec88fcdea2c4cf0f01ec1c6abe42c1b30161f98a6a90046a26b78b265743c"} Feb 02 10:56:47 crc kubenswrapper[4901]: I0202 10:56:47.664889 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b6f6785c4-59xw6" event={"ID":"aa18cb0b-f1e0-4a57-90e0-daeede8334de","Type":"ContainerDied","Data":"be7217ed0843fe1b0dd834487935a7849f3f9277fa0672cce1b79674f0f7b710"} Feb 02 10:56:47 crc kubenswrapper[4901]: I0202 10:56:47.664916 4901 scope.go:117] "RemoveContainer" containerID="545ec88fcdea2c4cf0f01ec1c6abe42c1b30161f98a6a90046a26b78b265743c" Feb 02 10:56:47 crc kubenswrapper[4901]: I0202 10:56:47.713690 4901 scope.go:117] "RemoveContainer" containerID="c992c90156378e0a9891a669fd7126376d08a9b1c60d0da42356ead9797d5b49" Feb 02 10:56:47 crc kubenswrapper[4901]: I0202 10:56:47.713843 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-b6f6785c4-59xw6"] Feb 02 10:56:47 crc kubenswrapper[4901]: I0202 10:56:47.724316 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-b6f6785c4-59xw6"] Feb 02 10:56:47 crc kubenswrapper[4901]: I0202 10:56:47.761482 4901 scope.go:117] "RemoveContainer" containerID="545ec88fcdea2c4cf0f01ec1c6abe42c1b30161f98a6a90046a26b78b265743c" Feb 02 10:56:47 crc kubenswrapper[4901]: E0202 10:56:47.761940 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"545ec88fcdea2c4cf0f01ec1c6abe42c1b30161f98a6a90046a26b78b265743c\": container with ID starting with 545ec88fcdea2c4cf0f01ec1c6abe42c1b30161f98a6a90046a26b78b265743c not found: ID does not exist" containerID="545ec88fcdea2c4cf0f01ec1c6abe42c1b30161f98a6a90046a26b78b265743c" Feb 02 10:56:47 crc kubenswrapper[4901]: I0202 10:56:47.761984 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"545ec88fcdea2c4cf0f01ec1c6abe42c1b30161f98a6a90046a26b78b265743c"} err="failed to get container status \"545ec88fcdea2c4cf0f01ec1c6abe42c1b30161f98a6a90046a26b78b265743c\": rpc error: code = NotFound desc = could not find container \"545ec88fcdea2c4cf0f01ec1c6abe42c1b30161f98a6a90046a26b78b265743c\": container with ID starting with 545ec88fcdea2c4cf0f01ec1c6abe42c1b30161f98a6a90046a26b78b265743c not found: ID does not exist" Feb 02 10:56:47 crc kubenswrapper[4901]: I0202 10:56:47.762012 4901 scope.go:117] "RemoveContainer" containerID="c992c90156378e0a9891a669fd7126376d08a9b1c60d0da42356ead9797d5b49" Feb 02 10:56:47 crc kubenswrapper[4901]: E0202 10:56:47.762435 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c992c90156378e0a9891a669fd7126376d08a9b1c60d0da42356ead9797d5b49\": container with ID starting with c992c90156378e0a9891a669fd7126376d08a9b1c60d0da42356ead9797d5b49 not found: ID does not exist" containerID="c992c90156378e0a9891a669fd7126376d08a9b1c60d0da42356ead9797d5b49" Feb 02 10:56:47 crc kubenswrapper[4901]: I0202 10:56:47.762465 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c992c90156378e0a9891a669fd7126376d08a9b1c60d0da42356ead9797d5b49"} err="failed to get container status 
\"c992c90156378e0a9891a669fd7126376d08a9b1c60d0da42356ead9797d5b49\": rpc error: code = NotFound desc = could not find container \"c992c90156378e0a9891a669fd7126376d08a9b1c60d0da42356ead9797d5b49\": container with ID starting with c992c90156378e0a9891a669fd7126376d08a9b1c60d0da42356ead9797d5b49 not found: ID does not exist" Feb 02 10:56:47 crc kubenswrapper[4901]: I0202 10:56:47.777998 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7d89c9dff9-fzvln"] Feb 02 10:56:47 crc kubenswrapper[4901]: I0202 10:56:47.898636 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6578955fd5-pt7c2" Feb 02 10:56:47 crc kubenswrapper[4901]: I0202 10:56:47.980823 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-vk7th"] Feb 02 10:56:47 crc kubenswrapper[4901]: I0202 10:56:47.981634 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-848cf88cfc-vk7th" podUID="06feef03-ccd0-4fa8-8939-852df217d3a1" containerName="dnsmasq-dns" containerID="cri-o://1abb2ef89c29c1834851bbf1fda361b639650d73cf9a4bab2f82d365be27be05" gracePeriod=10 Feb 02 10:56:48 crc kubenswrapper[4901]: I0202 10:56:48.016624 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 02 10:56:48 crc kubenswrapper[4901]: I0202 10:56:48.074915 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 02 10:56:48 crc kubenswrapper[4901]: I0202 10:56:48.632444 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-vk7th" Feb 02 10:56:48 crc kubenswrapper[4901]: I0202 10:56:48.696203 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/06feef03-ccd0-4fa8-8939-852df217d3a1-dns-swift-storage-0\") pod \"06feef03-ccd0-4fa8-8939-852df217d3a1\" (UID: \"06feef03-ccd0-4fa8-8939-852df217d3a1\") " Feb 02 10:56:48 crc kubenswrapper[4901]: I0202 10:56:48.696362 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sssgf\" (UniqueName: \"kubernetes.io/projected/06feef03-ccd0-4fa8-8939-852df217d3a1-kube-api-access-sssgf\") pod \"06feef03-ccd0-4fa8-8939-852df217d3a1\" (UID: \"06feef03-ccd0-4fa8-8939-852df217d3a1\") " Feb 02 10:56:48 crc kubenswrapper[4901]: I0202 10:56:48.696403 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06feef03-ccd0-4fa8-8939-852df217d3a1-config\") pod \"06feef03-ccd0-4fa8-8939-852df217d3a1\" (UID: \"06feef03-ccd0-4fa8-8939-852df217d3a1\") " Feb 02 10:56:48 crc kubenswrapper[4901]: I0202 10:56:48.696455 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06feef03-ccd0-4fa8-8939-852df217d3a1-ovsdbserver-sb\") pod \"06feef03-ccd0-4fa8-8939-852df217d3a1\" (UID: \"06feef03-ccd0-4fa8-8939-852df217d3a1\") " Feb 02 10:56:48 crc kubenswrapper[4901]: I0202 10:56:48.696611 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06feef03-ccd0-4fa8-8939-852df217d3a1-dns-svc\") pod \"06feef03-ccd0-4fa8-8939-852df217d3a1\" (UID: \"06feef03-ccd0-4fa8-8939-852df217d3a1\") " Feb 02 10:56:48 crc kubenswrapper[4901]: I0202 10:56:48.696644 4901 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06feef03-ccd0-4fa8-8939-852df217d3a1-ovsdbserver-nb\") pod \"06feef03-ccd0-4fa8-8939-852df217d3a1\" (UID: \"06feef03-ccd0-4fa8-8939-852df217d3a1\") " Feb 02 10:56:48 crc kubenswrapper[4901]: I0202 10:56:48.698076 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d89c9dff9-fzvln" event={"ID":"172bde5c-4b76-4f03-b899-4a395581a9f5","Type":"ContainerStarted","Data":"5cd8ef90cd37ee182af655d998476388b89089609356c6b45348d69df59ee733"} Feb 02 10:56:48 crc kubenswrapper[4901]: I0202 10:56:48.698136 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d89c9dff9-fzvln" event={"ID":"172bde5c-4b76-4f03-b899-4a395581a9f5","Type":"ContainerStarted","Data":"1177e309244b90f4682ae5b96fb90a24812c5f1054af6a68bd7f80b77635db53"} Feb 02 10:56:48 crc kubenswrapper[4901]: I0202 10:56:48.698148 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d89c9dff9-fzvln" event={"ID":"172bde5c-4b76-4f03-b899-4a395581a9f5","Type":"ContainerStarted","Data":"602e7237d902f817fbfc5f1a5ce3068e9c4fb937b256c512a60abf5be666fd05"} Feb 02 10:56:48 crc kubenswrapper[4901]: I0202 10:56:48.698207 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7d89c9dff9-fzvln" Feb 02 10:56:48 crc kubenswrapper[4901]: I0202 10:56:48.744338 4901 generic.go:334] "Generic (PLEG): container finished" podID="06feef03-ccd0-4fa8-8939-852df217d3a1" containerID="1abb2ef89c29c1834851bbf1fda361b639650d73cf9a4bab2f82d365be27be05" exitCode=0 Feb 02 10:56:48 crc kubenswrapper[4901]: I0202 10:56:48.744815 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="3ba1dd4f-2e2c-4055-a605-35de3b73bdb4" containerName="cinder-scheduler" containerID="cri-o://c5cd2d85c9b795fc3d1265b4a4a0d1d7088338f2bb34830af938f5add3eeefd9" gracePeriod=30 Feb 02 10:56:48 crc kubenswrapper[4901]: I0202 10:56:48.745353 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="3ba1dd4f-2e2c-4055-a605-35de3b73bdb4" containerName="probe" containerID="cri-o://66d06e57344ff5010ff9cfe4e8fe0ca5a65984af2e4f8844d83ca0d6ed9bac20" gracePeriod=30 Feb 02 10:56:48 crc kubenswrapper[4901]: I0202 10:56:48.746269 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-vk7th" event={"ID":"06feef03-ccd0-4fa8-8939-852df217d3a1","Type":"ContainerDied","Data":"1abb2ef89c29c1834851bbf1fda361b639650d73cf9a4bab2f82d365be27be05"} Feb 02 10:56:48 crc kubenswrapper[4901]: I0202 10:56:48.746314 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-vk7th" event={"ID":"06feef03-ccd0-4fa8-8939-852df217d3a1","Type":"ContainerDied","Data":"4dbb609ebaa299a4e43916e94be6fb6bd63d03b345698adc9669f7067825ce47"} Feb 02 10:56:48 crc kubenswrapper[4901]: I0202 10:56:48.746359 4901 scope.go:117] "RemoveContainer" containerID="1abb2ef89c29c1834851bbf1fda361b639650d73cf9a4bab2f82d365be27be05" Feb 02 10:56:48 crc kubenswrapper[4901]: I0202 10:56:48.747000 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-6f4d4785b9-9m2tl" podUID="b97ea9e5-3049-4ba7-9cc1-2165d15a3746" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.153:9696/\": dial tcp 10.217.0.153:9696: connect: connection refused" Feb 02 10:56:48 crc kubenswrapper[4901]: I0202 
10:56:48.748041 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-vk7th" Feb 02 10:56:48 crc kubenswrapper[4901]: I0202 10:56:48.760288 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06feef03-ccd0-4fa8-8939-852df217d3a1-kube-api-access-sssgf" (OuterVolumeSpecName: "kube-api-access-sssgf") pod "06feef03-ccd0-4fa8-8939-852df217d3a1" (UID: "06feef03-ccd0-4fa8-8939-852df217d3a1"). InnerVolumeSpecName "kube-api-access-sssgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:56:48 crc kubenswrapper[4901]: I0202 10:56:48.782238 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06feef03-ccd0-4fa8-8939-852df217d3a1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "06feef03-ccd0-4fa8-8939-852df217d3a1" (UID: "06feef03-ccd0-4fa8-8939-852df217d3a1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:56:48 crc kubenswrapper[4901]: I0202 10:56:48.782876 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06feef03-ccd0-4fa8-8939-852df217d3a1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "06feef03-ccd0-4fa8-8939-852df217d3a1" (UID: "06feef03-ccd0-4fa8-8939-852df217d3a1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:56:48 crc kubenswrapper[4901]: I0202 10:56:48.797854 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06feef03-ccd0-4fa8-8939-852df217d3a1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "06feef03-ccd0-4fa8-8939-852df217d3a1" (UID: "06feef03-ccd0-4fa8-8939-852df217d3a1"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:56:48 crc kubenswrapper[4901]: I0202 10:56:48.800820 4901 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/06feef03-ccd0-4fa8-8939-852df217d3a1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:48 crc kubenswrapper[4901]: I0202 10:56:48.800852 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sssgf\" (UniqueName: \"kubernetes.io/projected/06feef03-ccd0-4fa8-8939-852df217d3a1-kube-api-access-sssgf\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:48 crc kubenswrapper[4901]: I0202 10:56:48.800864 4901 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06feef03-ccd0-4fa8-8939-852df217d3a1-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:48 crc kubenswrapper[4901]: I0202 10:56:48.800874 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06feef03-ccd0-4fa8-8939-852df217d3a1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:48 crc kubenswrapper[4901]: I0202 10:56:48.802160 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7d89c9dff9-fzvln" podStartSLOduration=2.802138344 podStartE2EDuration="2.802138344s" podCreationTimestamp="2026-02-02 10:56:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:56:48.738577045 +0000 UTC m=+1095.756917141" watchObservedRunningTime="2026-02-02 10:56:48.802138344 +0000 UTC m=+1095.820478440" Feb 02 10:56:48 crc kubenswrapper[4901]: I0202 10:56:48.821209 4901 scope.go:117] "RemoveContainer" containerID="550b2b7a89c4b023682eb4dc2ca89c79224934f33d161e90f0ecb685dc7b303f" Feb 02 10:56:48 crc kubenswrapper[4901]: I0202 10:56:48.848844 4901 scope.go:117] "RemoveContainer" containerID="1abb2ef89c29c1834851bbf1fda361b639650d73cf9a4bab2f82d365be27be05" Feb 02 10:56:48 crc kubenswrapper[4901]: E0202 10:56:48.849906 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1abb2ef89c29c1834851bbf1fda361b639650d73cf9a4bab2f82d365be27be05\": container with ID starting with 1abb2ef89c29c1834851bbf1fda361b639650d73cf9a4bab2f82d365be27be05 not found: ID does not exist" containerID="1abb2ef89c29c1834851bbf1fda361b639650d73cf9a4bab2f82d365be27be05" Feb 02 10:56:48 crc kubenswrapper[4901]: I0202 10:56:48.849949 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1abb2ef89c29c1834851bbf1fda361b639650d73cf9a4bab2f82d365be27be05"} err="failed to get container status \"1abb2ef89c29c1834851bbf1fda361b639650d73cf9a4bab2f82d365be27be05\": rpc error: code = NotFound desc = could not find container \"1abb2ef89c29c1834851bbf1fda361b639650d73cf9a4bab2f82d365be27be05\": container with ID starting with 1abb2ef89c29c1834851bbf1fda361b639650d73cf9a4bab2f82d365be27be05 not found: ID does not exist" Feb 02 10:56:48 crc kubenswrapper[4901]: I0202 10:56:48.849972 4901 scope.go:117] "RemoveContainer" containerID="550b2b7a89c4b023682eb4dc2ca89c79224934f33d161e90f0ecb685dc7b303f" Feb 02 10:56:48 crc kubenswrapper[4901]: E0202 10:56:48.850243 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"550b2b7a89c4b023682eb4dc2ca89c79224934f33d161e90f0ecb685dc7b303f\": container with ID starting with 550b2b7a89c4b023682eb4dc2ca89c79224934f33d161e90f0ecb685dc7b303f not found: ID does not exist" containerID="550b2b7a89c4b023682eb4dc2ca89c79224934f33d161e90f0ecb685dc7b303f" Feb 02 10:56:48 crc kubenswrapper[4901]: I0202 10:56:48.850340 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"550b2b7a89c4b023682eb4dc2ca89c79224934f33d161e90f0ecb685dc7b303f"} err="failed to get container status \"550b2b7a89c4b023682eb4dc2ca89c79224934f33d161e90f0ecb685dc7b303f\": rpc error: code = NotFound desc = could not find container \"550b2b7a89c4b023682eb4dc2ca89c79224934f33d161e90f0ecb685dc7b303f\": container with ID starting with 550b2b7a89c4b023682eb4dc2ca89c79224934f33d161e90f0ecb685dc7b303f not found: ID does not exist" Feb 02 10:56:48 crc kubenswrapper[4901]: I0202 10:56:48.853885 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06feef03-ccd0-4fa8-8939-852df217d3a1-config" (OuterVolumeSpecName: "config") pod "06feef03-ccd0-4fa8-8939-852df217d3a1" (UID: "06feef03-ccd0-4fa8-8939-852df217d3a1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:56:48 crc kubenswrapper[4901]: I0202 10:56:48.867924 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06feef03-ccd0-4fa8-8939-852df217d3a1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "06feef03-ccd0-4fa8-8939-852df217d3a1" (UID: "06feef03-ccd0-4fa8-8939-852df217d3a1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:56:48 crc kubenswrapper[4901]: I0202 10:56:48.902095 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06feef03-ccd0-4fa8-8939-852df217d3a1-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:48 crc kubenswrapper[4901]: I0202 10:56:48.902127 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06feef03-ccd0-4fa8-8939-852df217d3a1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:49 crc kubenswrapper[4901]: I0202 10:56:49.094610 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-vk7th"] Feb 02 10:56:49 crc kubenswrapper[4901]: I0202 10:56:49.103712 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-vk7th"] Feb 02 10:56:49 crc kubenswrapper[4901]: I0202 10:56:49.690261 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06feef03-ccd0-4fa8-8939-852df217d3a1" path="/var/lib/kubelet/pods/06feef03-ccd0-4fa8-8939-852df217d3a1/volumes" Feb 02 10:56:49 crc kubenswrapper[4901]: I0202 10:56:49.690930 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa18cb0b-f1e0-4a57-90e0-daeede8334de" path="/var/lib/kubelet/pods/aa18cb0b-f1e0-4a57-90e0-daeede8334de/volumes" Feb 02 10:56:49 crc kubenswrapper[4901]: I0202 10:56:49.761704 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ed6baff-0886-4acb-b063-5807ec75d169","Type":"ContainerStarted","Data":"755476eee288b322d93ba27f8c554fff2456a8dc59e911a29943ca0ee4d6f28c"} Feb 02 10:56:49 crc kubenswrapper[4901]: I0202 10:56:49.763917 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 02 10:56:49 
crc kubenswrapper[4901]: I0202 10:56:49.770976 4901 generic.go:334] "Generic (PLEG): container finished" podID="3ba1dd4f-2e2c-4055-a605-35de3b73bdb4" containerID="66d06e57344ff5010ff9cfe4e8fe0ca5a65984af2e4f8844d83ca0d6ed9bac20" exitCode=0 Feb 02 10:56:49 crc kubenswrapper[4901]: I0202 10:56:49.771087 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3ba1dd4f-2e2c-4055-a605-35de3b73bdb4","Type":"ContainerDied","Data":"66d06e57344ff5010ff9cfe4e8fe0ca5a65984af2e4f8844d83ca0d6ed9bac20"} Feb 02 10:56:49 crc kubenswrapper[4901]: I0202 10:56:49.796149 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.376825622 podStartE2EDuration="6.796123605s" podCreationTimestamp="2026-02-02 10:56:43 +0000 UTC" firstStartedPulling="2026-02-02 10:56:44.717189566 +0000 UTC m=+1091.735529662" lastFinishedPulling="2026-02-02 10:56:49.136487549 +0000 UTC m=+1096.154827645" observedRunningTime="2026-02-02 10:56:49.790975506 +0000 UTC m=+1096.809315602" watchObservedRunningTime="2026-02-02 10:56:49.796123605 +0000 UTC m=+1096.814463711" Feb 02 10:56:50 crc kubenswrapper[4901]: I0202 10:56:50.786989 4901 generic.go:334] "Generic (PLEG): container finished" podID="b97ea9e5-3049-4ba7-9cc1-2165d15a3746" containerID="c370f81bb014213fa8f0ca4747e7a27481b3e911d5436628a3ac6d65de5cab15" exitCode=0 Feb 02 10:56:50 crc kubenswrapper[4901]: I0202 10:56:50.787155 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f4d4785b9-9m2tl" event={"ID":"b97ea9e5-3049-4ba7-9cc1-2165d15a3746","Type":"ContainerDied","Data":"c370f81bb014213fa8f0ca4747e7a27481b3e911d5436628a3ac6d65de5cab15"} Feb 02 10:56:50 crc kubenswrapper[4901]: I0202 10:56:50.788629 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f4d4785b9-9m2tl" event={"ID":"b97ea9e5-3049-4ba7-9cc1-2165d15a3746","Type":"ContainerDied","Data":"81359a15d41ddeea8c2510cfc5213576519ac64ba619dd39aa19bc32065ce07e"} Feb 02 10:56:50 crc kubenswrapper[4901]: I0202 10:56:50.788647 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81359a15d41ddeea8c2510cfc5213576519ac64ba619dd39aa19bc32065ce07e" Feb 02 10:56:50 crc kubenswrapper[4901]: I0202 10:56:50.815165 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6f4d4785b9-9m2tl" Feb 02 10:56:50 crc kubenswrapper[4901]: I0202 10:56:50.852522 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b97ea9e5-3049-4ba7-9cc1-2165d15a3746-internal-tls-certs\") pod \"b97ea9e5-3049-4ba7-9cc1-2165d15a3746\" (UID: \"b97ea9e5-3049-4ba7-9cc1-2165d15a3746\") " Feb 02 10:56:50 crc kubenswrapper[4901]: I0202 10:56:50.852627 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b97ea9e5-3049-4ba7-9cc1-2165d15a3746-ovndb-tls-certs\") pod \"b97ea9e5-3049-4ba7-9cc1-2165d15a3746\" (UID: \"b97ea9e5-3049-4ba7-9cc1-2165d15a3746\") " Feb 02 10:56:50 crc kubenswrapper[4901]: I0202 10:56:50.852746 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b97ea9e5-3049-4ba7-9cc1-2165d15a3746-httpd-config\") pod \"b97ea9e5-3049-4ba7-9cc1-2165d15a3746\" (UID: \"b97ea9e5-3049-4ba7-9cc1-2165d15a3746\") " Feb 02 10:56:50 crc kubenswrapper[4901]: I0202 10:56:50.852812 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gszjm\" (UniqueName: \"kubernetes.io/projected/b97ea9e5-3049-4ba7-9cc1-2165d15a3746-kube-api-access-gszjm\") pod \"b97ea9e5-3049-4ba7-9cc1-2165d15a3746\" (UID: \"b97ea9e5-3049-4ba7-9cc1-2165d15a3746\") " Feb 02 10:56:50 crc kubenswrapper[4901]: I0202 10:56:50.852933 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b97ea9e5-3049-4ba7-9cc1-2165d15a3746-public-tls-certs\") pod \"b97ea9e5-3049-4ba7-9cc1-2165d15a3746\" (UID: \"b97ea9e5-3049-4ba7-9cc1-2165d15a3746\") " Feb 02 10:56:50 crc kubenswrapper[4901]: I0202 10:56:50.852977 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b97ea9e5-3049-4ba7-9cc1-2165d15a3746-combined-ca-bundle\") pod \"b97ea9e5-3049-4ba7-9cc1-2165d15a3746\" (UID: \"b97ea9e5-3049-4ba7-9cc1-2165d15a3746\") " Feb 02 10:56:50 crc kubenswrapper[4901]: I0202 10:56:50.853031 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b97ea9e5-3049-4ba7-9cc1-2165d15a3746-config\") pod \"b97ea9e5-3049-4ba7-9cc1-2165d15a3746\" (UID: \"b97ea9e5-3049-4ba7-9cc1-2165d15a3746\") " Feb 02 10:56:50 crc kubenswrapper[4901]: I0202 10:56:50.872621 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b97ea9e5-3049-4ba7-9cc1-2165d15a3746-kube-api-access-gszjm" (OuterVolumeSpecName: "kube-api-access-gszjm") pod "b97ea9e5-3049-4ba7-9cc1-2165d15a3746" (UID: "b97ea9e5-3049-4ba7-9cc1-2165d15a3746"). InnerVolumeSpecName "kube-api-access-gszjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:56:50 crc kubenswrapper[4901]: I0202 10:56:50.882956 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b97ea9e5-3049-4ba7-9cc1-2165d15a3746-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "b97ea9e5-3049-4ba7-9cc1-2165d15a3746" (UID: "b97ea9e5-3049-4ba7-9cc1-2165d15a3746"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:50 crc kubenswrapper[4901]: I0202 10:56:50.923964 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b97ea9e5-3049-4ba7-9cc1-2165d15a3746-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b97ea9e5-3049-4ba7-9cc1-2165d15a3746" (UID: "b97ea9e5-3049-4ba7-9cc1-2165d15a3746"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:50 crc kubenswrapper[4901]: I0202 10:56:50.924936 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b97ea9e5-3049-4ba7-9cc1-2165d15a3746-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b97ea9e5-3049-4ba7-9cc1-2165d15a3746" (UID: "b97ea9e5-3049-4ba7-9cc1-2165d15a3746"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:50 crc kubenswrapper[4901]: I0202 10:56:50.944435 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b97ea9e5-3049-4ba7-9cc1-2165d15a3746-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b97ea9e5-3049-4ba7-9cc1-2165d15a3746" (UID: "b97ea9e5-3049-4ba7-9cc1-2165d15a3746"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:50 crc kubenswrapper[4901]: I0202 10:56:50.944935 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b97ea9e5-3049-4ba7-9cc1-2165d15a3746-config" (OuterVolumeSpecName: "config") pod "b97ea9e5-3049-4ba7-9cc1-2165d15a3746" (UID: "b97ea9e5-3049-4ba7-9cc1-2165d15a3746"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:50 crc kubenswrapper[4901]: I0202 10:56:50.957289 4901 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b97ea9e5-3049-4ba7-9cc1-2165d15a3746-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:50 crc kubenswrapper[4901]: I0202 10:56:50.957349 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b97ea9e5-3049-4ba7-9cc1-2165d15a3746-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:50 crc kubenswrapper[4901]: I0202 10:56:50.957363 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b97ea9e5-3049-4ba7-9cc1-2165d15a3746-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:50 crc kubenswrapper[4901]: I0202 10:56:50.957374 4901 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b97ea9e5-3049-4ba7-9cc1-2165d15a3746-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:50 crc kubenswrapper[4901]: I0202 10:56:50.957403 4901 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b97ea9e5-3049-4ba7-9cc1-2165d15a3746-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:50 crc kubenswrapper[4901]: I0202 10:56:50.957413 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gszjm\" (UniqueName: \"kubernetes.io/projected/b97ea9e5-3049-4ba7-9cc1-2165d15a3746-kube-api-access-gszjm\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:50 crc kubenswrapper[4901]: I0202 10:56:50.967779 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/b97ea9e5-3049-4ba7-9cc1-2165d15a3746-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "b97ea9e5-3049-4ba7-9cc1-2165d15a3746" (UID: "b97ea9e5-3049-4ba7-9cc1-2165d15a3746"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:51 crc kubenswrapper[4901]: I0202 10:56:51.060309 4901 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b97ea9e5-3049-4ba7-9cc1-2165d15a3746-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:51 crc kubenswrapper[4901]: I0202 10:56:51.808859 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6f4d4785b9-9m2tl" Feb 02 10:56:51 crc kubenswrapper[4901]: I0202 10:56:51.851305 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6f4d4785b9-9m2tl"] Feb 02 10:56:51 crc kubenswrapper[4901]: I0202 10:56:51.861501 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6f4d4785b9-9m2tl"] Feb 02 10:56:53 crc kubenswrapper[4901]: I0202 10:56:53.649165 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 02 10:56:53 crc kubenswrapper[4901]: I0202 10:56:53.695410 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b97ea9e5-3049-4ba7-9cc1-2165d15a3746" path="/var/lib/kubelet/pods/b97ea9e5-3049-4ba7-9cc1-2165d15a3746/volumes" Feb 02 10:56:53 crc kubenswrapper[4901]: I0202 10:56:53.735992 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ba1dd4f-2e2c-4055-a605-35de3b73bdb4-config-data-custom\") pod \"3ba1dd4f-2e2c-4055-a605-35de3b73bdb4\" (UID: \"3ba1dd4f-2e2c-4055-a605-35de3b73bdb4\") " Feb 02 10:56:53 crc kubenswrapper[4901]: I0202 10:56:53.737758 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqvc6\" (UniqueName: \"kubernetes.io/projected/3ba1dd4f-2e2c-4055-a605-35de3b73bdb4-kube-api-access-xqvc6\") pod \"3ba1dd4f-2e2c-4055-a605-35de3b73bdb4\" (UID: \"3ba1dd4f-2e2c-4055-a605-35de3b73bdb4\") " Feb 02 10:56:53 crc kubenswrapper[4901]: I0202 10:56:53.738094 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ba1dd4f-2e2c-4055-a605-35de3b73bdb4-config-data\") pod \"3ba1dd4f-2e2c-4055-a605-35de3b73bdb4\" (UID: \"3ba1dd4f-2e2c-4055-a605-35de3b73bdb4\") " Feb 02 10:56:53 crc kubenswrapper[4901]: I0202 10:56:53.738217 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ba1dd4f-2e2c-4055-a605-35de3b73bdb4-scripts\") pod \"3ba1dd4f-2e2c-4055-a605-35de3b73bdb4\" (UID: \"3ba1dd4f-2e2c-4055-a605-35de3b73bdb4\") " Feb 02 10:56:53 crc kubenswrapper[4901]: I0202 10:56:53.738330 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3ba1dd4f-2e2c-4055-a605-35de3b73bdb4-etc-machine-id\") pod \"3ba1dd4f-2e2c-4055-a605-35de3b73bdb4\" (UID: \"3ba1dd4f-2e2c-4055-a605-35de3b73bdb4\") " Feb 02 10:56:53 crc kubenswrapper[4901]: I0202 10:56:53.738474 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ba1dd4f-2e2c-4055-a605-35de3b73bdb4-combined-ca-bundle\") pod 
\"3ba1dd4f-2e2c-4055-a605-35de3b73bdb4\" (UID: \"3ba1dd4f-2e2c-4055-a605-35de3b73bdb4\") " Feb 02 10:56:53 crc kubenswrapper[4901]: I0202 10:56:53.738453 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3ba1dd4f-2e2c-4055-a605-35de3b73bdb4-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3ba1dd4f-2e2c-4055-a605-35de3b73bdb4" (UID: "3ba1dd4f-2e2c-4055-a605-35de3b73bdb4"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:56:53 crc kubenswrapper[4901]: I0202 10:56:53.739333 4901 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3ba1dd4f-2e2c-4055-a605-35de3b73bdb4-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:53 crc kubenswrapper[4901]: I0202 10:56:53.744988 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ba1dd4f-2e2c-4055-a605-35de3b73bdb4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3ba1dd4f-2e2c-4055-a605-35de3b73bdb4" (UID: "3ba1dd4f-2e2c-4055-a605-35de3b73bdb4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:53 crc kubenswrapper[4901]: I0202 10:56:53.745898 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ba1dd4f-2e2c-4055-a605-35de3b73bdb4-scripts" (OuterVolumeSpecName: "scripts") pod "3ba1dd4f-2e2c-4055-a605-35de3b73bdb4" (UID: "3ba1dd4f-2e2c-4055-a605-35de3b73bdb4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:53 crc kubenswrapper[4901]: I0202 10:56:53.757660 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ba1dd4f-2e2c-4055-a605-35de3b73bdb4-kube-api-access-xqvc6" (OuterVolumeSpecName: "kube-api-access-xqvc6") pod "3ba1dd4f-2e2c-4055-a605-35de3b73bdb4" (UID: "3ba1dd4f-2e2c-4055-a605-35de3b73bdb4"). InnerVolumeSpecName "kube-api-access-xqvc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:56:53 crc kubenswrapper[4901]: I0202 10:56:53.813865 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ba1dd4f-2e2c-4055-a605-35de3b73bdb4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ba1dd4f-2e2c-4055-a605-35de3b73bdb4" (UID: "3ba1dd4f-2e2c-4055-a605-35de3b73bdb4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:53 crc kubenswrapper[4901]: I0202 10:56:53.841557 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ba1dd4f-2e2c-4055-a605-35de3b73bdb4-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:53 crc kubenswrapper[4901]: I0202 10:56:53.841605 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ba1dd4f-2e2c-4055-a605-35de3b73bdb4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:53 crc kubenswrapper[4901]: I0202 10:56:53.841616 4901 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ba1dd4f-2e2c-4055-a605-35de3b73bdb4-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:53 crc kubenswrapper[4901]: I0202 10:56:53.841627 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqvc6\" (UniqueName: \"kubernetes.io/projected/3ba1dd4f-2e2c-4055-a605-35de3b73bdb4-kube-api-access-xqvc6\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:53 crc kubenswrapper[4901]: I0202 10:56:53.855556 4901 generic.go:334] "Generic (PLEG): container finished" podID="3ba1dd4f-2e2c-4055-a605-35de3b73bdb4" containerID="c5cd2d85c9b795fc3d1265b4a4a0d1d7088338f2bb34830af938f5add3eeefd9" exitCode=0 Feb 02 10:56:53 crc kubenswrapper[4901]: I0202 10:56:53.855632 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3ba1dd4f-2e2c-4055-a605-35de3b73bdb4","Type":"ContainerDied","Data":"c5cd2d85c9b795fc3d1265b4a4a0d1d7088338f2bb34830af938f5add3eeefd9"} Feb 02 10:56:53 crc kubenswrapper[4901]: I0202 10:56:53.855670 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3ba1dd4f-2e2c-4055-a605-35de3b73bdb4","Type":"ContainerDied","Data":"29ba11d93f1b0fed419c069079f1ddc84687c5cfc7ad03cc1a9207dd77a0d7b3"} Feb 02 10:56:53 crc kubenswrapper[4901]: I0202 10:56:53.855693 4901 scope.go:117] "RemoveContainer" containerID="66d06e57344ff5010ff9cfe4e8fe0ca5a65984af2e4f8844d83ca0d6ed9bac20" Feb 02 10:56:53 crc kubenswrapper[4901]: I0202 10:56:53.855877 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 02 10:56:53 crc kubenswrapper[4901]: I0202 10:56:53.879704 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ba1dd4f-2e2c-4055-a605-35de3b73bdb4-config-data" (OuterVolumeSpecName: "config-data") pod "3ba1dd4f-2e2c-4055-a605-35de3b73bdb4" (UID: "3ba1dd4f-2e2c-4055-a605-35de3b73bdb4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:53 crc kubenswrapper[4901]: I0202 10:56:53.883820 4901 scope.go:117] "RemoveContainer" containerID="c5cd2d85c9b795fc3d1265b4a4a0d1d7088338f2bb34830af938f5add3eeefd9" Feb 02 10:56:53 crc kubenswrapper[4901]: I0202 10:56:53.905758 4901 scope.go:117] "RemoveContainer" containerID="66d06e57344ff5010ff9cfe4e8fe0ca5a65984af2e4f8844d83ca0d6ed9bac20" Feb 02 10:56:53 crc kubenswrapper[4901]: E0202 10:56:53.906226 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66d06e57344ff5010ff9cfe4e8fe0ca5a65984af2e4f8844d83ca0d6ed9bac20\": container with ID starting with 66d06e57344ff5010ff9cfe4e8fe0ca5a65984af2e4f8844d83ca0d6ed9bac20 not found: ID does not exist" containerID="66d06e57344ff5010ff9cfe4e8fe0ca5a65984af2e4f8844d83ca0d6ed9bac20" Feb 02 10:56:53 crc kubenswrapper[4901]: I0202 10:56:53.906271 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66d06e57344ff5010ff9cfe4e8fe0ca5a65984af2e4f8844d83ca0d6ed9bac20"} err="failed to get container status \"66d06e57344ff5010ff9cfe4e8fe0ca5a65984af2e4f8844d83ca0d6ed9bac20\": rpc error: code = NotFound desc = could not find container \"66d06e57344ff5010ff9cfe4e8fe0ca5a65984af2e4f8844d83ca0d6ed9bac20\": container with ID starting with 66d06e57344ff5010ff9cfe4e8fe0ca5a65984af2e4f8844d83ca0d6ed9bac20 not found: ID does not exist" Feb 02 10:56:53 crc kubenswrapper[4901]: I0202 10:56:53.906298 4901 scope.go:117] "RemoveContainer" containerID="c5cd2d85c9b795fc3d1265b4a4a0d1d7088338f2bb34830af938f5add3eeefd9" Feb 02 10:56:53 crc kubenswrapper[4901]: E0202 10:56:53.906866 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5cd2d85c9b795fc3d1265b4a4a0d1d7088338f2bb34830af938f5add3eeefd9\": container with ID starting with c5cd2d85c9b795fc3d1265b4a4a0d1d7088338f2bb34830af938f5add3eeefd9 not found: ID does not exist" containerID="c5cd2d85c9b795fc3d1265b4a4a0d1d7088338f2bb34830af938f5add3eeefd9" Feb 02 10:56:53 crc kubenswrapper[4901]: I0202 10:56:53.906901 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5cd2d85c9b795fc3d1265b4a4a0d1d7088338f2bb34830af938f5add3eeefd9"} err="failed to get container status \"c5cd2d85c9b795fc3d1265b4a4a0d1d7088338f2bb34830af938f5add3eeefd9\": rpc error: code = NotFound desc = could not find container \"c5cd2d85c9b795fc3d1265b4a4a0d1d7088338f2bb34830af938f5add3eeefd9\": container with ID starting with c5cd2d85c9b795fc3d1265b4a4a0d1d7088338f2bb34830af938f5add3eeefd9 not found: ID does not exist" Feb 02 10:56:53 crc kubenswrapper[4901]: I0202 10:56:53.942948 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ba1dd4f-2e2c-4055-a605-35de3b73bdb4-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:54 crc kubenswrapper[4901]: I0202 10:56:54.258700 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 02 10:56:54 crc kubenswrapper[4901]: I0202 10:56:54.272877 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 02 10:56:54 crc kubenswrapper[4901]: I0202 10:56:54.284530 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 02 10:56:54 crc kubenswrapper[4901]: E0202 10:56:54.285625 4901 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b97ea9e5-3049-4ba7-9cc1-2165d15a3746" containerName="neutron-httpd" Feb 02 10:56:54 crc kubenswrapper[4901]: I0202 10:56:54.285859 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="b97ea9e5-3049-4ba7-9cc1-2165d15a3746" containerName="neutron-httpd" Feb 02 10:56:54 crc kubenswrapper[4901]: E0202 10:56:54.286182 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06feef03-ccd0-4fa8-8939-852df217d3a1" containerName="init" Feb 02 10:56:54 crc kubenswrapper[4901]: I0202 10:56:54.286365 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="06feef03-ccd0-4fa8-8939-852df217d3a1" containerName="init" Feb 02 10:56:54 crc kubenswrapper[4901]: E0202 10:56:54.286444 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06feef03-ccd0-4fa8-8939-852df217d3a1" containerName="dnsmasq-dns" Feb 02 10:56:54 crc kubenswrapper[4901]: I0202 10:56:54.286497 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="06feef03-ccd0-4fa8-8939-852df217d3a1" containerName="dnsmasq-dns" Feb 02 10:56:54 crc kubenswrapper[4901]: E0202 10:56:54.286675 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b97ea9e5-3049-4ba7-9cc1-2165d15a3746" containerName="neutron-api" Feb 02 10:56:54 crc kubenswrapper[4901]: I0202 10:56:54.286748 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="b97ea9e5-3049-4ba7-9cc1-2165d15a3746" containerName="neutron-api" Feb 02 10:56:54 crc kubenswrapper[4901]: E0202 10:56:54.286828 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ba1dd4f-2e2c-4055-a605-35de3b73bdb4" containerName="cinder-scheduler" Feb 02 10:56:54 crc kubenswrapper[4901]: I0202 10:56:54.286932 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ba1dd4f-2e2c-4055-a605-35de3b73bdb4" containerName="cinder-scheduler" Feb 02 10:56:54 crc kubenswrapper[4901]: E0202 10:56:54.287021 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ba1dd4f-2e2c-4055-a605-35de3b73bdb4" containerName="probe" Feb 02 10:56:54 crc kubenswrapper[4901]: I0202 10:56:54.287181 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ba1dd4f-2e2c-4055-a605-35de3b73bdb4" containerName="probe" Feb 02 10:56:54 crc kubenswrapper[4901]: E0202 10:56:54.287264 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa18cb0b-f1e0-4a57-90e0-daeede8334de" containerName="barbican-api" Feb 02 10:56:54 crc kubenswrapper[4901]: I0202 10:56:54.287375 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa18cb0b-f1e0-4a57-90e0-daeede8334de" containerName="barbican-api" Feb 02 10:56:54 crc kubenswrapper[4901]: E0202 10:56:54.287645 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa18cb0b-f1e0-4a57-90e0-daeede8334de" containerName="barbican-api-log" Feb 02 10:56:54 crc kubenswrapper[4901]: I0202 10:56:54.287806 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa18cb0b-f1e0-4a57-90e0-daeede8334de" containerName="barbican-api-log" Feb 02 10:56:54 crc kubenswrapper[4901]: I0202 10:56:54.288273 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa18cb0b-f1e0-4a57-90e0-daeede8334de" containerName="barbican-api" Feb 02 10:56:54 crc kubenswrapper[4901]: I0202 10:56:54.288475 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ba1dd4f-2e2c-4055-a605-35de3b73bdb4" containerName="cinder-scheduler" Feb 02 10:56:54 crc kubenswrapper[4901]: I0202 10:56:54.288638 4901 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="aa18cb0b-f1e0-4a57-90e0-daeede8334de" containerName="barbican-api-log" Feb 02 10:56:54 crc kubenswrapper[4901]: I0202 10:56:54.288712 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="b97ea9e5-3049-4ba7-9cc1-2165d15a3746" containerName="neutron-api" Feb 02 10:56:54 crc kubenswrapper[4901]: I0202 10:56:54.288818 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="06feef03-ccd0-4fa8-8939-852df217d3a1" containerName="dnsmasq-dns" Feb 02 10:56:54 crc kubenswrapper[4901]: I0202 10:56:54.288906 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ba1dd4f-2e2c-4055-a605-35de3b73bdb4" containerName="probe" Feb 02 10:56:54 crc kubenswrapper[4901]: I0202 10:56:54.289000 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="b97ea9e5-3049-4ba7-9cc1-2165d15a3746" containerName="neutron-httpd" Feb 02 10:56:54 crc kubenswrapper[4901]: I0202 10:56:54.291750 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 02 10:56:54 crc kubenswrapper[4901]: I0202 10:56:54.292464 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 02 10:56:54 crc kubenswrapper[4901]: I0202 10:56:54.296938 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 02 10:56:54 crc kubenswrapper[4901]: I0202 10:56:54.349935 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fec3da0b-c2dc-4269-b621-19b380d9b92d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fec3da0b-c2dc-4269-b621-19b380d9b92d\") " pod="openstack/cinder-scheduler-0" Feb 02 10:56:54 crc kubenswrapper[4901]: I0202 10:56:54.350245 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fec3da0b-c2dc-4269-b621-19b380d9b92d-config-data\") pod \"cinder-scheduler-0\" (UID: \"fec3da0b-c2dc-4269-b621-19b380d9b92d\") " pod="openstack/cinder-scheduler-0" Feb 02 10:56:54 crc kubenswrapper[4901]: I0202 10:56:54.350360 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fec3da0b-c2dc-4269-b621-19b380d9b92d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fec3da0b-c2dc-4269-b621-19b380d9b92d\") " pod="openstack/cinder-scheduler-0" Feb 02 10:56:54 crc kubenswrapper[4901]: I0202 10:56:54.350484 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fec3da0b-c2dc-4269-b621-19b380d9b92d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fec3da0b-c2dc-4269-b621-19b380d9b92d\") " pod="openstack/cinder-scheduler-0" Feb 02 10:56:54 crc kubenswrapper[4901]: I0202 10:56:54.350578 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fec3da0b-c2dc-4269-b621-19b380d9b92d-scripts\") pod \"cinder-scheduler-0\" (UID: \"fec3da0b-c2dc-4269-b621-19b380d9b92d\") " pod="openstack/cinder-scheduler-0" Feb 02 10:56:54 crc kubenswrapper[4901]: I0202 10:56:54.350684 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grsdr\" (UniqueName: 
\"kubernetes.io/projected/fec3da0b-c2dc-4269-b621-19b380d9b92d-kube-api-access-grsdr\") pod \"cinder-scheduler-0\" (UID: \"fec3da0b-c2dc-4269-b621-19b380d9b92d\") " pod="openstack/cinder-scheduler-0" Feb 02 10:56:54 crc kubenswrapper[4901]: I0202 10:56:54.453200 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fec3da0b-c2dc-4269-b621-19b380d9b92d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fec3da0b-c2dc-4269-b621-19b380d9b92d\") " pod="openstack/cinder-scheduler-0" Feb 02 10:56:54 crc kubenswrapper[4901]: I0202 10:56:54.453288 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fec3da0b-c2dc-4269-b621-19b380d9b92d-config-data\") pod \"cinder-scheduler-0\" (UID: \"fec3da0b-c2dc-4269-b621-19b380d9b92d\") " pod="openstack/cinder-scheduler-0" Feb 02 10:56:54 crc kubenswrapper[4901]: I0202 10:56:54.453362 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fec3da0b-c2dc-4269-b621-19b380d9b92d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fec3da0b-c2dc-4269-b621-19b380d9b92d\") " pod="openstack/cinder-scheduler-0" Feb 02 10:56:54 crc kubenswrapper[4901]: I0202 10:56:54.453458 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fec3da0b-c2dc-4269-b621-19b380d9b92d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fec3da0b-c2dc-4269-b621-19b380d9b92d\") " pod="openstack/cinder-scheduler-0" Feb 02 10:56:54 crc kubenswrapper[4901]: I0202 10:56:54.453512 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fec3da0b-c2dc-4269-b621-19b380d9b92d-scripts\") pod \"cinder-scheduler-0\" (UID: \"fec3da0b-c2dc-4269-b621-19b380d9b92d\") " pod="openstack/cinder-scheduler-0" Feb 02 10:56:54 crc kubenswrapper[4901]: I0202 10:56:54.453541 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grsdr\" (UniqueName: \"kubernetes.io/projected/fec3da0b-c2dc-4269-b621-19b380d9b92d-kube-api-access-grsdr\") pod \"cinder-scheduler-0\" (UID: \"fec3da0b-c2dc-4269-b621-19b380d9b92d\") " pod="openstack/cinder-scheduler-0" Feb 02 10:56:54 crc kubenswrapper[4901]: I0202 10:56:54.453699 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fec3da0b-c2dc-4269-b621-19b380d9b92d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fec3da0b-c2dc-4269-b621-19b380d9b92d\") " pod="openstack/cinder-scheduler-0" Feb 02 10:56:54 crc kubenswrapper[4901]: I0202 10:56:54.459808 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fec3da0b-c2dc-4269-b621-19b380d9b92d-scripts\") pod \"cinder-scheduler-0\" (UID: \"fec3da0b-c2dc-4269-b621-19b380d9b92d\") " pod="openstack/cinder-scheduler-0" Feb 02 10:56:54 crc kubenswrapper[4901]: I0202 10:56:54.467121 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fec3da0b-c2dc-4269-b621-19b380d9b92d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fec3da0b-c2dc-4269-b621-19b380d9b92d\") " pod="openstack/cinder-scheduler-0" Feb 02 10:56:54 crc kubenswrapper[4901]: I0202 10:56:54.470334 4901 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fec3da0b-c2dc-4269-b621-19b380d9b92d-config-data\") pod \"cinder-scheduler-0\" (UID: \"fec3da0b-c2dc-4269-b621-19b380d9b92d\") " pod="openstack/cinder-scheduler-0" Feb 02 10:56:54 crc kubenswrapper[4901]: I0202 10:56:54.471159 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fec3da0b-c2dc-4269-b621-19b380d9b92d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fec3da0b-c2dc-4269-b621-19b380d9b92d\") " pod="openstack/cinder-scheduler-0" Feb 02 10:56:54 crc kubenswrapper[4901]: I0202 10:56:54.471369 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grsdr\" (UniqueName: \"kubernetes.io/projected/fec3da0b-c2dc-4269-b621-19b380d9b92d-kube-api-access-grsdr\") pod \"cinder-scheduler-0\" (UID: \"fec3da0b-c2dc-4269-b621-19b380d9b92d\") " pod="openstack/cinder-scheduler-0" Feb 02 10:56:54 crc kubenswrapper[4901]: I0202 10:56:54.619921 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 02 10:56:55 crc kubenswrapper[4901]: I0202 10:56:55.128848 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 02 10:56:55 crc kubenswrapper[4901]: I0202 10:56:55.713355 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ba1dd4f-2e2c-4055-a605-35de3b73bdb4" path="/var/lib/kubelet/pods/3ba1dd4f-2e2c-4055-a605-35de3b73bdb4/volumes" Feb 02 10:56:55 crc kubenswrapper[4901]: I0202 10:56:55.894839 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fec3da0b-c2dc-4269-b621-19b380d9b92d","Type":"ContainerStarted","Data":"de07e6317adedbd6d09abb0b0f025c2b21dff6ae98144d2974bacf666a205b52"} Feb 02 10:56:56 crc kubenswrapper[4901]: I0202 10:56:56.179340 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 02 10:56:56 crc kubenswrapper[4901]: I0202 10:56:56.907050 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fec3da0b-c2dc-4269-b621-19b380d9b92d","Type":"ContainerStarted","Data":"6150a4a885dca4189ad76297dc56a480763a5133d03ff5f381fbbbf746cc5ed1"} Feb 02 10:56:56 crc kubenswrapper[4901]: I0202 10:56:56.907379 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fec3da0b-c2dc-4269-b621-19b380d9b92d","Type":"ContainerStarted","Data":"8bfd98d657eedcf021b3676c7e3bed4a06ef2f689623b278d7287ad4bce339fd"} Feb 02 10:56:56 crc kubenswrapper[4901]: I0202 10:56:56.937291 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.9372644919999997 podStartE2EDuration="2.937264492s" podCreationTimestamp="2026-02-02 10:56:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:56:56.930510933 +0000 UTC m=+1103.948851029" watchObservedRunningTime="2026-02-02 10:56:56.937264492 +0000 UTC m=+1103.955604588" Feb 02 10:56:57 crc kubenswrapper[4901]: I0202 10:56:57.041020 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5fd59b56d-fwr5x" Feb 02 10:56:57 crc kubenswrapper[4901]: I0202 10:56:57.129477 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
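The startup-latency entry above is internally consistent: podStartSLOduration=2.9372644919999997 is the float64-seconds rendering of podStartE2EDuration="2.937264492s", which in turn is watchObservedRunningTime (10:56:56.937264492) minus podCreationTimestamp (10:56:54), with nothing subtracted for image pulls since both pulling timestamps are the zero value 0001-01-01. The arithmetic, checked in Go:

```go
package main

import (
	"fmt"
	"time"
)

// Reproduces the duration reported by pod_startup_latency_tracker above:
// observed running time minus pod creation time, no pull interval.
func main() {
	layout := "2006-01-02 15:04:05.999999999 -0700 MST"
	created, _ := time.Parse(layout, "2026-02-02 10:56:54 +0000 UTC")
	observed, _ := time.Parse(layout, "2026-02-02 10:56:56.937264492 +0000 UTC")
	fmt.Println(observed.Sub(created)) // prints 2.937264492s
}
```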
status="ready" pod="openstack/placement-5fd59b56d-fwr5x" Feb 02 10:56:58 crc kubenswrapper[4901]: I0202 10:56:58.191151 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7d7d985874-pzxvf" Feb 02 10:56:58 crc kubenswrapper[4901]: I0202 10:56:58.413402 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7ff445bb86-z9hcr" Feb 02 10:56:58 crc kubenswrapper[4901]: I0202 10:56:58.456035 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7ff445bb86-z9hcr" Feb 02 10:56:58 crc kubenswrapper[4901]: I0202 10:56:58.520781 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5fd59b56d-fwr5x"] Feb 02 10:56:58 crc kubenswrapper[4901]: I0202 10:56:58.924938 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5fd59b56d-fwr5x" podUID="46bccada-d49d-4bed-a7c7-e3ea2a64414a" containerName="placement-log" containerID="cri-o://92f4b64beca26fb4cbf6f403b155919544c80c90fc783ff5fda303528cd05689" gracePeriod=30 Feb 02 10:56:58 crc kubenswrapper[4901]: I0202 10:56:58.925239 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5fd59b56d-fwr5x" podUID="46bccada-d49d-4bed-a7c7-e3ea2a64414a" containerName="placement-api" containerID="cri-o://7e48bf1328dc8a5967b5be08ecf501d44f4690f79365d610dff488421d28d45c" gracePeriod=30 Feb 02 10:56:58 crc kubenswrapper[4901]: I0202 10:56:58.976424 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 02 10:56:58 crc kubenswrapper[4901]: I0202 10:56:58.977661 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 02 10:56:58 crc kubenswrapper[4901]: I0202 10:56:58.984446 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-v8d6k" Feb 02 10:56:58 crc kubenswrapper[4901]: I0202 10:56:58.984510 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 02 10:56:58 crc kubenswrapper[4901]: I0202 10:56:58.984751 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 02 10:56:58 crc kubenswrapper[4901]: I0202 10:56:58.999383 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 02 10:56:59 crc kubenswrapper[4901]: I0202 10:56:59.164704 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707985af-5416-42c1-9fbf-866955d8d1c4-combined-ca-bundle\") pod \"openstackclient\" (UID: \"707985af-5416-42c1-9fbf-866955d8d1c4\") " pod="openstack/openstackclient" Feb 02 10:56:59 crc kubenswrapper[4901]: I0202 10:56:59.165184 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tkx2\" (UniqueName: \"kubernetes.io/projected/707985af-5416-42c1-9fbf-866955d8d1c4-kube-api-access-4tkx2\") pod \"openstackclient\" (UID: \"707985af-5416-42c1-9fbf-866955d8d1c4\") " pod="openstack/openstackclient" Feb 02 10:56:59 crc kubenswrapper[4901]: I0202 10:56:59.165251 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/707985af-5416-42c1-9fbf-866955d8d1c4-openstack-config-secret\") pod 
\"openstackclient\" (UID: \"707985af-5416-42c1-9fbf-866955d8d1c4\") " pod="openstack/openstackclient" Feb 02 10:56:59 crc kubenswrapper[4901]: I0202 10:56:59.165276 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/707985af-5416-42c1-9fbf-866955d8d1c4-openstack-config\") pod \"openstackclient\" (UID: \"707985af-5416-42c1-9fbf-866955d8d1c4\") " pod="openstack/openstackclient" Feb 02 10:56:59 crc kubenswrapper[4901]: I0202 10:56:59.267282 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tkx2\" (UniqueName: \"kubernetes.io/projected/707985af-5416-42c1-9fbf-866955d8d1c4-kube-api-access-4tkx2\") pod \"openstackclient\" (UID: \"707985af-5416-42c1-9fbf-866955d8d1c4\") " pod="openstack/openstackclient" Feb 02 10:56:59 crc kubenswrapper[4901]: I0202 10:56:59.267383 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/707985af-5416-42c1-9fbf-866955d8d1c4-openstack-config-secret\") pod \"openstackclient\" (UID: \"707985af-5416-42c1-9fbf-866955d8d1c4\") " pod="openstack/openstackclient" Feb 02 10:56:59 crc kubenswrapper[4901]: I0202 10:56:59.267408 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/707985af-5416-42c1-9fbf-866955d8d1c4-openstack-config\") pod \"openstackclient\" (UID: \"707985af-5416-42c1-9fbf-866955d8d1c4\") " pod="openstack/openstackclient" Feb 02 10:56:59 crc kubenswrapper[4901]: I0202 10:56:59.267509 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707985af-5416-42c1-9fbf-866955d8d1c4-combined-ca-bundle\") pod \"openstackclient\" (UID: \"707985af-5416-42c1-9fbf-866955d8d1c4\") " pod="openstack/openstackclient" Feb 02 10:56:59 crc kubenswrapper[4901]: I0202 10:56:59.268720 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/707985af-5416-42c1-9fbf-866955d8d1c4-openstack-config\") pod \"openstackclient\" (UID: \"707985af-5416-42c1-9fbf-866955d8d1c4\") " pod="openstack/openstackclient" Feb 02 10:56:59 crc kubenswrapper[4901]: I0202 10:56:59.281437 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/707985af-5416-42c1-9fbf-866955d8d1c4-openstack-config-secret\") pod \"openstackclient\" (UID: \"707985af-5416-42c1-9fbf-866955d8d1c4\") " pod="openstack/openstackclient" Feb 02 10:56:59 crc kubenswrapper[4901]: I0202 10:56:59.287799 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tkx2\" (UniqueName: \"kubernetes.io/projected/707985af-5416-42c1-9fbf-866955d8d1c4-kube-api-access-4tkx2\") pod \"openstackclient\" (UID: \"707985af-5416-42c1-9fbf-866955d8d1c4\") " pod="openstack/openstackclient" Feb 02 10:56:59 crc kubenswrapper[4901]: I0202 10:56:59.296011 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707985af-5416-42c1-9fbf-866955d8d1c4-combined-ca-bundle\") pod \"openstackclient\" (UID: \"707985af-5416-42c1-9fbf-866955d8d1c4\") " pod="openstack/openstackclient" Feb 02 10:56:59 crc kubenswrapper[4901]: I0202 10:56:59.346377 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 02 10:56:59 crc kubenswrapper[4901]: I0202 10:56:59.620385 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 02 10:56:59 crc kubenswrapper[4901]: I0202 10:56:59.843398 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 02 10:56:59 crc kubenswrapper[4901]: W0202 10:56:59.846525 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod707985af_5416_42c1_9fbf_866955d8d1c4.slice/crio-1a370d493f05a9fafe6ef7d8aa5ae1b2736ce7d86128a258c5962ab2e278cf7d WatchSource:0}: Error finding container 1a370d493f05a9fafe6ef7d8aa5ae1b2736ce7d86128a258c5962ab2e278cf7d: Status 404 returned error can't find the container with id 1a370d493f05a9fafe6ef7d8aa5ae1b2736ce7d86128a258c5962ab2e278cf7d Feb 02 10:56:59 crc kubenswrapper[4901]: I0202 10:56:59.936829 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"707985af-5416-42c1-9fbf-866955d8d1c4","Type":"ContainerStarted","Data":"1a370d493f05a9fafe6ef7d8aa5ae1b2736ce7d86128a258c5962ab2e278cf7d"} Feb 02 10:56:59 crc kubenswrapper[4901]: I0202 10:56:59.940754 4901 generic.go:334] "Generic (PLEG): container finished" podID="46bccada-d49d-4bed-a7c7-e3ea2a64414a" containerID="92f4b64beca26fb4cbf6f403b155919544c80c90fc783ff5fda303528cd05689" exitCode=143 Feb 02 10:56:59 crc kubenswrapper[4901]: I0202 10:56:59.940807 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5fd59b56d-fwr5x" event={"ID":"46bccada-d49d-4bed-a7c7-e3ea2a64414a","Type":"ContainerDied","Data":"92f4b64beca26fb4cbf6f403b155919544c80c90fc783ff5fda303528cd05689"} Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.529311 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5fd59b56d-fwr5x" Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.646440 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/46bccada-d49d-4bed-a7c7-e3ea2a64414a-internal-tls-certs\") pod \"46bccada-d49d-4bed-a7c7-e3ea2a64414a\" (UID: \"46bccada-d49d-4bed-a7c7-e3ea2a64414a\") " Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.646519 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46bccada-d49d-4bed-a7c7-e3ea2a64414a-combined-ca-bundle\") pod \"46bccada-d49d-4bed-a7c7-e3ea2a64414a\" (UID: \"46bccada-d49d-4bed-a7c7-e3ea2a64414a\") " Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.646597 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46bccada-d49d-4bed-a7c7-e3ea2a64414a-logs\") pod \"46bccada-d49d-4bed-a7c7-e3ea2a64414a\" (UID: \"46bccada-d49d-4bed-a7c7-e3ea2a64414a\") " Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.646635 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcjmk\" (UniqueName: \"kubernetes.io/projected/46bccada-d49d-4bed-a7c7-e3ea2a64414a-kube-api-access-xcjmk\") pod \"46bccada-d49d-4bed-a7c7-e3ea2a64414a\" (UID: \"46bccada-d49d-4bed-a7c7-e3ea2a64414a\") " Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.646681 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/46bccada-d49d-4bed-a7c7-e3ea2a64414a-public-tls-certs\") pod \"46bccada-d49d-4bed-a7c7-e3ea2a64414a\" (UID: \"46bccada-d49d-4bed-a7c7-e3ea2a64414a\") " Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.646747 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46bccada-d49d-4bed-a7c7-e3ea2a64414a-scripts\") pod \"46bccada-d49d-4bed-a7c7-e3ea2a64414a\" (UID: \"46bccada-d49d-4bed-a7c7-e3ea2a64414a\") " Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.646806 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46bccada-d49d-4bed-a7c7-e3ea2a64414a-config-data\") pod \"46bccada-d49d-4bed-a7c7-e3ea2a64414a\" (UID: \"46bccada-d49d-4bed-a7c7-e3ea2a64414a\") " Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.648086 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46bccada-d49d-4bed-a7c7-e3ea2a64414a-logs" (OuterVolumeSpecName: "logs") pod "46bccada-d49d-4bed-a7c7-e3ea2a64414a" (UID: "46bccada-d49d-4bed-a7c7-e3ea2a64414a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.652836 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46bccada-d49d-4bed-a7c7-e3ea2a64414a-scripts" (OuterVolumeSpecName: "scripts") pod "46bccada-d49d-4bed-a7c7-e3ea2a64414a" (UID: "46bccada-d49d-4bed-a7c7-e3ea2a64414a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.659807 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46bccada-d49d-4bed-a7c7-e3ea2a64414a-kube-api-access-xcjmk" (OuterVolumeSpecName: "kube-api-access-xcjmk") pod "46bccada-d49d-4bed-a7c7-e3ea2a64414a" (UID: "46bccada-d49d-4bed-a7c7-e3ea2a64414a"). InnerVolumeSpecName "kube-api-access-xcjmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.766005 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46bccada-d49d-4bed-a7c7-e3ea2a64414a-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.766040 4901 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46bccada-d49d-4bed-a7c7-e3ea2a64414a-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.766052 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcjmk\" (UniqueName: \"kubernetes.io/projected/46bccada-d49d-4bed-a7c7-e3ea2a64414a-kube-api-access-xcjmk\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.776472 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46bccada-d49d-4bed-a7c7-e3ea2a64414a-config-data" (OuterVolumeSpecName: "config-data") pod "46bccada-d49d-4bed-a7c7-e3ea2a64414a" (UID: "46bccada-d49d-4bed-a7c7-e3ea2a64414a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.815088 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-758cb7689c-5d5jx"] Feb 02 10:57:02 crc kubenswrapper[4901]: E0202 10:57:02.815480 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46bccada-d49d-4bed-a7c7-e3ea2a64414a" containerName="placement-api" Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.815492 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="46bccada-d49d-4bed-a7c7-e3ea2a64414a" containerName="placement-api" Feb 02 10:57:02 crc kubenswrapper[4901]: E0202 10:57:02.815540 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46bccada-d49d-4bed-a7c7-e3ea2a64414a" containerName="placement-log" Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.815546 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="46bccada-d49d-4bed-a7c7-e3ea2a64414a" containerName="placement-log" Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.815775 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="46bccada-d49d-4bed-a7c7-e3ea2a64414a" containerName="placement-api" Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.815805 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="46bccada-d49d-4bed-a7c7-e3ea2a64414a" containerName="placement-log" Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.816415 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-758cb7689c-5d5jx" Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.821573 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-bb5r6" Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.821905 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.822056 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.841220 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-758cb7689c-5d5jx"] Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.860695 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46bccada-d49d-4bed-a7c7-e3ea2a64414a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "46bccada-d49d-4bed-a7c7-e3ea2a64414a" (UID: "46bccada-d49d-4bed-a7c7-e3ea2a64414a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.867375 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9729bd04-c205-4f62-b74c-92df193ad13e-config-data-custom\") pod \"heat-engine-758cb7689c-5d5jx\" (UID: \"9729bd04-c205-4f62-b74c-92df193ad13e\") " pod="openstack/heat-engine-758cb7689c-5d5jx" Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.867422 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9729bd04-c205-4f62-b74c-92df193ad13e-config-data\") pod \"heat-engine-758cb7689c-5d5jx\" (UID: \"9729bd04-c205-4f62-b74c-92df193ad13e\") " pod="openstack/heat-engine-758cb7689c-5d5jx" Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.867461 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9729bd04-c205-4f62-b74c-92df193ad13e-combined-ca-bundle\") pod \"heat-engine-758cb7689c-5d5jx\" (UID: \"9729bd04-c205-4f62-b74c-92df193ad13e\") " pod="openstack/heat-engine-758cb7689c-5d5jx" Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.867484 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7qnr\" (UniqueName: \"kubernetes.io/projected/9729bd04-c205-4f62-b74c-92df193ad13e-kube-api-access-p7qnr\") pod \"heat-engine-758cb7689c-5d5jx\" (UID: \"9729bd04-c205-4f62-b74c-92df193ad13e\") " pod="openstack/heat-engine-758cb7689c-5d5jx" Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.867594 4901 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/46bccada-d49d-4bed-a7c7-e3ea2a64414a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.867610 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46bccada-d49d-4bed-a7c7-e3ea2a64414a-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.871774 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/46bccada-d49d-4bed-a7c7-e3ea2a64414a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46bccada-d49d-4bed-a7c7-e3ea2a64414a" (UID: "46bccada-d49d-4bed-a7c7-e3ea2a64414a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.880645 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46bccada-d49d-4bed-a7c7-e3ea2a64414a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "46bccada-d49d-4bed-a7c7-e3ea2a64414a" (UID: "46bccada-d49d-4bed-a7c7-e3ea2a64414a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.953726 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-6kjl6"] Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.955497 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688b9f5b49-6kjl6" Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.968960 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9729bd04-c205-4f62-b74c-92df193ad13e-config-data-custom\") pod \"heat-engine-758cb7689c-5d5jx\" (UID: \"9729bd04-c205-4f62-b74c-92df193ad13e\") " pod="openstack/heat-engine-758cb7689c-5d5jx" Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.968991 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9729bd04-c205-4f62-b74c-92df193ad13e-config-data\") pod \"heat-engine-758cb7689c-5d5jx\" (UID: \"9729bd04-c205-4f62-b74c-92df193ad13e\") " pod="openstack/heat-engine-758cb7689c-5d5jx" Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.969039 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9729bd04-c205-4f62-b74c-92df193ad13e-combined-ca-bundle\") pod \"heat-engine-758cb7689c-5d5jx\" (UID: \"9729bd04-c205-4f62-b74c-92df193ad13e\") " pod="openstack/heat-engine-758cb7689c-5d5jx" Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.969066 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7qnr\" (UniqueName: \"kubernetes.io/projected/9729bd04-c205-4f62-b74c-92df193ad13e-kube-api-access-p7qnr\") pod \"heat-engine-758cb7689c-5d5jx\" (UID: \"9729bd04-c205-4f62-b74c-92df193ad13e\") " pod="openstack/heat-engine-758cb7689c-5d5jx" Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.969158 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46bccada-d49d-4bed-a7c7-e3ea2a64414a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.969171 4901 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/46bccada-d49d-4bed-a7c7-e3ea2a64414a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.971028 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-55bf54c5c5-ttz4h"] Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.972294 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-55bf54c5c5-ttz4h" Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.974907 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.979269 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-6kjl6"] Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.981245 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9729bd04-c205-4f62-b74c-92df193ad13e-combined-ca-bundle\") pod \"heat-engine-758cb7689c-5d5jx\" (UID: \"9729bd04-c205-4f62-b74c-92df193ad13e\") " pod="openstack/heat-engine-758cb7689c-5d5jx" Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.985620 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9729bd04-c205-4f62-b74c-92df193ad13e-config-data-custom\") pod \"heat-engine-758cb7689c-5d5jx\" (UID: \"9729bd04-c205-4f62-b74c-92df193ad13e\") " pod="openstack/heat-engine-758cb7689c-5d5jx" Feb 02 10:57:02 crc kubenswrapper[4901]: I0202 10:57:02.990985 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9729bd04-c205-4f62-b74c-92df193ad13e-config-data\") pod \"heat-engine-758cb7689c-5d5jx\" (UID: \"9729bd04-c205-4f62-b74c-92df193ad13e\") " pod="openstack/heat-engine-758cb7689c-5d5jx" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.005928 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-55bf54c5c5-ttz4h"] Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.017526 4901 generic.go:334] "Generic (PLEG): container finished" podID="46bccada-d49d-4bed-a7c7-e3ea2a64414a" containerID="7e48bf1328dc8a5967b5be08ecf501d44f4690f79365d610dff488421d28d45c" exitCode=0 Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.017589 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5fd59b56d-fwr5x" event={"ID":"46bccada-d49d-4bed-a7c7-e3ea2a64414a","Type":"ContainerDied","Data":"7e48bf1328dc8a5967b5be08ecf501d44f4690f79365d610dff488421d28d45c"} Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.017623 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5fd59b56d-fwr5x" event={"ID":"46bccada-d49d-4bed-a7c7-e3ea2a64414a","Type":"ContainerDied","Data":"cd83b9c155cc2efa77b419230cf3db17fa7ffe3db4d8ee0298a64656096e11ae"} Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.017643 4901 scope.go:117] "RemoveContainer" containerID="7e48bf1328dc8a5967b5be08ecf501d44f4690f79365d610dff488421d28d45c" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.017859 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5fd59b56d-fwr5x" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.018319 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7qnr\" (UniqueName: \"kubernetes.io/projected/9729bd04-c205-4f62-b74c-92df193ad13e-kube-api-access-p7qnr\") pod \"heat-engine-758cb7689c-5d5jx\" (UID: \"9729bd04-c205-4f62-b74c-92df193ad13e\") " pod="openstack/heat-engine-758cb7689c-5d5jx" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.033357 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-5cf5f57d85-xv6g6"] Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.041059 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5cf5f57d85-xv6g6" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.045508 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.070871 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/399febbb-a010-490d-bdf3-f80f74257dea-dns-svc\") pod \"dnsmasq-dns-688b9f5b49-6kjl6\" (UID: \"399febbb-a010-490d-bdf3-f80f74257dea\") " pod="openstack/dnsmasq-dns-688b9f5b49-6kjl6" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.070930 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cffg7\" (UniqueName: \"kubernetes.io/projected/c747aeb2-de75-49ae-b04e-c2e1cd27b77d-kube-api-access-cffg7\") pod \"heat-cfnapi-55bf54c5c5-ttz4h\" (UID: \"c747aeb2-de75-49ae-b04e-c2e1cd27b77d\") " pod="openstack/heat-cfnapi-55bf54c5c5-ttz4h" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.070950 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c747aeb2-de75-49ae-b04e-c2e1cd27b77d-config-data\") pod \"heat-cfnapi-55bf54c5c5-ttz4h\" (UID: \"c747aeb2-de75-49ae-b04e-c2e1cd27b77d\") " pod="openstack/heat-cfnapi-55bf54c5c5-ttz4h" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.070977 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/399febbb-a010-490d-bdf3-f80f74257dea-config\") pod \"dnsmasq-dns-688b9f5b49-6kjl6\" (UID: \"399febbb-a010-490d-bdf3-f80f74257dea\") " pod="openstack/dnsmasq-dns-688b9f5b49-6kjl6" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.071007 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c747aeb2-de75-49ae-b04e-c2e1cd27b77d-config-data-custom\") pod \"heat-cfnapi-55bf54c5c5-ttz4h\" (UID: \"c747aeb2-de75-49ae-b04e-c2e1cd27b77d\") " pod="openstack/heat-cfnapi-55bf54c5c5-ttz4h" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.071044 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/399febbb-a010-490d-bdf3-f80f74257dea-ovsdbserver-nb\") pod \"dnsmasq-dns-688b9f5b49-6kjl6\" (UID: \"399febbb-a010-490d-bdf3-f80f74257dea\") " pod="openstack/dnsmasq-dns-688b9f5b49-6kjl6" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.071062 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-nhcxt\" (UniqueName: \"kubernetes.io/projected/399febbb-a010-490d-bdf3-f80f74257dea-kube-api-access-nhcxt\") pod \"dnsmasq-dns-688b9f5b49-6kjl6\" (UID: \"399febbb-a010-490d-bdf3-f80f74257dea\") " pod="openstack/dnsmasq-dns-688b9f5b49-6kjl6" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.071085 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/399febbb-a010-490d-bdf3-f80f74257dea-ovsdbserver-sb\") pod \"dnsmasq-dns-688b9f5b49-6kjl6\" (UID: \"399febbb-a010-490d-bdf3-f80f74257dea\") " pod="openstack/dnsmasq-dns-688b9f5b49-6kjl6" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.071127 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/399febbb-a010-490d-bdf3-f80f74257dea-dns-swift-storage-0\") pod \"dnsmasq-dns-688b9f5b49-6kjl6\" (UID: \"399febbb-a010-490d-bdf3-f80f74257dea\") " pod="openstack/dnsmasq-dns-688b9f5b49-6kjl6" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.071150 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c747aeb2-de75-49ae-b04e-c2e1cd27b77d-combined-ca-bundle\") pod \"heat-cfnapi-55bf54c5c5-ttz4h\" (UID: \"c747aeb2-de75-49ae-b04e-c2e1cd27b77d\") " pod="openstack/heat-cfnapi-55bf54c5c5-ttz4h" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.071272 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5cf5f57d85-xv6g6"] Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.078387 4901 scope.go:117] "RemoveContainer" containerID="92f4b64beca26fb4cbf6f403b155919544c80c90fc783ff5fda303528cd05689" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.080529 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5fd59b56d-fwr5x"] Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.090218 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-5fd59b56d-fwr5x"] Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.124044 4901 scope.go:117] "RemoveContainer" containerID="7e48bf1328dc8a5967b5be08ecf501d44f4690f79365d610dff488421d28d45c" Feb 02 10:57:03 crc kubenswrapper[4901]: E0202 10:57:03.125125 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e48bf1328dc8a5967b5be08ecf501d44f4690f79365d610dff488421d28d45c\": container with ID starting with 7e48bf1328dc8a5967b5be08ecf501d44f4690f79365d610dff488421d28d45c not found: ID does not exist" containerID="7e48bf1328dc8a5967b5be08ecf501d44f4690f79365d610dff488421d28d45c" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.125155 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e48bf1328dc8a5967b5be08ecf501d44f4690f79365d610dff488421d28d45c"} err="failed to get container status \"7e48bf1328dc8a5967b5be08ecf501d44f4690f79365d610dff488421d28d45c\": rpc error: code = NotFound desc = could not find container \"7e48bf1328dc8a5967b5be08ecf501d44f4690f79365d610dff488421d28d45c\": container with ID starting with 7e48bf1328dc8a5967b5be08ecf501d44f4690f79365d610dff488421d28d45c not found: ID does not exist" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.125176 4901 scope.go:117] "RemoveContainer" 
containerID="92f4b64beca26fb4cbf6f403b155919544c80c90fc783ff5fda303528cd05689" Feb 02 10:57:03 crc kubenswrapper[4901]: E0202 10:57:03.134741 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92f4b64beca26fb4cbf6f403b155919544c80c90fc783ff5fda303528cd05689\": container with ID starting with 92f4b64beca26fb4cbf6f403b155919544c80c90fc783ff5fda303528cd05689 not found: ID does not exist" containerID="92f4b64beca26fb4cbf6f403b155919544c80c90fc783ff5fda303528cd05689" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.134809 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92f4b64beca26fb4cbf6f403b155919544c80c90fc783ff5fda303528cd05689"} err="failed to get container status \"92f4b64beca26fb4cbf6f403b155919544c80c90fc783ff5fda303528cd05689\": rpc error: code = NotFound desc = could not find container \"92f4b64beca26fb4cbf6f403b155919544c80c90fc783ff5fda303528cd05689\": container with ID starting with 92f4b64beca26fb4cbf6f403b155919544c80c90fc783ff5fda303528cd05689 not found: ID does not exist" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.180147 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-758cb7689c-5d5jx" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.181834 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/169d32ef-3d4c-4f19-861a-afbc638d72df-combined-ca-bundle\") pod \"heat-api-5cf5f57d85-xv6g6\" (UID: \"169d32ef-3d4c-4f19-861a-afbc638d72df\") " pod="openstack/heat-api-5cf5f57d85-xv6g6" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.181892 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/399febbb-a010-490d-bdf3-f80f74257dea-dns-svc\") pod \"dnsmasq-dns-688b9f5b49-6kjl6\" (UID: \"399febbb-a010-490d-bdf3-f80f74257dea\") " pod="openstack/dnsmasq-dns-688b9f5b49-6kjl6" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.181923 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cffg7\" (UniqueName: \"kubernetes.io/projected/c747aeb2-de75-49ae-b04e-c2e1cd27b77d-kube-api-access-cffg7\") pod \"heat-cfnapi-55bf54c5c5-ttz4h\" (UID: \"c747aeb2-de75-49ae-b04e-c2e1cd27b77d\") " pod="openstack/heat-cfnapi-55bf54c5c5-ttz4h" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.181941 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c747aeb2-de75-49ae-b04e-c2e1cd27b77d-config-data\") pod \"heat-cfnapi-55bf54c5c5-ttz4h\" (UID: \"c747aeb2-de75-49ae-b04e-c2e1cd27b77d\") " pod="openstack/heat-cfnapi-55bf54c5c5-ttz4h" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.181967 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/399febbb-a010-490d-bdf3-f80f74257dea-config\") pod \"dnsmasq-dns-688b9f5b49-6kjl6\" (UID: \"399febbb-a010-490d-bdf3-f80f74257dea\") " pod="openstack/dnsmasq-dns-688b9f5b49-6kjl6" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.181994 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c747aeb2-de75-49ae-b04e-c2e1cd27b77d-config-data-custom\") pod 
\"heat-cfnapi-55bf54c5c5-ttz4h\" (UID: \"c747aeb2-de75-49ae-b04e-c2e1cd27b77d\") " pod="openstack/heat-cfnapi-55bf54c5c5-ttz4h" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.182013 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/169d32ef-3d4c-4f19-861a-afbc638d72df-config-data\") pod \"heat-api-5cf5f57d85-xv6g6\" (UID: \"169d32ef-3d4c-4f19-861a-afbc638d72df\") " pod="openstack/heat-api-5cf5f57d85-xv6g6" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.182050 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/399febbb-a010-490d-bdf3-f80f74257dea-ovsdbserver-nb\") pod \"dnsmasq-dns-688b9f5b49-6kjl6\" (UID: \"399febbb-a010-490d-bdf3-f80f74257dea\") " pod="openstack/dnsmasq-dns-688b9f5b49-6kjl6" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.182072 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhcxt\" (UniqueName: \"kubernetes.io/projected/399febbb-a010-490d-bdf3-f80f74257dea-kube-api-access-nhcxt\") pod \"dnsmasq-dns-688b9f5b49-6kjl6\" (UID: \"399febbb-a010-490d-bdf3-f80f74257dea\") " pod="openstack/dnsmasq-dns-688b9f5b49-6kjl6" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.182095 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/399febbb-a010-490d-bdf3-f80f74257dea-ovsdbserver-sb\") pod \"dnsmasq-dns-688b9f5b49-6kjl6\" (UID: \"399febbb-a010-490d-bdf3-f80f74257dea\") " pod="openstack/dnsmasq-dns-688b9f5b49-6kjl6" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.182125 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/399febbb-a010-490d-bdf3-f80f74257dea-dns-swift-storage-0\") pod \"dnsmasq-dns-688b9f5b49-6kjl6\" (UID: \"399febbb-a010-490d-bdf3-f80f74257dea\") " pod="openstack/dnsmasq-dns-688b9f5b49-6kjl6" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.182147 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c747aeb2-de75-49ae-b04e-c2e1cd27b77d-combined-ca-bundle\") pod \"heat-cfnapi-55bf54c5c5-ttz4h\" (UID: \"c747aeb2-de75-49ae-b04e-c2e1cd27b77d\") " pod="openstack/heat-cfnapi-55bf54c5c5-ttz4h" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.182163 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/169d32ef-3d4c-4f19-861a-afbc638d72df-config-data-custom\") pod \"heat-api-5cf5f57d85-xv6g6\" (UID: \"169d32ef-3d4c-4f19-861a-afbc638d72df\") " pod="openstack/heat-api-5cf5f57d85-xv6g6" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.182192 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdrhl\" (UniqueName: \"kubernetes.io/projected/169d32ef-3d4c-4f19-861a-afbc638d72df-kube-api-access-wdrhl\") pod \"heat-api-5cf5f57d85-xv6g6\" (UID: \"169d32ef-3d4c-4f19-861a-afbc638d72df\") " pod="openstack/heat-api-5cf5f57d85-xv6g6" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.185431 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/399febbb-a010-490d-bdf3-f80f74257dea-ovsdbserver-sb\") pod \"dnsmasq-dns-688b9f5b49-6kjl6\" (UID: \"399febbb-a010-490d-bdf3-f80f74257dea\") " pod="openstack/dnsmasq-dns-688b9f5b49-6kjl6" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.185871 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/399febbb-a010-490d-bdf3-f80f74257dea-ovsdbserver-nb\") pod \"dnsmasq-dns-688b9f5b49-6kjl6\" (UID: \"399febbb-a010-490d-bdf3-f80f74257dea\") " pod="openstack/dnsmasq-dns-688b9f5b49-6kjl6" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.186629 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/399febbb-a010-490d-bdf3-f80f74257dea-dns-svc\") pod \"dnsmasq-dns-688b9f5b49-6kjl6\" (UID: \"399febbb-a010-490d-bdf3-f80f74257dea\") " pod="openstack/dnsmasq-dns-688b9f5b49-6kjl6" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.189935 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/399febbb-a010-490d-bdf3-f80f74257dea-config\") pod \"dnsmasq-dns-688b9f5b49-6kjl6\" (UID: \"399febbb-a010-490d-bdf3-f80f74257dea\") " pod="openstack/dnsmasq-dns-688b9f5b49-6kjl6" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.190707 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/399febbb-a010-490d-bdf3-f80f74257dea-dns-swift-storage-0\") pod \"dnsmasq-dns-688b9f5b49-6kjl6\" (UID: \"399febbb-a010-490d-bdf3-f80f74257dea\") " pod="openstack/dnsmasq-dns-688b9f5b49-6kjl6" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.222063 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c747aeb2-de75-49ae-b04e-c2e1cd27b77d-config-data-custom\") pod \"heat-cfnapi-55bf54c5c5-ttz4h\" (UID: \"c747aeb2-de75-49ae-b04e-c2e1cd27b77d\") " pod="openstack/heat-cfnapi-55bf54c5c5-ttz4h" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.222217 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c747aeb2-de75-49ae-b04e-c2e1cd27b77d-config-data\") pod \"heat-cfnapi-55bf54c5c5-ttz4h\" (UID: \"c747aeb2-de75-49ae-b04e-c2e1cd27b77d\") " pod="openstack/heat-cfnapi-55bf54c5c5-ttz4h" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.224309 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c747aeb2-de75-49ae-b04e-c2e1cd27b77d-combined-ca-bundle\") pod \"heat-cfnapi-55bf54c5c5-ttz4h\" (UID: \"c747aeb2-de75-49ae-b04e-c2e1cd27b77d\") " pod="openstack/heat-cfnapi-55bf54c5c5-ttz4h" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.234810 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhcxt\" (UniqueName: \"kubernetes.io/projected/399febbb-a010-490d-bdf3-f80f74257dea-kube-api-access-nhcxt\") pod \"dnsmasq-dns-688b9f5b49-6kjl6\" (UID: \"399febbb-a010-490d-bdf3-f80f74257dea\") " pod="openstack/dnsmasq-dns-688b9f5b49-6kjl6" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.244868 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cffg7\" (UniqueName: \"kubernetes.io/projected/c747aeb2-de75-49ae-b04e-c2e1cd27b77d-kube-api-access-cffg7\") pod 
\"heat-cfnapi-55bf54c5c5-ttz4h\" (UID: \"c747aeb2-de75-49ae-b04e-c2e1cd27b77d\") " pod="openstack/heat-cfnapi-55bf54c5c5-ttz4h" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.288824 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/169d32ef-3d4c-4f19-861a-afbc638d72df-config-data-custom\") pod \"heat-api-5cf5f57d85-xv6g6\" (UID: \"169d32ef-3d4c-4f19-861a-afbc638d72df\") " pod="openstack/heat-api-5cf5f57d85-xv6g6" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.288889 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdrhl\" (UniqueName: \"kubernetes.io/projected/169d32ef-3d4c-4f19-861a-afbc638d72df-kube-api-access-wdrhl\") pod \"heat-api-5cf5f57d85-xv6g6\" (UID: \"169d32ef-3d4c-4f19-861a-afbc638d72df\") " pod="openstack/heat-api-5cf5f57d85-xv6g6" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.288961 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/169d32ef-3d4c-4f19-861a-afbc638d72df-combined-ca-bundle\") pod \"heat-api-5cf5f57d85-xv6g6\" (UID: \"169d32ef-3d4c-4f19-861a-afbc638d72df\") " pod="openstack/heat-api-5cf5f57d85-xv6g6" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.289033 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/169d32ef-3d4c-4f19-861a-afbc638d72df-config-data\") pod \"heat-api-5cf5f57d85-xv6g6\" (UID: \"169d32ef-3d4c-4f19-861a-afbc638d72df\") " pod="openstack/heat-api-5cf5f57d85-xv6g6" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.301999 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/169d32ef-3d4c-4f19-861a-afbc638d72df-combined-ca-bundle\") pod \"heat-api-5cf5f57d85-xv6g6\" (UID: \"169d32ef-3d4c-4f19-861a-afbc638d72df\") " pod="openstack/heat-api-5cf5f57d85-xv6g6" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.304331 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/169d32ef-3d4c-4f19-861a-afbc638d72df-config-data-custom\") pod \"heat-api-5cf5f57d85-xv6g6\" (UID: \"169d32ef-3d4c-4f19-861a-afbc638d72df\") " pod="openstack/heat-api-5cf5f57d85-xv6g6" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.320851 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/169d32ef-3d4c-4f19-861a-afbc638d72df-config-data\") pod \"heat-api-5cf5f57d85-xv6g6\" (UID: \"169d32ef-3d4c-4f19-861a-afbc638d72df\") " pod="openstack/heat-api-5cf5f57d85-xv6g6" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.321299 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688b9f5b49-6kjl6" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.335975 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-55bf54c5c5-ttz4h" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.355380 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdrhl\" (UniqueName: \"kubernetes.io/projected/169d32ef-3d4c-4f19-861a-afbc638d72df-kube-api-access-wdrhl\") pod \"heat-api-5cf5f57d85-xv6g6\" (UID: \"169d32ef-3d4c-4f19-861a-afbc638d72df\") " pod="openstack/heat-api-5cf5f57d85-xv6g6" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.400904 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5cf5f57d85-xv6g6" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.692556 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46bccada-d49d-4bed-a7c7-e3ea2a64414a" path="/var/lib/kubelet/pods/46bccada-d49d-4bed-a7c7-e3ea2a64414a/volumes" Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.942970 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-758cb7689c-5d5jx"] Feb 02 10:57:03 crc kubenswrapper[4901]: I0202 10:57:03.966299 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-6kjl6"] Feb 02 10:57:03 crc kubenswrapper[4901]: W0202 10:57:03.977373 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod399febbb_a010_490d_bdf3_f80f74257dea.slice/crio-f93b17c8e476a80d37d55d09349d4f9c5461171b30a8661e4a574d07dd8107e4 WatchSource:0}: Error finding container f93b17c8e476a80d37d55d09349d4f9c5461171b30a8661e4a574d07dd8107e4: Status 404 returned error can't find the container with id f93b17c8e476a80d37d55d09349d4f9c5461171b30a8661e4a574d07dd8107e4 Feb 02 10:57:04 crc kubenswrapper[4901]: I0202 10:57:04.047344 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-6kjl6" event={"ID":"399febbb-a010-490d-bdf3-f80f74257dea","Type":"ContainerStarted","Data":"f93b17c8e476a80d37d55d09349d4f9c5461171b30a8661e4a574d07dd8107e4"} Feb 02 10:57:04 crc kubenswrapper[4901]: I0202 10:57:04.053017 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-758cb7689c-5d5jx" event={"ID":"9729bd04-c205-4f62-b74c-92df193ad13e","Type":"ContainerStarted","Data":"87bec3ee032cb978e46d257e4f32d4f408e321ee45f4db3e888e49e9bf7663de"} Feb 02 10:57:04 crc kubenswrapper[4901]: I0202 10:57:04.111919 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5cf5f57d85-xv6g6"] Feb 02 10:57:04 crc kubenswrapper[4901]: I0202 10:57:04.119460 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-55bf54c5c5-ttz4h"] Feb 02 10:57:04 crc kubenswrapper[4901]: I0202 10:57:04.893588 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 02 10:57:05 crc kubenswrapper[4901]: I0202 10:57:05.076881 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5cf5f57d85-xv6g6" event={"ID":"169d32ef-3d4c-4f19-861a-afbc638d72df","Type":"ContainerStarted","Data":"fd142998a0eca5d50b2ff238658e4325174d7f7413fee5b10c5b16ec59bc644f"} Feb 02 10:57:05 crc kubenswrapper[4901]: I0202 10:57:05.078679 4901 generic.go:334] "Generic (PLEG): container finished" podID="399febbb-a010-490d-bdf3-f80f74257dea" containerID="fe2e0524953aedfb2e4fda3b3a33d4737c87c6ed3d490487cc364b1efddaa3e2" exitCode=0 Feb 02 10:57:05 crc kubenswrapper[4901]: I0202 10:57:05.078736 4901 kubelet.go:2453] "SyncLoop 
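The "SyncLoop (PLEG)" entries above and below come from the pod lifecycle event generator, which periodically relists containers from the runtime and converts observed state transitions into ContainerStarted/ContainerDied events for the sync loop. A rough stdlib-only sketch of that relist diff, assuming simplified state and event types rather than the kubelet's actual ones:

package main

import "fmt"

type state int

const (
	running state = iota
	exited
)

type event struct {
	PodID, ContainerID string
	Type               string // "ContainerStarted" or "ContainerDied"
}

// relist compares the previous and current container states and emits
// one lifecycle event per observed transition, as PLEG does on each pass.
func relist(pod string, old, cur map[string]state) []event {
	var events []event
	for id, s := range cur {
		prev, seen := old[id]
		switch {
		case !seen && s == running:
			events = append(events, event{pod, id, "ContainerStarted"})
		case seen && prev == running && s == exited:
			events = append(events, event{pod, id, "ContainerDied"})
		}
	}
	return events
}

func main() {
	// Mirrors the dnsmasq pod above: the init container exits, the main one starts.
	old := map[string]state{"fe2e0524": running}
	cur := map[string]state{"fe2e0524": exited, "fbe779a7": running}
	for _, e := range relist("openstack/dnsmasq-dns-688b9f5b49-6kjl6", old, cur) {
		fmt.Printf("%s %s %s\n", e.Type, e.PodID, e.ContainerID)
	}
}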
(PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-6kjl6" event={"ID":"399febbb-a010-490d-bdf3-f80f74257dea","Type":"ContainerDied","Data":"fe2e0524953aedfb2e4fda3b3a33d4737c87c6ed3d490487cc364b1efddaa3e2"} Feb 02 10:57:05 crc kubenswrapper[4901]: I0202 10:57:05.096895 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-758cb7689c-5d5jx" event={"ID":"9729bd04-c205-4f62-b74c-92df193ad13e","Type":"ContainerStarted","Data":"64f52b2d773ad79fa8c0f163e2f84dfc92b3b8ca9a150a759620f148c4fa1561"} Feb 02 10:57:05 crc kubenswrapper[4901]: I0202 10:57:05.098263 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-758cb7689c-5d5jx" Feb 02 10:57:05 crc kubenswrapper[4901]: I0202 10:57:05.099065 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-55bf54c5c5-ttz4h" event={"ID":"c747aeb2-de75-49ae-b04e-c2e1cd27b77d","Type":"ContainerStarted","Data":"78321197b0942735b01b21375c95a36ef6b388732e014334331cc9a7a1296269"} Feb 02 10:57:05 crc kubenswrapper[4901]: I0202 10:57:05.162138 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-758cb7689c-5d5jx" podStartSLOduration=3.161020545 podStartE2EDuration="3.161020545s" podCreationTimestamp="2026-02-02 10:57:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:57:05.147372295 +0000 UTC m=+1112.165712391" watchObservedRunningTime="2026-02-02 10:57:05.161020545 +0000 UTC m=+1112.179360641" Feb 02 10:57:05 crc kubenswrapper[4901]: I0202 10:57:05.253698 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:57:05 crc kubenswrapper[4901]: I0202 10:57:05.254811 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ed6baff-0886-4acb-b063-5807ec75d169" containerName="ceilometer-central-agent" containerID="cri-o://007aa571a93db60a7baf10afed2a0b4aba84468020f47a78b334dba3f481bcaa" gracePeriod=30 Feb 02 10:57:05 crc kubenswrapper[4901]: I0202 10:57:05.256049 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ed6baff-0886-4acb-b063-5807ec75d169" containerName="proxy-httpd" containerID="cri-o://755476eee288b322d93ba27f8c554fff2456a8dc59e911a29943ca0ee4d6f28c" gracePeriod=30 Feb 02 10:57:05 crc kubenswrapper[4901]: I0202 10:57:05.256672 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ed6baff-0886-4acb-b063-5807ec75d169" containerName="ceilometer-notification-agent" containerID="cri-o://31145ddf5c2b8a55b22db930db9ae275fd40ebe87f6da75bfe738943ad0660a7" gracePeriod=30 Feb 02 10:57:05 crc kubenswrapper[4901]: I0202 10:57:05.256743 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ed6baff-0886-4acb-b063-5807ec75d169" containerName="sg-core" containerID="cri-o://137cd9c17ee4ac57c1c302e50c86191b1f409a6540089c72fce7948326623312" gracePeriod=30 Feb 02 10:57:05 crc kubenswrapper[4901]: I0202 10:57:05.273021 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 02 10:57:05 crc kubenswrapper[4901]: I0202 10:57:05.477176 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-96f76df57-tdlmx"] Feb 02 10:57:05 crc kubenswrapper[4901]: I0202 10:57:05.479592 4901 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-96f76df57-tdlmx" Feb 02 10:57:05 crc kubenswrapper[4901]: I0202 10:57:05.482963 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 02 10:57:05 crc kubenswrapper[4901]: I0202 10:57:05.483160 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 02 10:57:05 crc kubenswrapper[4901]: I0202 10:57:05.483286 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 02 10:57:05 crc kubenswrapper[4901]: I0202 10:57:05.498026 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-96f76df57-tdlmx"] Feb 02 10:57:05 crc kubenswrapper[4901]: I0202 10:57:05.553149 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d2b7428-ac02-4aed-8a90-30cc198e4cca-log-httpd\") pod \"swift-proxy-96f76df57-tdlmx\" (UID: \"4d2b7428-ac02-4aed-8a90-30cc198e4cca\") " pod="openstack/swift-proxy-96f76df57-tdlmx" Feb 02 10:57:05 crc kubenswrapper[4901]: I0202 10:57:05.553252 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d2b7428-ac02-4aed-8a90-30cc198e4cca-internal-tls-certs\") pod \"swift-proxy-96f76df57-tdlmx\" (UID: \"4d2b7428-ac02-4aed-8a90-30cc198e4cca\") " pod="openstack/swift-proxy-96f76df57-tdlmx" Feb 02 10:57:05 crc kubenswrapper[4901]: I0202 10:57:05.553273 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jd2c\" (UniqueName: \"kubernetes.io/projected/4d2b7428-ac02-4aed-8a90-30cc198e4cca-kube-api-access-9jd2c\") pod \"swift-proxy-96f76df57-tdlmx\" (UID: \"4d2b7428-ac02-4aed-8a90-30cc198e4cca\") " pod="openstack/swift-proxy-96f76df57-tdlmx" Feb 02 10:57:05 crc kubenswrapper[4901]: I0202 10:57:05.553297 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d2b7428-ac02-4aed-8a90-30cc198e4cca-combined-ca-bundle\") pod \"swift-proxy-96f76df57-tdlmx\" (UID: \"4d2b7428-ac02-4aed-8a90-30cc198e4cca\") " pod="openstack/swift-proxy-96f76df57-tdlmx" Feb 02 10:57:05 crc kubenswrapper[4901]: I0202 10:57:05.553336 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d2b7428-ac02-4aed-8a90-30cc198e4cca-run-httpd\") pod \"swift-proxy-96f76df57-tdlmx\" (UID: \"4d2b7428-ac02-4aed-8a90-30cc198e4cca\") " pod="openstack/swift-proxy-96f76df57-tdlmx" Feb 02 10:57:05 crc kubenswrapper[4901]: I0202 10:57:05.553353 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d2b7428-ac02-4aed-8a90-30cc198e4cca-config-data\") pod \"swift-proxy-96f76df57-tdlmx\" (UID: \"4d2b7428-ac02-4aed-8a90-30cc198e4cca\") " pod="openstack/swift-proxy-96f76df57-tdlmx" Feb 02 10:57:05 crc kubenswrapper[4901]: I0202 10:57:05.553400 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d2b7428-ac02-4aed-8a90-30cc198e4cca-etc-swift\") pod \"swift-proxy-96f76df57-tdlmx\" (UID: \"4d2b7428-ac02-4aed-8a90-30cc198e4cca\") " 
pod="openstack/swift-proxy-96f76df57-tdlmx" Feb 02 10:57:05 crc kubenswrapper[4901]: I0202 10:57:05.553425 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d2b7428-ac02-4aed-8a90-30cc198e4cca-public-tls-certs\") pod \"swift-proxy-96f76df57-tdlmx\" (UID: \"4d2b7428-ac02-4aed-8a90-30cc198e4cca\") " pod="openstack/swift-proxy-96f76df57-tdlmx" Feb 02 10:57:05 crc kubenswrapper[4901]: I0202 10:57:05.654362 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d2b7428-ac02-4aed-8a90-30cc198e4cca-etc-swift\") pod \"swift-proxy-96f76df57-tdlmx\" (UID: \"4d2b7428-ac02-4aed-8a90-30cc198e4cca\") " pod="openstack/swift-proxy-96f76df57-tdlmx" Feb 02 10:57:05 crc kubenswrapper[4901]: I0202 10:57:05.654423 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d2b7428-ac02-4aed-8a90-30cc198e4cca-public-tls-certs\") pod \"swift-proxy-96f76df57-tdlmx\" (UID: \"4d2b7428-ac02-4aed-8a90-30cc198e4cca\") " pod="openstack/swift-proxy-96f76df57-tdlmx" Feb 02 10:57:05 crc kubenswrapper[4901]: I0202 10:57:05.654489 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d2b7428-ac02-4aed-8a90-30cc198e4cca-log-httpd\") pod \"swift-proxy-96f76df57-tdlmx\" (UID: \"4d2b7428-ac02-4aed-8a90-30cc198e4cca\") " pod="openstack/swift-proxy-96f76df57-tdlmx" Feb 02 10:57:05 crc kubenswrapper[4901]: I0202 10:57:05.654537 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d2b7428-ac02-4aed-8a90-30cc198e4cca-internal-tls-certs\") pod \"swift-proxy-96f76df57-tdlmx\" (UID: \"4d2b7428-ac02-4aed-8a90-30cc198e4cca\") " pod="openstack/swift-proxy-96f76df57-tdlmx" Feb 02 10:57:05 crc kubenswrapper[4901]: I0202 10:57:05.654575 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jd2c\" (UniqueName: \"kubernetes.io/projected/4d2b7428-ac02-4aed-8a90-30cc198e4cca-kube-api-access-9jd2c\") pod \"swift-proxy-96f76df57-tdlmx\" (UID: \"4d2b7428-ac02-4aed-8a90-30cc198e4cca\") " pod="openstack/swift-proxy-96f76df57-tdlmx" Feb 02 10:57:05 crc kubenswrapper[4901]: I0202 10:57:05.655264 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d2b7428-ac02-4aed-8a90-30cc198e4cca-combined-ca-bundle\") pod \"swift-proxy-96f76df57-tdlmx\" (UID: \"4d2b7428-ac02-4aed-8a90-30cc198e4cca\") " pod="openstack/swift-proxy-96f76df57-tdlmx" Feb 02 10:57:05 crc kubenswrapper[4901]: I0202 10:57:05.655312 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d2b7428-ac02-4aed-8a90-30cc198e4cca-run-httpd\") pod \"swift-proxy-96f76df57-tdlmx\" (UID: \"4d2b7428-ac02-4aed-8a90-30cc198e4cca\") " pod="openstack/swift-proxy-96f76df57-tdlmx" Feb 02 10:57:05 crc kubenswrapper[4901]: I0202 10:57:05.655332 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d2b7428-ac02-4aed-8a90-30cc198e4cca-config-data\") pod \"swift-proxy-96f76df57-tdlmx\" (UID: \"4d2b7428-ac02-4aed-8a90-30cc198e4cca\") " pod="openstack/swift-proxy-96f76df57-tdlmx" Feb 02 10:57:05 crc 
kubenswrapper[4901]: I0202 10:57:05.655614 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d2b7428-ac02-4aed-8a90-30cc198e4cca-log-httpd\") pod \"swift-proxy-96f76df57-tdlmx\" (UID: \"4d2b7428-ac02-4aed-8a90-30cc198e4cca\") " pod="openstack/swift-proxy-96f76df57-tdlmx" Feb 02 10:57:05 crc kubenswrapper[4901]: I0202 10:57:05.655863 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d2b7428-ac02-4aed-8a90-30cc198e4cca-run-httpd\") pod \"swift-proxy-96f76df57-tdlmx\" (UID: \"4d2b7428-ac02-4aed-8a90-30cc198e4cca\") " pod="openstack/swift-proxy-96f76df57-tdlmx" Feb 02 10:57:05 crc kubenswrapper[4901]: I0202 10:57:05.661044 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d2b7428-ac02-4aed-8a90-30cc198e4cca-public-tls-certs\") pod \"swift-proxy-96f76df57-tdlmx\" (UID: \"4d2b7428-ac02-4aed-8a90-30cc198e4cca\") " pod="openstack/swift-proxy-96f76df57-tdlmx" Feb 02 10:57:05 crc kubenswrapper[4901]: I0202 10:57:05.661098 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d2b7428-ac02-4aed-8a90-30cc198e4cca-internal-tls-certs\") pod \"swift-proxy-96f76df57-tdlmx\" (UID: \"4d2b7428-ac02-4aed-8a90-30cc198e4cca\") " pod="openstack/swift-proxy-96f76df57-tdlmx" Feb 02 10:57:05 crc kubenswrapper[4901]: I0202 10:57:05.662703 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d2b7428-ac02-4aed-8a90-30cc198e4cca-etc-swift\") pod \"swift-proxy-96f76df57-tdlmx\" (UID: \"4d2b7428-ac02-4aed-8a90-30cc198e4cca\") " pod="openstack/swift-proxy-96f76df57-tdlmx" Feb 02 10:57:05 crc kubenswrapper[4901]: I0202 10:57:05.662853 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d2b7428-ac02-4aed-8a90-30cc198e4cca-config-data\") pod \"swift-proxy-96f76df57-tdlmx\" (UID: \"4d2b7428-ac02-4aed-8a90-30cc198e4cca\") " pod="openstack/swift-proxy-96f76df57-tdlmx" Feb 02 10:57:05 crc kubenswrapper[4901]: I0202 10:57:05.664988 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d2b7428-ac02-4aed-8a90-30cc198e4cca-combined-ca-bundle\") pod \"swift-proxy-96f76df57-tdlmx\" (UID: \"4d2b7428-ac02-4aed-8a90-30cc198e4cca\") " pod="openstack/swift-proxy-96f76df57-tdlmx" Feb 02 10:57:05 crc kubenswrapper[4901]: I0202 10:57:05.669342 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jd2c\" (UniqueName: \"kubernetes.io/projected/4d2b7428-ac02-4aed-8a90-30cc198e4cca-kube-api-access-9jd2c\") pod \"swift-proxy-96f76df57-tdlmx\" (UID: \"4d2b7428-ac02-4aed-8a90-30cc198e4cca\") " pod="openstack/swift-proxy-96f76df57-tdlmx" Feb 02 10:57:05 crc kubenswrapper[4901]: I0202 10:57:05.815581 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-96f76df57-tdlmx" Feb 02 10:57:06 crc kubenswrapper[4901]: I0202 10:57:06.118731 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-6kjl6" event={"ID":"399febbb-a010-490d-bdf3-f80f74257dea","Type":"ContainerStarted","Data":"fbe779a7fab448eab02904d83b4382a2af4150de16403dce77b56b73af504f09"} Feb 02 10:57:06 crc kubenswrapper[4901]: I0202 10:57:06.119209 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-688b9f5b49-6kjl6" Feb 02 10:57:06 crc kubenswrapper[4901]: I0202 10:57:06.124015 4901 generic.go:334] "Generic (PLEG): container finished" podID="1ed6baff-0886-4acb-b063-5807ec75d169" containerID="755476eee288b322d93ba27f8c554fff2456a8dc59e911a29943ca0ee4d6f28c" exitCode=0 Feb 02 10:57:06 crc kubenswrapper[4901]: I0202 10:57:06.124053 4901 generic.go:334] "Generic (PLEG): container finished" podID="1ed6baff-0886-4acb-b063-5807ec75d169" containerID="137cd9c17ee4ac57c1c302e50c86191b1f409a6540089c72fce7948326623312" exitCode=2 Feb 02 10:57:06 crc kubenswrapper[4901]: I0202 10:57:06.124066 4901 generic.go:334] "Generic (PLEG): container finished" podID="1ed6baff-0886-4acb-b063-5807ec75d169" containerID="007aa571a93db60a7baf10afed2a0b4aba84468020f47a78b334dba3f481bcaa" exitCode=0 Feb 02 10:57:06 crc kubenswrapper[4901]: I0202 10:57:06.124109 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ed6baff-0886-4acb-b063-5807ec75d169","Type":"ContainerDied","Data":"755476eee288b322d93ba27f8c554fff2456a8dc59e911a29943ca0ee4d6f28c"} Feb 02 10:57:06 crc kubenswrapper[4901]: I0202 10:57:06.124166 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ed6baff-0886-4acb-b063-5807ec75d169","Type":"ContainerDied","Data":"137cd9c17ee4ac57c1c302e50c86191b1f409a6540089c72fce7948326623312"} Feb 02 10:57:06 crc kubenswrapper[4901]: I0202 10:57:06.124178 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ed6baff-0886-4acb-b063-5807ec75d169","Type":"ContainerDied","Data":"007aa571a93db60a7baf10afed2a0b4aba84468020f47a78b334dba3f481bcaa"} Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.132827 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-688b9f5b49-6kjl6" podStartSLOduration=7.132811316 podStartE2EDuration="7.132811316s" podCreationTimestamp="2026-02-02 10:57:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:57:06.141756155 +0000 UTC m=+1113.160096251" watchObservedRunningTime="2026-02-02 10:57:09.132811316 +0000 UTC m=+1116.151151412" Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.140335 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.140612 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e2c6c553-9669-4fc0-a72b-9a528764e7a8" containerName="glance-log" containerID="cri-o://e776cbe14f8d58e99e75781ca4a73a8e901109010ae260a63f523ec58fe89672" gracePeriod=30 Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.140816 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e2c6c553-9669-4fc0-a72b-9a528764e7a8" 
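The "Killing container with a grace period" entries (gracePeriod=30) follow the usual CRI stop semantics: the runtime signals the container to terminate, waits up to the grace period for it to exit, and only then force-kills it; the ceilometer containers above exiting with codes 0 and 2 inside that window is the graceful path succeeding. A small stdlib-only sketch of the wait-then-kill timing, with the exited channel and stopContainer helper as assumed names rather than real kubelet symbols:

package main

import (
	"context"
	"fmt"
	"time"
)

// stopContainer models the StopContainer contract behind these log entries:
// deliver the stop signal, wait up to gracePeriod for the container to exit,
// then fall back to a hard kill.
func stopContainer(ctx context.Context, exited <-chan struct{}, gracePeriod time.Duration) string {
	ctx, cancel := context.WithTimeout(ctx, gracePeriod)
	defer cancel()
	// (a real runtime would deliver SIGTERM to the container's init here)
	select {
	case <-exited:
		return "exited within grace period"
	case <-ctx.Done():
		return "grace period elapsed; sending SIGKILL"
	}
}

func main() {
	exited := make(chan struct{})
	go func() { time.Sleep(10 * time.Millisecond); close(exited) }()
	// gracePeriod=30 (seconds) in the log; shortened here so the example runs quickly.
	fmt.Println(stopContainer(context.Background(), exited, 100*time.Millisecond))
}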
containerName="glance-httpd" containerID="cri-o://f43ac399bb39abd5da0adec25c3d3c907dd420250f8571799f416809bbc2ad9d" gracePeriod=30 Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.172545 4901 generic.go:334] "Generic (PLEG): container finished" podID="1ed6baff-0886-4acb-b063-5807ec75d169" containerID="31145ddf5c2b8a55b22db930db9ae275fd40ebe87f6da75bfe738943ad0660a7" exitCode=0 Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.172604 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ed6baff-0886-4acb-b063-5807ec75d169","Type":"ContainerDied","Data":"31145ddf5c2b8a55b22db930db9ae275fd40ebe87f6da75bfe738943ad0660a7"} Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.578936 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-79dbc4cd68-ltht2"] Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.580429 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-79dbc4cd68-ltht2" Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.592794 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-79dbc4cd68-ltht2"] Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.605655 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-54947664d6-2bv6x"] Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.606933 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-54947664d6-2bv6x" Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.637299 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krwh4\" (UniqueName: \"kubernetes.io/projected/b47d3039-22e1-42c8-b23f-9c5f6dcb51f6-kube-api-access-krwh4\") pod \"heat-engine-79dbc4cd68-ltht2\" (UID: \"b47d3039-22e1-42c8-b23f-9c5f6dcb51f6\") " pod="openstack/heat-engine-79dbc4cd68-ltht2" Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.637346 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fdf4e65f-ea75-489a-9183-e0a9f290345e-config-data-custom\") pod \"heat-api-54947664d6-2bv6x\" (UID: \"fdf4e65f-ea75-489a-9183-e0a9f290345e\") " pod="openstack/heat-api-54947664d6-2bv6x" Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.637386 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b47d3039-22e1-42c8-b23f-9c5f6dcb51f6-config-data-custom\") pod \"heat-engine-79dbc4cd68-ltht2\" (UID: \"b47d3039-22e1-42c8-b23f-9c5f6dcb51f6\") " pod="openstack/heat-engine-79dbc4cd68-ltht2" Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.637412 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b47d3039-22e1-42c8-b23f-9c5f6dcb51f6-combined-ca-bundle\") pod \"heat-engine-79dbc4cd68-ltht2\" (UID: \"b47d3039-22e1-42c8-b23f-9c5f6dcb51f6\") " pod="openstack/heat-engine-79dbc4cd68-ltht2" Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.641824 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdf4e65f-ea75-489a-9183-e0a9f290345e-config-data\") pod \"heat-api-54947664d6-2bv6x\" (UID: \"fdf4e65f-ea75-489a-9183-e0a9f290345e\") " 
pod="openstack/heat-api-54947664d6-2bv6x" Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.641877 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdf4e65f-ea75-489a-9183-e0a9f290345e-combined-ca-bundle\") pod \"heat-api-54947664d6-2bv6x\" (UID: \"fdf4e65f-ea75-489a-9183-e0a9f290345e\") " pod="openstack/heat-api-54947664d6-2bv6x" Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.641984 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b47d3039-22e1-42c8-b23f-9c5f6dcb51f6-config-data\") pod \"heat-engine-79dbc4cd68-ltht2\" (UID: \"b47d3039-22e1-42c8-b23f-9c5f6dcb51f6\") " pod="openstack/heat-engine-79dbc4cd68-ltht2" Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.642135 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjwsq\" (UniqueName: \"kubernetes.io/projected/fdf4e65f-ea75-489a-9183-e0a9f290345e-kube-api-access-vjwsq\") pod \"heat-api-54947664d6-2bv6x\" (UID: \"fdf4e65f-ea75-489a-9183-e0a9f290345e\") " pod="openstack/heat-api-54947664d6-2bv6x" Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.649100 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-54947664d6-2bv6x"] Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.669522 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-5d675845dc-qtbq4"] Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.671142 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5d675845dc-qtbq4" Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.745864 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5d675845dc-qtbq4"] Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.746671 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d86c5f78-c1ef-4b91-b731-3b85c4fab45a-combined-ca-bundle\") pod \"heat-cfnapi-5d675845dc-qtbq4\" (UID: \"d86c5f78-c1ef-4b91-b731-3b85c4fab45a\") " pod="openstack/heat-cfnapi-5d675845dc-qtbq4" Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.746734 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d86c5f78-c1ef-4b91-b731-3b85c4fab45a-config-data-custom\") pod \"heat-cfnapi-5d675845dc-qtbq4\" (UID: \"d86c5f78-c1ef-4b91-b731-3b85c4fab45a\") " pod="openstack/heat-cfnapi-5d675845dc-qtbq4" Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.746774 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d86c5f78-c1ef-4b91-b731-3b85c4fab45a-config-data\") pod \"heat-cfnapi-5d675845dc-qtbq4\" (UID: \"d86c5f78-c1ef-4b91-b731-3b85c4fab45a\") " pod="openstack/heat-cfnapi-5d675845dc-qtbq4" Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.746829 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdf4e65f-ea75-489a-9183-e0a9f290345e-config-data\") pod \"heat-api-54947664d6-2bv6x\" (UID: \"fdf4e65f-ea75-489a-9183-e0a9f290345e\") " pod="openstack/heat-api-54947664d6-2bv6x" Feb 02 10:57:09 crc 
kubenswrapper[4901]: I0202 10:57:09.746910 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdf4e65f-ea75-489a-9183-e0a9f290345e-combined-ca-bundle\") pod \"heat-api-54947664d6-2bv6x\" (UID: \"fdf4e65f-ea75-489a-9183-e0a9f290345e\") " pod="openstack/heat-api-54947664d6-2bv6x" Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.746952 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl8tl\" (UniqueName: \"kubernetes.io/projected/d86c5f78-c1ef-4b91-b731-3b85c4fab45a-kube-api-access-jl8tl\") pod \"heat-cfnapi-5d675845dc-qtbq4\" (UID: \"d86c5f78-c1ef-4b91-b731-3b85c4fab45a\") " pod="openstack/heat-cfnapi-5d675845dc-qtbq4" Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.747303 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b47d3039-22e1-42c8-b23f-9c5f6dcb51f6-config-data\") pod \"heat-engine-79dbc4cd68-ltht2\" (UID: \"b47d3039-22e1-42c8-b23f-9c5f6dcb51f6\") " pod="openstack/heat-engine-79dbc4cd68-ltht2" Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.749672 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjwsq\" (UniqueName: \"kubernetes.io/projected/fdf4e65f-ea75-489a-9183-e0a9f290345e-kube-api-access-vjwsq\") pod \"heat-api-54947664d6-2bv6x\" (UID: \"fdf4e65f-ea75-489a-9183-e0a9f290345e\") " pod="openstack/heat-api-54947664d6-2bv6x" Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.752464 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krwh4\" (UniqueName: \"kubernetes.io/projected/b47d3039-22e1-42c8-b23f-9c5f6dcb51f6-kube-api-access-krwh4\") pod \"heat-engine-79dbc4cd68-ltht2\" (UID: \"b47d3039-22e1-42c8-b23f-9c5f6dcb51f6\") " pod="openstack/heat-engine-79dbc4cd68-ltht2" Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.752554 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fdf4e65f-ea75-489a-9183-e0a9f290345e-config-data-custom\") pod \"heat-api-54947664d6-2bv6x\" (UID: \"fdf4e65f-ea75-489a-9183-e0a9f290345e\") " pod="openstack/heat-api-54947664d6-2bv6x" Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.752853 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b47d3039-22e1-42c8-b23f-9c5f6dcb51f6-config-data-custom\") pod \"heat-engine-79dbc4cd68-ltht2\" (UID: \"b47d3039-22e1-42c8-b23f-9c5f6dcb51f6\") " pod="openstack/heat-engine-79dbc4cd68-ltht2" Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.752933 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b47d3039-22e1-42c8-b23f-9c5f6dcb51f6-combined-ca-bundle\") pod \"heat-engine-79dbc4cd68-ltht2\" (UID: \"b47d3039-22e1-42c8-b23f-9c5f6dcb51f6\") " pod="openstack/heat-engine-79dbc4cd68-ltht2" Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.757090 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdf4e65f-ea75-489a-9183-e0a9f290345e-config-data\") pod \"heat-api-54947664d6-2bv6x\" (UID: \"fdf4e65f-ea75-489a-9183-e0a9f290345e\") " pod="openstack/heat-api-54947664d6-2bv6x" Feb 02 10:57:09 crc kubenswrapper[4901]: 
I0202 10:57:09.771935 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b47d3039-22e1-42c8-b23f-9c5f6dcb51f6-config-data\") pod \"heat-engine-79dbc4cd68-ltht2\" (UID: \"b47d3039-22e1-42c8-b23f-9c5f6dcb51f6\") " pod="openstack/heat-engine-79dbc4cd68-ltht2" Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.773458 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b47d3039-22e1-42c8-b23f-9c5f6dcb51f6-combined-ca-bundle\") pod \"heat-engine-79dbc4cd68-ltht2\" (UID: \"b47d3039-22e1-42c8-b23f-9c5f6dcb51f6\") " pod="openstack/heat-engine-79dbc4cd68-ltht2" Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.775550 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fdf4e65f-ea75-489a-9183-e0a9f290345e-config-data-custom\") pod \"heat-api-54947664d6-2bv6x\" (UID: \"fdf4e65f-ea75-489a-9183-e0a9f290345e\") " pod="openstack/heat-api-54947664d6-2bv6x" Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.777381 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b47d3039-22e1-42c8-b23f-9c5f6dcb51f6-config-data-custom\") pod \"heat-engine-79dbc4cd68-ltht2\" (UID: \"b47d3039-22e1-42c8-b23f-9c5f6dcb51f6\") " pod="openstack/heat-engine-79dbc4cd68-ltht2" Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.777509 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdf4e65f-ea75-489a-9183-e0a9f290345e-combined-ca-bundle\") pod \"heat-api-54947664d6-2bv6x\" (UID: \"fdf4e65f-ea75-489a-9183-e0a9f290345e\") " pod="openstack/heat-api-54947664d6-2bv6x" Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.780511 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krwh4\" (UniqueName: \"kubernetes.io/projected/b47d3039-22e1-42c8-b23f-9c5f6dcb51f6-kube-api-access-krwh4\") pod \"heat-engine-79dbc4cd68-ltht2\" (UID: \"b47d3039-22e1-42c8-b23f-9c5f6dcb51f6\") " pod="openstack/heat-engine-79dbc4cd68-ltht2" Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.787519 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjwsq\" (UniqueName: \"kubernetes.io/projected/fdf4e65f-ea75-489a-9183-e0a9f290345e-kube-api-access-vjwsq\") pod \"heat-api-54947664d6-2bv6x\" (UID: \"fdf4e65f-ea75-489a-9183-e0a9f290345e\") " pod="openstack/heat-api-54947664d6-2bv6x" Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.855521 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d86c5f78-c1ef-4b91-b731-3b85c4fab45a-combined-ca-bundle\") pod \"heat-cfnapi-5d675845dc-qtbq4\" (UID: \"d86c5f78-c1ef-4b91-b731-3b85c4fab45a\") " pod="openstack/heat-cfnapi-5d675845dc-qtbq4" Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.855589 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d86c5f78-c1ef-4b91-b731-3b85c4fab45a-config-data-custom\") pod \"heat-cfnapi-5d675845dc-qtbq4\" (UID: \"d86c5f78-c1ef-4b91-b731-3b85c4fab45a\") " pod="openstack/heat-cfnapi-5d675845dc-qtbq4" Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.855632 4901 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d86c5f78-c1ef-4b91-b731-3b85c4fab45a-config-data\") pod \"heat-cfnapi-5d675845dc-qtbq4\" (UID: \"d86c5f78-c1ef-4b91-b731-3b85c4fab45a\") " pod="openstack/heat-cfnapi-5d675845dc-qtbq4" Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.855662 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jl8tl\" (UniqueName: \"kubernetes.io/projected/d86c5f78-c1ef-4b91-b731-3b85c4fab45a-kube-api-access-jl8tl\") pod \"heat-cfnapi-5d675845dc-qtbq4\" (UID: \"d86c5f78-c1ef-4b91-b731-3b85c4fab45a\") " pod="openstack/heat-cfnapi-5d675845dc-qtbq4" Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.860752 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d86c5f78-c1ef-4b91-b731-3b85c4fab45a-config-data\") pod \"heat-cfnapi-5d675845dc-qtbq4\" (UID: \"d86c5f78-c1ef-4b91-b731-3b85c4fab45a\") " pod="openstack/heat-cfnapi-5d675845dc-qtbq4" Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.862796 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d86c5f78-c1ef-4b91-b731-3b85c4fab45a-combined-ca-bundle\") pod \"heat-cfnapi-5d675845dc-qtbq4\" (UID: \"d86c5f78-c1ef-4b91-b731-3b85c4fab45a\") " pod="openstack/heat-cfnapi-5d675845dc-qtbq4" Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.868339 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d86c5f78-c1ef-4b91-b731-3b85c4fab45a-config-data-custom\") pod \"heat-cfnapi-5d675845dc-qtbq4\" (UID: \"d86c5f78-c1ef-4b91-b731-3b85c4fab45a\") " pod="openstack/heat-cfnapi-5d675845dc-qtbq4" Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.879691 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl8tl\" (UniqueName: \"kubernetes.io/projected/d86c5f78-c1ef-4b91-b731-3b85c4fab45a-kube-api-access-jl8tl\") pod \"heat-cfnapi-5d675845dc-qtbq4\" (UID: \"d86c5f78-c1ef-4b91-b731-3b85c4fab45a\") " pod="openstack/heat-cfnapi-5d675845dc-qtbq4" Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.900634 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-79dbc4cd68-ltht2" Feb 02 10:57:09 crc kubenswrapper[4901]: I0202 10:57:09.937319 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-54947664d6-2bv6x" Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.031915 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-5d675845dc-qtbq4" Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.188090 4901 generic.go:334] "Generic (PLEG): container finished" podID="e2c6c553-9669-4fc0-a72b-9a528764e7a8" containerID="e776cbe14f8d58e99e75781ca4a73a8e901109010ae260a63f523ec58fe89672" exitCode=143 Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.188157 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e2c6c553-9669-4fc0-a72b-9a528764e7a8","Type":"ContainerDied","Data":"e776cbe14f8d58e99e75781ca4a73a8e901109010ae260a63f523ec58fe89672"} Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.497256 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-9ddr2"] Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.499611 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9ddr2" Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.530428 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.530722 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="bcb9c059-9f39-4478-9b04-9417b05f4bef" containerName="glance-log" containerID="cri-o://f4e44261b0cf9b669b7896d5e2708bb47ca0f96ebd2f61c10f9eebe8dc79943a" gracePeriod=30 Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.530810 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="bcb9c059-9f39-4478-9b04-9417b05f4bef" containerName="glance-httpd" containerID="cri-o://033e9456d7db7660d77f59316f11777e03f34259ff92a9d12a1750b1d52321c7" gracePeriod=30 Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.545253 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-9ddr2"] Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.567999 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfqkv\" (UniqueName: \"kubernetes.io/projected/80f838bb-a2da-496c-9338-50c97222d215-kube-api-access-rfqkv\") pod \"nova-api-db-create-9ddr2\" (UID: \"80f838bb-a2da-496c-9338-50c97222d215\") " pod="openstack/nova-api-db-create-9ddr2" Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.568225 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80f838bb-a2da-496c-9338-50c97222d215-operator-scripts\") pod \"nova-api-db-create-9ddr2\" (UID: \"80f838bb-a2da-496c-9338-50c97222d215\") " pod="openstack/nova-api-db-create-9ddr2" Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.670173 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80f838bb-a2da-496c-9338-50c97222d215-operator-scripts\") pod \"nova-api-db-create-9ddr2\" (UID: \"80f838bb-a2da-496c-9338-50c97222d215\") " pod="openstack/nova-api-db-create-9ddr2" Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.670518 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfqkv\" (UniqueName: \"kubernetes.io/projected/80f838bb-a2da-496c-9338-50c97222d215-kube-api-access-rfqkv\") pod \"nova-api-db-create-9ddr2\" (UID: 
\"80f838bb-a2da-496c-9338-50c97222d215\") " pod="openstack/nova-api-db-create-9ddr2" Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.671461 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80f838bb-a2da-496c-9338-50c97222d215-operator-scripts\") pod \"nova-api-db-create-9ddr2\" (UID: \"80f838bb-a2da-496c-9338-50c97222d215\") " pod="openstack/nova-api-db-create-9ddr2" Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.675174 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-tgmtn"] Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.676548 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-tgmtn" Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.697049 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-6237-account-create-update-fdqwt"] Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.698861 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6237-account-create-update-fdqwt" Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.702133 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.725041 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfqkv\" (UniqueName: \"kubernetes.io/projected/80f838bb-a2da-496c-9338-50c97222d215-kube-api-access-rfqkv\") pod \"nova-api-db-create-9ddr2\" (UID: \"80f838bb-a2da-496c-9338-50c97222d215\") " pod="openstack/nova-api-db-create-9ddr2" Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.726634 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-tgmtn"] Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.770404 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-6237-account-create-update-fdqwt"] Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.771745 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87wfb\" (UniqueName: \"kubernetes.io/projected/96853bfa-70bb-4ce9-9900-618896081d6d-kube-api-access-87wfb\") pod \"nova-cell0-db-create-tgmtn\" (UID: \"96853bfa-70bb-4ce9-9900-618896081d6d\") " pod="openstack/nova-cell0-db-create-tgmtn" Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.771842 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24d314f1-b9e6-4e62-a599-533cf470eea2-operator-scripts\") pod \"nova-api-6237-account-create-update-fdqwt\" (UID: \"24d314f1-b9e6-4e62-a599-533cf470eea2\") " pod="openstack/nova-api-6237-account-create-update-fdqwt" Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.771887 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x6cc\" (UniqueName: \"kubernetes.io/projected/24d314f1-b9e6-4e62-a599-533cf470eea2-kube-api-access-5x6cc\") pod \"nova-api-6237-account-create-update-fdqwt\" (UID: \"24d314f1-b9e6-4e62-a599-533cf470eea2\") " pod="openstack/nova-api-6237-account-create-update-fdqwt" Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.771999 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/96853bfa-70bb-4ce9-9900-618896081d6d-operator-scripts\") pod \"nova-cell0-db-create-tgmtn\" (UID: \"96853bfa-70bb-4ce9-9900-618896081d6d\") " pod="openstack/nova-cell0-db-create-tgmtn" Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.832145 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9ddr2" Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.849474 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-sbl76"] Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.850729 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-sbl76" Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.874341 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lnsx\" (UniqueName: \"kubernetes.io/projected/7b7c9bef-3927-44b7-928a-eb9584ec90ed-kube-api-access-6lnsx\") pod \"nova-cell1-db-create-sbl76\" (UID: \"7b7c9bef-3927-44b7-928a-eb9584ec90ed\") " pod="openstack/nova-cell1-db-create-sbl76" Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.874434 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96853bfa-70bb-4ce9-9900-618896081d6d-operator-scripts\") pod \"nova-cell0-db-create-tgmtn\" (UID: \"96853bfa-70bb-4ce9-9900-618896081d6d\") " pod="openstack/nova-cell0-db-create-tgmtn" Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.874475 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87wfb\" (UniqueName: \"kubernetes.io/projected/96853bfa-70bb-4ce9-9900-618896081d6d-kube-api-access-87wfb\") pod \"nova-cell0-db-create-tgmtn\" (UID: \"96853bfa-70bb-4ce9-9900-618896081d6d\") " pod="openstack/nova-cell0-db-create-tgmtn" Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.874540 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b7c9bef-3927-44b7-928a-eb9584ec90ed-operator-scripts\") pod \"nova-cell1-db-create-sbl76\" (UID: \"7b7c9bef-3927-44b7-928a-eb9584ec90ed\") " pod="openstack/nova-cell1-db-create-sbl76" Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.874589 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24d314f1-b9e6-4e62-a599-533cf470eea2-operator-scripts\") pod \"nova-api-6237-account-create-update-fdqwt\" (UID: \"24d314f1-b9e6-4e62-a599-533cf470eea2\") " pod="openstack/nova-api-6237-account-create-update-fdqwt" Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.874624 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x6cc\" (UniqueName: \"kubernetes.io/projected/24d314f1-b9e6-4e62-a599-533cf470eea2-kube-api-access-5x6cc\") pod \"nova-api-6237-account-create-update-fdqwt\" (UID: \"24d314f1-b9e6-4e62-a599-533cf470eea2\") " pod="openstack/nova-api-6237-account-create-update-fdqwt" Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.875766 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96853bfa-70bb-4ce9-9900-618896081d6d-operator-scripts\") pod \"nova-cell0-db-create-tgmtn\" (UID: \"96853bfa-70bb-4ce9-9900-618896081d6d\") " 
pod="openstack/nova-cell0-db-create-tgmtn" Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.875840 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24d314f1-b9e6-4e62-a599-533cf470eea2-operator-scripts\") pod \"nova-api-6237-account-create-update-fdqwt\" (UID: \"24d314f1-b9e6-4e62-a599-533cf470eea2\") " pod="openstack/nova-api-6237-account-create-update-fdqwt" Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.888221 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-sbl76"] Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.895987 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87wfb\" (UniqueName: \"kubernetes.io/projected/96853bfa-70bb-4ce9-9900-618896081d6d-kube-api-access-87wfb\") pod \"nova-cell0-db-create-tgmtn\" (UID: \"96853bfa-70bb-4ce9-9900-618896081d6d\") " pod="openstack/nova-cell0-db-create-tgmtn" Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.899192 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x6cc\" (UniqueName: \"kubernetes.io/projected/24d314f1-b9e6-4e62-a599-533cf470eea2-kube-api-access-5x6cc\") pod \"nova-api-6237-account-create-update-fdqwt\" (UID: \"24d314f1-b9e6-4e62-a599-533cf470eea2\") " pod="openstack/nova-api-6237-account-create-update-fdqwt" Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.961219 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-2351-account-create-update-q5sln"] Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.964016 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2351-account-create-update-q5sln" Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.968079 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.976750 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr7vn\" (UniqueName: \"kubernetes.io/projected/d859d976-7f57-49ae-88d5-b6f6d641f470-kube-api-access-fr7vn\") pod \"nova-cell0-2351-account-create-update-q5sln\" (UID: \"d859d976-7f57-49ae-88d5-b6f6d641f470\") " pod="openstack/nova-cell0-2351-account-create-update-q5sln" Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.977112 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lnsx\" (UniqueName: \"kubernetes.io/projected/7b7c9bef-3927-44b7-928a-eb9584ec90ed-kube-api-access-6lnsx\") pod \"nova-cell1-db-create-sbl76\" (UID: \"7b7c9bef-3927-44b7-928a-eb9584ec90ed\") " pod="openstack/nova-cell1-db-create-sbl76" Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.977402 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d859d976-7f57-49ae-88d5-b6f6d641f470-operator-scripts\") pod \"nova-cell0-2351-account-create-update-q5sln\" (UID: \"d859d976-7f57-49ae-88d5-b6f6d641f470\") " pod="openstack/nova-cell0-2351-account-create-update-q5sln" Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.977466 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b7c9bef-3927-44b7-928a-eb9584ec90ed-operator-scripts\") pod 
\"nova-cell1-db-create-sbl76\" (UID: \"7b7c9bef-3927-44b7-928a-eb9584ec90ed\") " pod="openstack/nova-cell1-db-create-sbl76" Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.978545 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b7c9bef-3927-44b7-928a-eb9584ec90ed-operator-scripts\") pod \"nova-cell1-db-create-sbl76\" (UID: \"7b7c9bef-3927-44b7-928a-eb9584ec90ed\") " pod="openstack/nova-cell1-db-create-sbl76" Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.980757 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-2351-account-create-update-q5sln"] Feb 02 10:57:10 crc kubenswrapper[4901]: I0202 10:57:10.997988 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-tgmtn" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.011049 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lnsx\" (UniqueName: \"kubernetes.io/projected/7b7c9bef-3927-44b7-928a-eb9584ec90ed-kube-api-access-6lnsx\") pod \"nova-cell1-db-create-sbl76\" (UID: \"7b7c9bef-3927-44b7-928a-eb9584ec90ed\") " pod="openstack/nova-cell1-db-create-sbl76" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.065140 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-680b-account-create-update-fxm6v"] Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.066587 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6237-account-create-update-fdqwt" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.067452 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-680b-account-create-update-fxm6v" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.070318 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.083240 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-680b-account-create-update-fxm6v"] Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.087219 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr7vn\" (UniqueName: \"kubernetes.io/projected/d859d976-7f57-49ae-88d5-b6f6d641f470-kube-api-access-fr7vn\") pod \"nova-cell0-2351-account-create-update-q5sln\" (UID: \"d859d976-7f57-49ae-88d5-b6f6d641f470\") " pod="openstack/nova-cell0-2351-account-create-update-q5sln" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.087310 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1ac84d8-3f94-4dc8-ae48-61ab0cbd238e-operator-scripts\") pod \"nova-cell1-680b-account-create-update-fxm6v\" (UID: \"a1ac84d8-3f94-4dc8-ae48-61ab0cbd238e\") " pod="openstack/nova-cell1-680b-account-create-update-fxm6v" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.087423 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsm8g\" (UniqueName: \"kubernetes.io/projected/a1ac84d8-3f94-4dc8-ae48-61ab0cbd238e-kube-api-access-bsm8g\") pod \"nova-cell1-680b-account-create-update-fxm6v\" (UID: \"a1ac84d8-3f94-4dc8-ae48-61ab0cbd238e\") " pod="openstack/nova-cell1-680b-account-create-update-fxm6v" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 
10:57:11.087820 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d859d976-7f57-49ae-88d5-b6f6d641f470-operator-scripts\") pod \"nova-cell0-2351-account-create-update-q5sln\" (UID: \"d859d976-7f57-49ae-88d5-b6f6d641f470\") " pod="openstack/nova-cell0-2351-account-create-update-q5sln" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.098275 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d859d976-7f57-49ae-88d5-b6f6d641f470-operator-scripts\") pod \"nova-cell0-2351-account-create-update-q5sln\" (UID: \"d859d976-7f57-49ae-88d5-b6f6d641f470\") " pod="openstack/nova-cell0-2351-account-create-update-q5sln" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.117546 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr7vn\" (UniqueName: \"kubernetes.io/projected/d859d976-7f57-49ae-88d5-b6f6d641f470-kube-api-access-fr7vn\") pod \"nova-cell0-2351-account-create-update-q5sln\" (UID: \"d859d976-7f57-49ae-88d5-b6f6d641f470\") " pod="openstack/nova-cell0-2351-account-create-update-q5sln" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.170331 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-sbl76" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.199959 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1ac84d8-3f94-4dc8-ae48-61ab0cbd238e-operator-scripts\") pod \"nova-cell1-680b-account-create-update-fxm6v\" (UID: \"a1ac84d8-3f94-4dc8-ae48-61ab0cbd238e\") " pod="openstack/nova-cell1-680b-account-create-update-fxm6v" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.200485 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsm8g\" (UniqueName: \"kubernetes.io/projected/a1ac84d8-3f94-4dc8-ae48-61ab0cbd238e-kube-api-access-bsm8g\") pod \"nova-cell1-680b-account-create-update-fxm6v\" (UID: \"a1ac84d8-3f94-4dc8-ae48-61ab0cbd238e\") " pod="openstack/nova-cell1-680b-account-create-update-fxm6v" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.202827 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1ac84d8-3f94-4dc8-ae48-61ab0cbd238e-operator-scripts\") pod \"nova-cell1-680b-account-create-update-fxm6v\" (UID: \"a1ac84d8-3f94-4dc8-ae48-61ab0cbd238e\") " pod="openstack/nova-cell1-680b-account-create-update-fxm6v" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.216645 4901 generic.go:334] "Generic (PLEG): container finished" podID="bcb9c059-9f39-4478-9b04-9417b05f4bef" containerID="f4e44261b0cf9b669b7896d5e2708bb47ca0f96ebd2f61c10f9eebe8dc79943a" exitCode=143 Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.216956 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bcb9c059-9f39-4478-9b04-9417b05f4bef","Type":"ContainerDied","Data":"f4e44261b0cf9b669b7896d5e2708bb47ca0f96ebd2f61c10f9eebe8dc79943a"} Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.253012 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsm8g\" (UniqueName: \"kubernetes.io/projected/a1ac84d8-3f94-4dc8-ae48-61ab0cbd238e-kube-api-access-bsm8g\") pod \"nova-cell1-680b-account-create-update-fxm6v\" 
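[Note: the glance container above finished with exitCode=143. By the common 128+N convention used by shells and container runtimes, an exit code above 128 usually means the process died from signal N = code-128, so 143 corresponds to SIGTERM (15), a requested shutdown rather than a crash; the exitCode=0 events nearby are clean exits. A small illustrative Go helper, not tied to any kubelet API:]

```go
package main

import "fmt"

// signalFromExitCode applies the common 128+N convention: an exit code
// above 128 typically means the process was killed by signal code-128.
func signalFromExitCode(code int) (sig int, killed bool) {
	if code > 128 {
		return code - 128, true
	}
	return 0, false
}

func main() {
	for _, code := range []int{143, 137, 0} {
		if sig, killed := signalFromExitCode(code); killed {
			fmt.Printf("exitCode=%d -> killed by signal %d\n", code, sig)
		} else {
			fmt.Printf("exitCode=%d -> normal exit\n", code)
		}
	}
	// Prints: 143 -> signal 15 (SIGTERM), 137 -> signal 9 (SIGKILL), 0 -> normal.
}
```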
(UID: \"a1ac84d8-3f94-4dc8-ae48-61ab0cbd238e\") " pod="openstack/nova-cell1-680b-account-create-update-fxm6v" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.378629 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2351-account-create-update-q5sln" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.388067 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-680b-account-create-update-fxm6v" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.404959 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5cf5f57d85-xv6g6"] Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.423546 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-6548466b85-x76qz"] Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.425204 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6548466b85-x76qz" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.428354 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.428666 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.441112 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-55bf54c5c5-ttz4h"] Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.461506 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6548466b85-x76qz"] Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.494727 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-7dcd8c5f77-l2jdd"] Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.496729 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-7dcd8c5f77-l2jdd" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.499592 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.499787 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.516690 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5acf34a1-ce29-4301-bd5e-e6792dff572d-combined-ca-bundle\") pod \"heat-api-6548466b85-x76qz\" (UID: \"5acf34a1-ce29-4301-bd5e-e6792dff572d\") " pod="openstack/heat-api-6548466b85-x76qz" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.516758 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5acf34a1-ce29-4301-bd5e-e6792dff572d-config-data\") pod \"heat-api-6548466b85-x76qz\" (UID: \"5acf34a1-ce29-4301-bd5e-e6792dff572d\") " pod="openstack/heat-api-6548466b85-x76qz" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.516790 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5acf34a1-ce29-4301-bd5e-e6792dff572d-config-data-custom\") pod \"heat-api-6548466b85-x76qz\" (UID: \"5acf34a1-ce29-4301-bd5e-e6792dff572d\") " pod="openstack/heat-api-6548466b85-x76qz" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.516843 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5acf34a1-ce29-4301-bd5e-e6792dff572d-internal-tls-certs\") pod \"heat-api-6548466b85-x76qz\" (UID: \"5acf34a1-ce29-4301-bd5e-e6792dff572d\") " pod="openstack/heat-api-6548466b85-x76qz" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.517003 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgkv4\" (UniqueName: \"kubernetes.io/projected/5acf34a1-ce29-4301-bd5e-e6792dff572d-kube-api-access-hgkv4\") pod \"heat-api-6548466b85-x76qz\" (UID: \"5acf34a1-ce29-4301-bd5e-e6792dff572d\") " pod="openstack/heat-api-6548466b85-x76qz" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.517047 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5acf34a1-ce29-4301-bd5e-e6792dff572d-public-tls-certs\") pod \"heat-api-6548466b85-x76qz\" (UID: \"5acf34a1-ce29-4301-bd5e-e6792dff572d\") " pod="openstack/heat-api-6548466b85-x76qz" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.524688 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7dcd8c5f77-l2jdd"] Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.619484 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5acf34a1-ce29-4301-bd5e-e6792dff572d-public-tls-certs\") pod \"heat-api-6548466b85-x76qz\" (UID: \"5acf34a1-ce29-4301-bd5e-e6792dff572d\") " pod="openstack/heat-api-6548466b85-x76qz" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.619576 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed964b8c-458f-4b9f-8363-55627008bc75-internal-tls-certs\") pod \"heat-cfnapi-7dcd8c5f77-l2jdd\" (UID: \"ed964b8c-458f-4b9f-8363-55627008bc75\") " pod="openstack/heat-cfnapi-7dcd8c5f77-l2jdd" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.619606 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed964b8c-458f-4b9f-8363-55627008bc75-config-data\") pod \"heat-cfnapi-7dcd8c5f77-l2jdd\" (UID: \"ed964b8c-458f-4b9f-8363-55627008bc75\") " pod="openstack/heat-cfnapi-7dcd8c5f77-l2jdd" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.619636 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed964b8c-458f-4b9f-8363-55627008bc75-combined-ca-bundle\") pod \"heat-cfnapi-7dcd8c5f77-l2jdd\" (UID: \"ed964b8c-458f-4b9f-8363-55627008bc75\") " pod="openstack/heat-cfnapi-7dcd8c5f77-l2jdd" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.619666 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed964b8c-458f-4b9f-8363-55627008bc75-public-tls-certs\") pod \"heat-cfnapi-7dcd8c5f77-l2jdd\" (UID: \"ed964b8c-458f-4b9f-8363-55627008bc75\") " pod="openstack/heat-cfnapi-7dcd8c5f77-l2jdd" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.619714 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncrs7\" (UniqueName: \"kubernetes.io/projected/ed964b8c-458f-4b9f-8363-55627008bc75-kube-api-access-ncrs7\") pod \"heat-cfnapi-7dcd8c5f77-l2jdd\" (UID: \"ed964b8c-458f-4b9f-8363-55627008bc75\") " pod="openstack/heat-cfnapi-7dcd8c5f77-l2jdd" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.619745 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5acf34a1-ce29-4301-bd5e-e6792dff572d-combined-ca-bundle\") pod \"heat-api-6548466b85-x76qz\" (UID: \"5acf34a1-ce29-4301-bd5e-e6792dff572d\") " pod="openstack/heat-api-6548466b85-x76qz" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.619771 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5acf34a1-ce29-4301-bd5e-e6792dff572d-config-data\") pod \"heat-api-6548466b85-x76qz\" (UID: \"5acf34a1-ce29-4301-bd5e-e6792dff572d\") " pod="openstack/heat-api-6548466b85-x76qz" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.619790 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5acf34a1-ce29-4301-bd5e-e6792dff572d-config-data-custom\") pod \"heat-api-6548466b85-x76qz\" (UID: \"5acf34a1-ce29-4301-bd5e-e6792dff572d\") " pod="openstack/heat-api-6548466b85-x76qz" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.619813 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5acf34a1-ce29-4301-bd5e-e6792dff572d-internal-tls-certs\") pod \"heat-api-6548466b85-x76qz\" (UID: \"5acf34a1-ce29-4301-bd5e-e6792dff572d\") " pod="openstack/heat-api-6548466b85-x76qz" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.619864 4901 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed964b8c-458f-4b9f-8363-55627008bc75-config-data-custom\") pod \"heat-cfnapi-7dcd8c5f77-l2jdd\" (UID: \"ed964b8c-458f-4b9f-8363-55627008bc75\") " pod="openstack/heat-cfnapi-7dcd8c5f77-l2jdd" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.619916 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgkv4\" (UniqueName: \"kubernetes.io/projected/5acf34a1-ce29-4301-bd5e-e6792dff572d-kube-api-access-hgkv4\") pod \"heat-api-6548466b85-x76qz\" (UID: \"5acf34a1-ce29-4301-bd5e-e6792dff572d\") " pod="openstack/heat-api-6548466b85-x76qz" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.625251 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5acf34a1-ce29-4301-bd5e-e6792dff572d-combined-ca-bundle\") pod \"heat-api-6548466b85-x76qz\" (UID: \"5acf34a1-ce29-4301-bd5e-e6792dff572d\") " pod="openstack/heat-api-6548466b85-x76qz" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.625750 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5acf34a1-ce29-4301-bd5e-e6792dff572d-public-tls-certs\") pod \"heat-api-6548466b85-x76qz\" (UID: \"5acf34a1-ce29-4301-bd5e-e6792dff572d\") " pod="openstack/heat-api-6548466b85-x76qz" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.626887 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5acf34a1-ce29-4301-bd5e-e6792dff572d-config-data-custom\") pod \"heat-api-6548466b85-x76qz\" (UID: \"5acf34a1-ce29-4301-bd5e-e6792dff572d\") " pod="openstack/heat-api-6548466b85-x76qz" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.627309 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5acf34a1-ce29-4301-bd5e-e6792dff572d-internal-tls-certs\") pod \"heat-api-6548466b85-x76qz\" (UID: \"5acf34a1-ce29-4301-bd5e-e6792dff572d\") " pod="openstack/heat-api-6548466b85-x76qz" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.633005 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5acf34a1-ce29-4301-bd5e-e6792dff572d-config-data\") pod \"heat-api-6548466b85-x76qz\" (UID: \"5acf34a1-ce29-4301-bd5e-e6792dff572d\") " pod="openstack/heat-api-6548466b85-x76qz" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.638938 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgkv4\" (UniqueName: \"kubernetes.io/projected/5acf34a1-ce29-4301-bd5e-e6792dff572d-kube-api-access-hgkv4\") pod \"heat-api-6548466b85-x76qz\" (UID: \"5acf34a1-ce29-4301-bd5e-e6792dff572d\") " pod="openstack/heat-api-6548466b85-x76qz" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.723391 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed964b8c-458f-4b9f-8363-55627008bc75-internal-tls-certs\") pod \"heat-cfnapi-7dcd8c5f77-l2jdd\" (UID: \"ed964b8c-458f-4b9f-8363-55627008bc75\") " pod="openstack/heat-cfnapi-7dcd8c5f77-l2jdd" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.723461 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ed964b8c-458f-4b9f-8363-55627008bc75-config-data\") pod \"heat-cfnapi-7dcd8c5f77-l2jdd\" (UID: \"ed964b8c-458f-4b9f-8363-55627008bc75\") " pod="openstack/heat-cfnapi-7dcd8c5f77-l2jdd" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.723514 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed964b8c-458f-4b9f-8363-55627008bc75-combined-ca-bundle\") pod \"heat-cfnapi-7dcd8c5f77-l2jdd\" (UID: \"ed964b8c-458f-4b9f-8363-55627008bc75\") " pod="openstack/heat-cfnapi-7dcd8c5f77-l2jdd" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.723581 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed964b8c-458f-4b9f-8363-55627008bc75-public-tls-certs\") pod \"heat-cfnapi-7dcd8c5f77-l2jdd\" (UID: \"ed964b8c-458f-4b9f-8363-55627008bc75\") " pod="openstack/heat-cfnapi-7dcd8c5f77-l2jdd" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.723646 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncrs7\" (UniqueName: \"kubernetes.io/projected/ed964b8c-458f-4b9f-8363-55627008bc75-kube-api-access-ncrs7\") pod \"heat-cfnapi-7dcd8c5f77-l2jdd\" (UID: \"ed964b8c-458f-4b9f-8363-55627008bc75\") " pod="openstack/heat-cfnapi-7dcd8c5f77-l2jdd" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.723778 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed964b8c-458f-4b9f-8363-55627008bc75-config-data-custom\") pod \"heat-cfnapi-7dcd8c5f77-l2jdd\" (UID: \"ed964b8c-458f-4b9f-8363-55627008bc75\") " pod="openstack/heat-cfnapi-7dcd8c5f77-l2jdd" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.728722 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed964b8c-458f-4b9f-8363-55627008bc75-config-data-custom\") pod \"heat-cfnapi-7dcd8c5f77-l2jdd\" (UID: \"ed964b8c-458f-4b9f-8363-55627008bc75\") " pod="openstack/heat-cfnapi-7dcd8c5f77-l2jdd" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.731124 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed964b8c-458f-4b9f-8363-55627008bc75-combined-ca-bundle\") pod \"heat-cfnapi-7dcd8c5f77-l2jdd\" (UID: \"ed964b8c-458f-4b9f-8363-55627008bc75\") " pod="openstack/heat-cfnapi-7dcd8c5f77-l2jdd" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.731794 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed964b8c-458f-4b9f-8363-55627008bc75-public-tls-certs\") pod \"heat-cfnapi-7dcd8c5f77-l2jdd\" (UID: \"ed964b8c-458f-4b9f-8363-55627008bc75\") " pod="openstack/heat-cfnapi-7dcd8c5f77-l2jdd" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.734410 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed964b8c-458f-4b9f-8363-55627008bc75-config-data\") pod \"heat-cfnapi-7dcd8c5f77-l2jdd\" (UID: \"ed964b8c-458f-4b9f-8363-55627008bc75\") " pod="openstack/heat-cfnapi-7dcd8c5f77-l2jdd" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.734892 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ed964b8c-458f-4b9f-8363-55627008bc75-internal-tls-certs\") pod \"heat-cfnapi-7dcd8c5f77-l2jdd\" (UID: \"ed964b8c-458f-4b9f-8363-55627008bc75\") " pod="openstack/heat-cfnapi-7dcd8c5f77-l2jdd" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.749112 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncrs7\" (UniqueName: \"kubernetes.io/projected/ed964b8c-458f-4b9f-8363-55627008bc75-kube-api-access-ncrs7\") pod \"heat-cfnapi-7dcd8c5f77-l2jdd\" (UID: \"ed964b8c-458f-4b9f-8363-55627008bc75\") " pod="openstack/heat-cfnapi-7dcd8c5f77-l2jdd" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.753214 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6548466b85-x76qz" Feb 02 10:57:11 crc kubenswrapper[4901]: I0202 10:57:11.818343 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-7dcd8c5f77-l2jdd" Feb 02 10:57:13 crc kubenswrapper[4901]: I0202 10:57:13.245595 4901 generic.go:334] "Generic (PLEG): container finished" podID="e2c6c553-9669-4fc0-a72b-9a528764e7a8" containerID="f43ac399bb39abd5da0adec25c3d3c907dd420250f8571799f416809bbc2ad9d" exitCode=0 Feb 02 10:57:13 crc kubenswrapper[4901]: I0202 10:57:13.245647 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e2c6c553-9669-4fc0-a72b-9a528764e7a8","Type":"ContainerDied","Data":"f43ac399bb39abd5da0adec25c3d3c907dd420250f8571799f416809bbc2ad9d"} Feb 02 10:57:13 crc kubenswrapper[4901]: I0202 10:57:13.323777 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-688b9f5b49-6kjl6" Feb 02 10:57:13 crc kubenswrapper[4901]: I0202 10:57:13.381674 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-pt7c2"] Feb 02 10:57:13 crc kubenswrapper[4901]: I0202 10:57:13.382252 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6578955fd5-pt7c2" podUID="5e62ea67-c139-4682-8563-441d8b7aeae6" containerName="dnsmasq-dns" containerID="cri-o://2bfe02d678cf7cf16745542b57b43a79e00948d372eece84ee5ee46ad2cb9eba" gracePeriod=10 Feb 02 10:57:14 crc kubenswrapper[4901]: I0202 10:57:14.145702 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="1ed6baff-0886-4acb-b063-5807ec75d169" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.167:3000/\": dial tcp 10.217.0.167:3000: connect: connection refused" Feb 02 10:57:14 crc kubenswrapper[4901]: I0202 10:57:14.260245 4901 generic.go:334] "Generic (PLEG): container finished" podID="bcb9c059-9f39-4478-9b04-9417b05f4bef" containerID="033e9456d7db7660d77f59316f11777e03f34259ff92a9d12a1750b1d52321c7" exitCode=0 Feb 02 10:57:14 crc kubenswrapper[4901]: I0202 10:57:14.260360 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bcb9c059-9f39-4478-9b04-9417b05f4bef","Type":"ContainerDied","Data":"033e9456d7db7660d77f59316f11777e03f34259ff92a9d12a1750b1d52321c7"} Feb 02 10:57:14 crc kubenswrapper[4901]: I0202 10:57:14.262418 4901 generic.go:334] "Generic (PLEG): container finished" podID="5e62ea67-c139-4682-8563-441d8b7aeae6" containerID="2bfe02d678cf7cf16745542b57b43a79e00948d372eece84ee5ee46ad2cb9eba" exitCode=0 Feb 02 10:57:14 crc kubenswrapper[4901]: I0202 10:57:14.262505 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-6578955fd5-pt7c2" event={"ID":"5e62ea67-c139-4682-8563-441d8b7aeae6","Type":"ContainerDied","Data":"2bfe02d678cf7cf16745542b57b43a79e00948d372eece84ee5ee46ad2cb9eba"} Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.100852 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-pt7c2" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.228254 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e62ea67-c139-4682-8563-441d8b7aeae6-config\") pod \"5e62ea67-c139-4682-8563-441d8b7aeae6\" (UID: \"5e62ea67-c139-4682-8563-441d8b7aeae6\") " Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.228775 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpmh4\" (UniqueName: \"kubernetes.io/projected/5e62ea67-c139-4682-8563-441d8b7aeae6-kube-api-access-lpmh4\") pod \"5e62ea67-c139-4682-8563-441d8b7aeae6\" (UID: \"5e62ea67-c139-4682-8563-441d8b7aeae6\") " Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.228819 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e62ea67-c139-4682-8563-441d8b7aeae6-ovsdbserver-sb\") pod \"5e62ea67-c139-4682-8563-441d8b7aeae6\" (UID: \"5e62ea67-c139-4682-8563-441d8b7aeae6\") " Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.229021 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5e62ea67-c139-4682-8563-441d8b7aeae6-dns-swift-storage-0\") pod \"5e62ea67-c139-4682-8563-441d8b7aeae6\" (UID: \"5e62ea67-c139-4682-8563-441d8b7aeae6\") " Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.229050 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e62ea67-c139-4682-8563-441d8b7aeae6-ovsdbserver-nb\") pod \"5e62ea67-c139-4682-8563-441d8b7aeae6\" (UID: \"5e62ea67-c139-4682-8563-441d8b7aeae6\") " Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.229072 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e62ea67-c139-4682-8563-441d8b7aeae6-dns-svc\") pod \"5e62ea67-c139-4682-8563-441d8b7aeae6\" (UID: \"5e62ea67-c139-4682-8563-441d8b7aeae6\") " Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.330110 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e62ea67-c139-4682-8563-441d8b7aeae6-kube-api-access-lpmh4" (OuterVolumeSpecName: "kube-api-access-lpmh4") pod "5e62ea67-c139-4682-8563-441d8b7aeae6" (UID: "5e62ea67-c139-4682-8563-441d8b7aeae6"). InnerVolumeSpecName "kube-api-access-lpmh4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.330882 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpmh4\" (UniqueName: \"kubernetes.io/projected/5e62ea67-c139-4682-8563-441d8b7aeae6-kube-api-access-lpmh4\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.352223 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e62ea67-c139-4682-8563-441d8b7aeae6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5e62ea67-c139-4682-8563-441d8b7aeae6" (UID: "5e62ea67-c139-4682-8563-441d8b7aeae6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.363656 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-pt7c2" event={"ID":"5e62ea67-c139-4682-8563-441d8b7aeae6","Type":"ContainerDied","Data":"de170f5b032031e776d168887f1e1aca9cebb587a962e7c88596ea7f12ba23b6"} Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.363714 4901 scope.go:117] "RemoveContainer" containerID="2bfe02d678cf7cf16745542b57b43a79e00948d372eece84ee5ee46ad2cb9eba" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.363884 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-pt7c2" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.377986 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ed6baff-0886-4acb-b063-5807ec75d169","Type":"ContainerDied","Data":"4375ab0577a3d8d217d6433ead090f3707ba287695e865de12354c4ea198567f"} Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.378023 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4375ab0577a3d8d217d6433ead090f3707ba287695e865de12354c4ea198567f" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.379851 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.432252 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ed6baff-0886-4acb-b063-5807ec75d169-run-httpd\") pod \"1ed6baff-0886-4acb-b063-5807ec75d169\" (UID: \"1ed6baff-0886-4acb-b063-5807ec75d169\") " Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.432322 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ed6baff-0886-4acb-b063-5807ec75d169-sg-core-conf-yaml\") pod \"1ed6baff-0886-4acb-b063-5807ec75d169\" (UID: \"1ed6baff-0886-4acb-b063-5807ec75d169\") " Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.432389 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ed6baff-0886-4acb-b063-5807ec75d169-scripts\") pod \"1ed6baff-0886-4acb-b063-5807ec75d169\" (UID: \"1ed6baff-0886-4acb-b063-5807ec75d169\") " Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.432485 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ed6baff-0886-4acb-b063-5807ec75d169-config-data\") pod \"1ed6baff-0886-4acb-b063-5807ec75d169\" (UID: \"1ed6baff-0886-4acb-b063-5807ec75d169\") " Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.432576 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ed6baff-0886-4acb-b063-5807ec75d169-log-httpd\") pod \"1ed6baff-0886-4acb-b063-5807ec75d169\" (UID: \"1ed6baff-0886-4acb-b063-5807ec75d169\") " Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.432701 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ed6baff-0886-4acb-b063-5807ec75d169-combined-ca-bundle\") pod \"1ed6baff-0886-4acb-b063-5807ec75d169\" (UID: \"1ed6baff-0886-4acb-b063-5807ec75d169\") " Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.432738 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6c58\" (UniqueName: \"kubernetes.io/projected/1ed6baff-0886-4acb-b063-5807ec75d169-kube-api-access-b6c58\") pod \"1ed6baff-0886-4acb-b063-5807ec75d169\" (UID: \"1ed6baff-0886-4acb-b063-5807ec75d169\") " Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.433641 4901 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e62ea67-c139-4682-8563-441d8b7aeae6-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.438822 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ed6baff-0886-4acb-b063-5807ec75d169-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1ed6baff-0886-4acb-b063-5807ec75d169" (UID: "1ed6baff-0886-4acb-b063-5807ec75d169"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.441248 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ed6baff-0886-4acb-b063-5807ec75d169-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1ed6baff-0886-4acb-b063-5807ec75d169" (UID: "1ed6baff-0886-4acb-b063-5807ec75d169"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.454774 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ed6baff-0886-4acb-b063-5807ec75d169-kube-api-access-b6c58" (OuterVolumeSpecName: "kube-api-access-b6c58") pod "1ed6baff-0886-4acb-b063-5807ec75d169" (UID: "1ed6baff-0886-4acb-b063-5807ec75d169"). InnerVolumeSpecName "kube-api-access-b6c58". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.467511 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ed6baff-0886-4acb-b063-5807ec75d169-scripts" (OuterVolumeSpecName: "scripts") pod "1ed6baff-0886-4acb-b063-5807ec75d169" (UID: "1ed6baff-0886-4acb-b063-5807ec75d169"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.492936 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e62ea67-c139-4682-8563-441d8b7aeae6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5e62ea67-c139-4682-8563-441d8b7aeae6" (UID: "5e62ea67-c139-4682-8563-441d8b7aeae6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.520924 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e62ea67-c139-4682-8563-441d8b7aeae6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5e62ea67-c139-4682-8563-441d8b7aeae6" (UID: "5e62ea67-c139-4682-8563-441d8b7aeae6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.524363 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e62ea67-c139-4682-8563-441d8b7aeae6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5e62ea67-c139-4682-8563-441d8b7aeae6" (UID: "5e62ea67-c139-4682-8563-441d8b7aeae6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.532818 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e62ea67-c139-4682-8563-441d8b7aeae6-config" (OuterVolumeSpecName: "config") pod "5e62ea67-c139-4682-8563-441d8b7aeae6" (UID: "5e62ea67-c139-4682-8563-441d8b7aeae6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.539890 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6c58\" (UniqueName: \"kubernetes.io/projected/1ed6baff-0886-4acb-b063-5807ec75d169-kube-api-access-b6c58\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.539922 4901 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ed6baff-0886-4acb-b063-5807ec75d169-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.539935 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ed6baff-0886-4acb-b063-5807ec75d169-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.539944 4901 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5e62ea67-c139-4682-8563-441d8b7aeae6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.539954 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e62ea67-c139-4682-8563-441d8b7aeae6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.539964 4901 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ed6baff-0886-4acb-b063-5807ec75d169-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.539972 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e62ea67-c139-4682-8563-441d8b7aeae6-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.539982 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e62ea67-c139-4682-8563-441d8b7aeae6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.583610 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ed6baff-0886-4acb-b063-5807ec75d169-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1ed6baff-0886-4acb-b063-5807ec75d169" (UID: "1ed6baff-0886-4acb-b063-5807ec75d169"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.648422 4901 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ed6baff-0886-4acb-b063-5807ec75d169-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.729697 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ed6baff-0886-4acb-b063-5807ec75d169-config-data" (OuterVolumeSpecName: "config-data") pod "1ed6baff-0886-4acb-b063-5807ec75d169" (UID: "1ed6baff-0886-4acb-b063-5807ec75d169"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.764506 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ed6baff-0886-4acb-b063-5807ec75d169-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.765395 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ed6baff-0886-4acb-b063-5807ec75d169-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ed6baff-0886-4acb-b063-5807ec75d169" (UID: "1ed6baff-0886-4acb-b063-5807ec75d169"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.776369 4901 scope.go:117] "RemoveContainer" containerID="ae1c08757c419f8ea11e1ae886b36f800e4b8399f50119859178549866040f24" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.795266 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.826180 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-pt7c2"] Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.856630 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-pt7c2"] Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.866289 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcb9c059-9f39-4478-9b04-9417b05f4bef-logs\") pod \"bcb9c059-9f39-4478-9b04-9417b05f4bef\" (UID: \"bcb9c059-9f39-4478-9b04-9417b05f4bef\") " Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.867501 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"bcb9c059-9f39-4478-9b04-9417b05f4bef\" (UID: \"bcb9c059-9f39-4478-9b04-9417b05f4bef\") " Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.867547 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bcb9c059-9f39-4478-9b04-9417b05f4bef-httpd-run\") pod \"bcb9c059-9f39-4478-9b04-9417b05f4bef\" (UID: \"bcb9c059-9f39-4478-9b04-9417b05f4bef\") " Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.867875 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcb9c059-9f39-4478-9b04-9417b05f4bef-config-data\") pod \"bcb9c059-9f39-4478-9b04-9417b05f4bef\" (UID: \"bcb9c059-9f39-4478-9b04-9417b05f4bef\") " Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.867975 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcb9c059-9f39-4478-9b04-9417b05f4bef-combined-ca-bundle\") pod \"bcb9c059-9f39-4478-9b04-9417b05f4bef\" (UID: \"bcb9c059-9f39-4478-9b04-9417b05f4bef\") " Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.868023 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcb9c059-9f39-4478-9b04-9417b05f4bef-internal-tls-certs\") pod \"bcb9c059-9f39-4478-9b04-9417b05f4bef\" (UID: \"bcb9c059-9f39-4478-9b04-9417b05f4bef\") " Feb 02 10:57:15 crc 
kubenswrapper[4901]: I0202 10:57:15.868070 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcb9c059-9f39-4478-9b04-9417b05f4bef-scripts\") pod \"bcb9c059-9f39-4478-9b04-9417b05f4bef\" (UID: \"bcb9c059-9f39-4478-9b04-9417b05f4bef\") " Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.868088 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cgrd\" (UniqueName: \"kubernetes.io/projected/bcb9c059-9f39-4478-9b04-9417b05f4bef-kube-api-access-6cgrd\") pod \"bcb9c059-9f39-4478-9b04-9417b05f4bef\" (UID: \"bcb9c059-9f39-4478-9b04-9417b05f4bef\") " Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.868667 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ed6baff-0886-4acb-b063-5807ec75d169-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.866956 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcb9c059-9f39-4478-9b04-9417b05f4bef-logs" (OuterVolumeSpecName: "logs") pod "bcb9c059-9f39-4478-9b04-9417b05f4bef" (UID: "bcb9c059-9f39-4478-9b04-9417b05f4bef"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.869523 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcb9c059-9f39-4478-9b04-9417b05f4bef-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "bcb9c059-9f39-4478-9b04-9417b05f4bef" (UID: "bcb9c059-9f39-4478-9b04-9417b05f4bef"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.878448 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcb9c059-9f39-4478-9b04-9417b05f4bef-kube-api-access-6cgrd" (OuterVolumeSpecName: "kube-api-access-6cgrd") pod "bcb9c059-9f39-4478-9b04-9417b05f4bef" (UID: "bcb9c059-9f39-4478-9b04-9417b05f4bef"). InnerVolumeSpecName "kube-api-access-6cgrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.887636 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "bcb9c059-9f39-4478-9b04-9417b05f4bef" (UID: "bcb9c059-9f39-4478-9b04-9417b05f4bef"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.892479 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcb9c059-9f39-4478-9b04-9417b05f4bef-scripts" (OuterVolumeSpecName: "scripts") pod "bcb9c059-9f39-4478-9b04-9417b05f4bef" (UID: "bcb9c059-9f39-4478-9b04-9417b05f4bef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.896678 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.936872 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcb9c059-9f39-4478-9b04-9417b05f4bef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bcb9c059-9f39-4478-9b04-9417b05f4bef" (UID: "bcb9c059-9f39-4478-9b04-9417b05f4bef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.970424 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2c6c553-9669-4fc0-a72b-9a528764e7a8-scripts\") pod \"e2c6c553-9669-4fc0-a72b-9a528764e7a8\" (UID: \"e2c6c553-9669-4fc0-a72b-9a528764e7a8\") " Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.970548 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfq4v\" (UniqueName: \"kubernetes.io/projected/e2c6c553-9669-4fc0-a72b-9a528764e7a8-kube-api-access-dfq4v\") pod \"e2c6c553-9669-4fc0-a72b-9a528764e7a8\" (UID: \"e2c6c553-9669-4fc0-a72b-9a528764e7a8\") " Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.970732 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2c6c553-9669-4fc0-a72b-9a528764e7a8-logs\") pod \"e2c6c553-9669-4fc0-a72b-9a528764e7a8\" (UID: \"e2c6c553-9669-4fc0-a72b-9a528764e7a8\") " Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.970782 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2c6c553-9669-4fc0-a72b-9a528764e7a8-config-data\") pod \"e2c6c553-9669-4fc0-a72b-9a528764e7a8\" (UID: \"e2c6c553-9669-4fc0-a72b-9a528764e7a8\") " Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.970827 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e2c6c553-9669-4fc0-a72b-9a528764e7a8-httpd-run\") pod \"e2c6c553-9669-4fc0-a72b-9a528764e7a8\" (UID: \"e2c6c553-9669-4fc0-a72b-9a528764e7a8\") " Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.970872 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2c6c553-9669-4fc0-a72b-9a528764e7a8-public-tls-certs\") pod \"e2c6c553-9669-4fc0-a72b-9a528764e7a8\" (UID: \"e2c6c553-9669-4fc0-a72b-9a528764e7a8\") " Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.970896 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"e2c6c553-9669-4fc0-a72b-9a528764e7a8\" (UID: \"e2c6c553-9669-4fc0-a72b-9a528764e7a8\") " Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.970934 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c6c553-9669-4fc0-a72b-9a528764e7a8-combined-ca-bundle\") pod \"e2c6c553-9669-4fc0-a72b-9a528764e7a8\" (UID: \"e2c6c553-9669-4fc0-a72b-9a528764e7a8\") " Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.971361 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcb9c059-9f39-4478-9b04-9417b05f4bef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 
02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.971375 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcb9c059-9f39-4478-9b04-9417b05f4bef-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.971385 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cgrd\" (UniqueName: \"kubernetes.io/projected/bcb9c059-9f39-4478-9b04-9417b05f4bef-kube-api-access-6cgrd\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.971396 4901 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcb9c059-9f39-4478-9b04-9417b05f4bef-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.971417 4901 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.971427 4901 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bcb9c059-9f39-4478-9b04-9417b05f4bef-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.977636 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2c6c553-9669-4fc0-a72b-9a528764e7a8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e2c6c553-9669-4fc0-a72b-9a528764e7a8" (UID: "e2c6c553-9669-4fc0-a72b-9a528764e7a8"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.979976 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2c6c553-9669-4fc0-a72b-9a528764e7a8-logs" (OuterVolumeSpecName: "logs") pod "e2c6c553-9669-4fc0-a72b-9a528764e7a8" (UID: "e2c6c553-9669-4fc0-a72b-9a528764e7a8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:57:15 crc kubenswrapper[4901]: I0202 10:57:15.992179 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcb9c059-9f39-4478-9b04-9417b05f4bef-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "bcb9c059-9f39-4478-9b04-9417b05f4bef" (UID: "bcb9c059-9f39-4478-9b04-9417b05f4bef"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.001618 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "e2c6c553-9669-4fc0-a72b-9a528764e7a8" (UID: "e2c6c553-9669-4fc0-a72b-9a528764e7a8"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.001658 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2c6c553-9669-4fc0-a72b-9a528764e7a8-scripts" (OuterVolumeSpecName: "scripts") pod "e2c6c553-9669-4fc0-a72b-9a528764e7a8" (UID: "e2c6c553-9669-4fc0-a72b-9a528764e7a8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.007145 4901 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.018010 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2c6c553-9669-4fc0-a72b-9a528764e7a8-kube-api-access-dfq4v" (OuterVolumeSpecName: "kube-api-access-dfq4v") pod "e2c6c553-9669-4fc0-a72b-9a528764e7a8" (UID: "e2c6c553-9669-4fc0-a72b-9a528764e7a8"). InnerVolumeSpecName "kube-api-access-dfq4v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.043462 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2c6c553-9669-4fc0-a72b-9a528764e7a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2c6c553-9669-4fc0-a72b-9a528764e7a8" (UID: "e2c6c553-9669-4fc0-a72b-9a528764e7a8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.052770 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcb9c059-9f39-4478-9b04-9417b05f4bef-config-data" (OuterVolumeSpecName: "config-data") pod "bcb9c059-9f39-4478-9b04-9417b05f4bef" (UID: "bcb9c059-9f39-4478-9b04-9417b05f4bef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.071145 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2c6c553-9669-4fc0-a72b-9a528764e7a8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e2c6c553-9669-4fc0-a72b-9a528764e7a8" (UID: "e2c6c553-9669-4fc0-a72b-9a528764e7a8"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.073172 4901 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcb9c059-9f39-4478-9b04-9417b05f4bef-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.073232 4901 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2c6c553-9669-4fc0-a72b-9a528764e7a8-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.073245 4901 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e2c6c553-9669-4fc0-a72b-9a528764e7a8-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.073257 4901 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.073266 4901 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2c6c553-9669-4fc0-a72b-9a528764e7a8-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.073296 4901 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.073307 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c6c553-9669-4fc0-a72b-9a528764e7a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.073317 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2c6c553-9669-4fc0-a72b-9a528764e7a8-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.073328 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfq4v\" (UniqueName: \"kubernetes.io/projected/e2c6c553-9669-4fc0-a72b-9a528764e7a8-kube-api-access-dfq4v\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.073342 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcb9c059-9f39-4478-9b04-9417b05f4bef-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.077172 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2c6c553-9669-4fc0-a72b-9a528764e7a8-config-data" (OuterVolumeSpecName: "config-data") pod "e2c6c553-9669-4fc0-a72b-9a528764e7a8" (UID: "e2c6c553-9669-4fc0-a72b-9a528764e7a8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.097258 4901 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.177480 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2c6c553-9669-4fc0-a72b-9a528764e7a8-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.177524 4901 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.228082 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-6237-account-create-update-fdqwt"] Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.329457 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-96f76df57-tdlmx"] Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.425004 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5cf5f57d85-xv6g6" event={"ID":"169d32ef-3d4c-4f19-861a-afbc638d72df","Type":"ContainerStarted","Data":"ae7f807d1b63b3eedbe8e20fffe0122a2213dc96dbc663b8ffbc58a4d4ad4f2e"} Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.425580 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-5cf5f57d85-xv6g6" podUID="169d32ef-3d4c-4f19-861a-afbc638d72df" containerName="heat-api" containerID="cri-o://ae7f807d1b63b3eedbe8e20fffe0122a2213dc96dbc663b8ffbc58a4d4ad4f2e" gracePeriod=60 Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.426025 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-5cf5f57d85-xv6g6" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.439861 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6237-account-create-update-fdqwt" event={"ID":"24d314f1-b9e6-4e62-a599-533cf470eea2","Type":"ContainerStarted","Data":"9ad02b862d3dbb0bc1a115819634cf969a04266cdc0966595d8186ba85588656"} Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.441610 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-96f76df57-tdlmx" event={"ID":"4d2b7428-ac02-4aed-8a90-30cc198e4cca","Type":"ContainerStarted","Data":"94ee848fa15661c7f080b0338ef8c5f1737c32ee477e71db001dc9e385e5bedd"} Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.478299 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bcb9c059-9f39-4478-9b04-9417b05f4bef","Type":"ContainerDied","Data":"c2a2b6b73485e9f6de28645972572f5e5d598453016450f03f4fad0a8a13eee5"} Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.478380 4901 scope.go:117] "RemoveContainer" containerID="033e9456d7db7660d77f59316f11777e03f34259ff92a9d12a1750b1d52321c7" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.478545 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.481158 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-5cf5f57d85-xv6g6" podStartSLOduration=4.026280029 podStartE2EDuration="14.481142811s" podCreationTimestamp="2026-02-02 10:57:02 +0000 UTC" firstStartedPulling="2026-02-02 10:57:04.125925037 +0000 UTC m=+1111.144265133" lastFinishedPulling="2026-02-02 10:57:14.580787819 +0000 UTC m=+1121.599127915" observedRunningTime="2026-02-02 10:57:16.447275445 +0000 UTC m=+1123.465615541" watchObservedRunningTime="2026-02-02 10:57:16.481142811 +0000 UTC m=+1123.499482907" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.495297 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"707985af-5416-42c1-9fbf-866955d8d1c4","Type":"ContainerStarted","Data":"d89cb421225e91a6f0382185c78f3723f4c99cb8051531176e1ef9318a5f949b"} Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.516584 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.682115253 podStartE2EDuration="18.516556517s" podCreationTimestamp="2026-02-02 10:56:58 +0000 UTC" firstStartedPulling="2026-02-02 10:56:59.849995805 +0000 UTC m=+1106.868335901" lastFinishedPulling="2026-02-02 10:57:14.684437069 +0000 UTC m=+1121.702777165" observedRunningTime="2026-02-02 10:57:16.512231369 +0000 UTC m=+1123.530571485" watchObservedRunningTime="2026-02-02 10:57:16.516556517 +0000 UTC m=+1123.534896603" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.527160 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e2c6c553-9669-4fc0-a72b-9a528764e7a8","Type":"ContainerDied","Data":"2f04a886f4dddb757f789beba032ee796d8413044da9768068034f3e83f7a2bb"} Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.527177 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.529774 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.533236 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-55bf54c5c5-ttz4h" event={"ID":"c747aeb2-de75-49ae-b04e-c2e1cd27b77d","Type":"ContainerStarted","Data":"86e46be019fa316f9871af43167766ec6e02b7f1313dfdfc1afcf183f85daa66"} Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.533388 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-55bf54c5c5-ttz4h" podUID="c747aeb2-de75-49ae-b04e-c2e1cd27b77d" containerName="heat-cfnapi" containerID="cri-o://86e46be019fa316f9871af43167766ec6e02b7f1313dfdfc1afcf183f85daa66" gracePeriod=60 Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.533687 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-55bf54c5c5-ttz4h" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.554152 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-55bf54c5c5-ttz4h" podStartSLOduration=4.209659681 podStartE2EDuration="14.554134605s" podCreationTimestamp="2026-02-02 10:57:02 +0000 UTC" firstStartedPulling="2026-02-02 10:57:04.1196503 +0000 UTC m=+1111.137990396" lastFinishedPulling="2026-02-02 10:57:14.464125224 +0000 UTC m=+1121.482465320" observedRunningTime="2026-02-02 10:57:16.552239858 +0000 UTC m=+1123.570579974" watchObservedRunningTime="2026-02-02 10:57:16.554134605 +0000 UTC m=+1123.572474701" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.610370 4901 scope.go:117] "RemoveContainer" containerID="f4e44261b0cf9b669b7896d5e2708bb47ca0f96ebd2f61c10f9eebe8dc79943a" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.637387 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.707524 4901 scope.go:117] "RemoveContainer" containerID="f43ac399bb39abd5da0adec25c3d3c907dd420250f8571799f416809bbc2ad9d" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.727348 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.754222 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:57:16 crc kubenswrapper[4901]: E0202 10:57:16.755301 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e62ea67-c139-4682-8563-441d8b7aeae6" containerName="dnsmasq-dns" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.762733 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e62ea67-c139-4682-8563-441d8b7aeae6" containerName="dnsmasq-dns" Feb 02 10:57:16 crc kubenswrapper[4901]: E0202 10:57:16.762775 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2c6c553-9669-4fc0-a72b-9a528764e7a8" containerName="glance-httpd" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.762783 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2c6c553-9669-4fc0-a72b-9a528764e7a8" containerName="glance-httpd" Feb 02 10:57:16 crc kubenswrapper[4901]: E0202 10:57:16.762796 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcb9c059-9f39-4478-9b04-9417b05f4bef" containerName="glance-httpd" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.762802 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcb9c059-9f39-4478-9b04-9417b05f4bef" containerName="glance-httpd" Feb 02 10:57:16 crc kubenswrapper[4901]: E0202 
10:57:16.762817 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ed6baff-0886-4acb-b063-5807ec75d169" containerName="sg-core" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.762822 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ed6baff-0886-4acb-b063-5807ec75d169" containerName="sg-core" Feb 02 10:57:16 crc kubenswrapper[4901]: E0202 10:57:16.762835 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcb9c059-9f39-4478-9b04-9417b05f4bef" containerName="glance-log" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.762840 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcb9c059-9f39-4478-9b04-9417b05f4bef" containerName="glance-log" Feb 02 10:57:16 crc kubenswrapper[4901]: E0202 10:57:16.762850 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ed6baff-0886-4acb-b063-5807ec75d169" containerName="ceilometer-notification-agent" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.762860 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ed6baff-0886-4acb-b063-5807ec75d169" containerName="ceilometer-notification-agent" Feb 02 10:57:16 crc kubenswrapper[4901]: E0202 10:57:16.762880 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ed6baff-0886-4acb-b063-5807ec75d169" containerName="ceilometer-central-agent" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.762886 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ed6baff-0886-4acb-b063-5807ec75d169" containerName="ceilometer-central-agent" Feb 02 10:57:16 crc kubenswrapper[4901]: E0202 10:57:16.762894 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ed6baff-0886-4acb-b063-5807ec75d169" containerName="proxy-httpd" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.762900 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ed6baff-0886-4acb-b063-5807ec75d169" containerName="proxy-httpd" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.762905 4901 scope.go:117] "RemoveContainer" containerID="e776cbe14f8d58e99e75781ca4a73a8e901109010ae260a63f523ec58fe89672" Feb 02 10:57:16 crc kubenswrapper[4901]: E0202 10:57:16.762939 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2c6c553-9669-4fc0-a72b-9a528764e7a8" containerName="glance-log" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.762983 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2c6c553-9669-4fc0-a72b-9a528764e7a8" containerName="glance-log" Feb 02 10:57:16 crc kubenswrapper[4901]: E0202 10:57:16.762995 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e62ea67-c139-4682-8563-441d8b7aeae6" containerName="init" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.763001 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e62ea67-c139-4682-8563-441d8b7aeae6" containerName="init" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.763349 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2c6c553-9669-4fc0-a72b-9a528764e7a8" containerName="glance-log" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.763361 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcb9c059-9f39-4478-9b04-9417b05f4bef" containerName="glance-log" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.763370 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcb9c059-9f39-4478-9b04-9417b05f4bef" containerName="glance-httpd" Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.763385 4901 
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.763393 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ed6baff-0886-4acb-b063-5807ec75d169" containerName="ceilometer-central-agent"
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.763413 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ed6baff-0886-4acb-b063-5807ec75d169" containerName="proxy-httpd"
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.763426 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ed6baff-0886-4acb-b063-5807ec75d169" containerName="ceilometer-notification-agent"
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.763438 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2c6c553-9669-4fc0-a72b-9a528764e7a8" containerName="glance-httpd"
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.763445 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ed6baff-0886-4acb-b063-5807ec75d169" containerName="sg-core"
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.766131 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.768822 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.770063 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.770117 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.805640 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.806961 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9f5e41c-41a8-48c7-becf-3c621090fec2-config-data\") pod \"ceilometer-0\" (UID: \"d9f5e41c-41a8-48c7-becf-3c621090fec2\") " pod="openstack/ceilometer-0"
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.807033 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9f5e41c-41a8-48c7-becf-3c621090fec2-run-httpd\") pod \"ceilometer-0\" (UID: \"d9f5e41c-41a8-48c7-becf-3c621090fec2\") " pod="openstack/ceilometer-0"
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.807079 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5k7x\" (UniqueName: \"kubernetes.io/projected/d9f5e41c-41a8-48c7-becf-3c621090fec2-kube-api-access-f5k7x\") pod \"ceilometer-0\" (UID: \"d9f5e41c-41a8-48c7-becf-3c621090fec2\") " pod="openstack/ceilometer-0"
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.807106 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9f5e41c-41a8-48c7-becf-3c621090fec2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d9f5e41c-41a8-48c7-becf-3c621090fec2\") " pod="openstack/ceilometer-0"
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.807126 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9f5e41c-41a8-48c7-becf-3c621090fec2-log-httpd\") pod \"ceilometer-0\" (UID: \"d9f5e41c-41a8-48c7-becf-3c621090fec2\") " pod="openstack/ceilometer-0"
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.807170 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9f5e41c-41a8-48c7-becf-3c621090fec2-scripts\") pod \"ceilometer-0\" (UID: \"d9f5e41c-41a8-48c7-becf-3c621090fec2\") " pod="openstack/ceilometer-0"
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.807218 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d9f5e41c-41a8-48c7-becf-3c621090fec2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d9f5e41c-41a8-48c7-becf-3c621090fec2\") " pod="openstack/ceilometer-0"
Feb 02 10:57:16 crc kubenswrapper[4901]: W0202 10:57:16.818739 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5acf34a1_ce29_4301_bd5e_e6792dff572d.slice/crio-3a7fa64c0e49bbac1413b61170e4a047a249ac795a7782b9ae17a1c0bdb200c9 WatchSource:0}: Error finding container 3a7fa64c0e49bbac1413b61170e4a047a249ac795a7782b9ae17a1c0bdb200c9: Status 404 returned error can't find the container with id 3a7fa64c0e49bbac1413b61170e4a047a249ac795a7782b9ae17a1c0bdb200c9
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.826712 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.872776 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.875189 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.881784 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.883090 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.883522 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.883550 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-dqjmw"
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.884061 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.886067 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7d89c9dff9-fzvln"
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.894394 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.909477 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d9f5e41c-41a8-48c7-becf-3c621090fec2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d9f5e41c-41a8-48c7-becf-3c621090fec2\") " pod="openstack/ceilometer-0"
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.909551 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9f5e41c-41a8-48c7-becf-3c621090fec2-config-data\") pod \"ceilometer-0\" (UID: \"d9f5e41c-41a8-48c7-becf-3c621090fec2\") " pod="openstack/ceilometer-0"
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.909612 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9f5e41c-41a8-48c7-becf-3c621090fec2-run-httpd\") pod \"ceilometer-0\" (UID: \"d9f5e41c-41a8-48c7-becf-3c621090fec2\") " pod="openstack/ceilometer-0"
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.909658 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5k7x\" (UniqueName: \"kubernetes.io/projected/d9f5e41c-41a8-48c7-becf-3c621090fec2-kube-api-access-f5k7x\") pod \"ceilometer-0\" (UID: \"d9f5e41c-41a8-48c7-becf-3c621090fec2\") " pod="openstack/ceilometer-0"
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.909692 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9f5e41c-41a8-48c7-becf-3c621090fec2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d9f5e41c-41a8-48c7-becf-3c621090fec2\") " pod="openstack/ceilometer-0"
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.909713 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9f5e41c-41a8-48c7-becf-3c621090fec2-log-httpd\") pod \"ceilometer-0\" (UID: \"d9f5e41c-41a8-48c7-becf-3c621090fec2\") " pod="openstack/ceilometer-0"
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.909752 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9f5e41c-41a8-48c7-becf-3c621090fec2-scripts\") pod \"ceilometer-0\" (UID: \"d9f5e41c-41a8-48c7-becf-3c621090fec2\") " pod="openstack/ceilometer-0"
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.912363 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9f5e41c-41a8-48c7-becf-3c621090fec2-run-httpd\") pod \"ceilometer-0\" (UID: \"d9f5e41c-41a8-48c7-becf-3c621090fec2\") " pod="openstack/ceilometer-0"
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.913572 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9f5e41c-41a8-48c7-becf-3c621090fec2-log-httpd\") pod \"ceilometer-0\" (UID: \"d9f5e41c-41a8-48c7-becf-3c621090fec2\") " pod="openstack/ceilometer-0"
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.913898 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.927481 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.929273 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.933367 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9f5e41c-41a8-48c7-becf-3c621090fec2-scripts\") pod \"ceilometer-0\" (UID: \"d9f5e41c-41a8-48c7-becf-3c621090fec2\") " pod="openstack/ceilometer-0"
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.933425 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d9f5e41c-41a8-48c7-becf-3c621090fec2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d9f5e41c-41a8-48c7-becf-3c621090fec2\") " pod="openstack/ceilometer-0"
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.934062 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.934336 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9f5e41c-41a8-48c7-becf-3c621090fec2-config-data\") pod \"ceilometer-0\" (UID: \"d9f5e41c-41a8-48c7-becf-3c621090fec2\") " pod="openstack/ceilometer-0"
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.934720 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.935301 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.948844 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9f5e41c-41a8-48c7-becf-3c621090fec2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d9f5e41c-41a8-48c7-becf-3c621090fec2\") " pod="openstack/ceilometer-0"
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.953030 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6548466b85-x76qz"]
Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.963147 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-79dbc4cd68-ltht2"]
source="api" pods=["openstack/heat-engine-79dbc4cd68-ltht2"] Feb 02 10:57:16 crc kubenswrapper[4901]: I0202 10:57:16.990056 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5k7x\" (UniqueName: \"kubernetes.io/projected/d9f5e41c-41a8-48c7-becf-3c621090fec2-kube-api-access-f5k7x\") pod \"ceilometer-0\" (UID: \"d9f5e41c-41a8-48c7-becf-3c621090fec2\") " pod="openstack/ceilometer-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.011451 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"5731a76d-bd25-4a51-acfc-7dfd031eef35\") " pod="openstack/glance-default-external-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.013874 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5731a76d-bd25-4a51-acfc-7dfd031eef35-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5731a76d-bd25-4a51-acfc-7dfd031eef35\") " pod="openstack/glance-default-external-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.013905 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5731a76d-bd25-4a51-acfc-7dfd031eef35-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5731a76d-bd25-4a51-acfc-7dfd031eef35\") " pod="openstack/glance-default-external-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.013954 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5731a76d-bd25-4a51-acfc-7dfd031eef35-scripts\") pod \"glance-default-external-api-0\" (UID: \"5731a76d-bd25-4a51-acfc-7dfd031eef35\") " pod="openstack/glance-default-external-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.013977 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5731a76d-bd25-4a51-acfc-7dfd031eef35-config-data\") pod \"glance-default-external-api-0\" (UID: \"5731a76d-bd25-4a51-acfc-7dfd031eef35\") " pod="openstack/glance-default-external-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.014047 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5731a76d-bd25-4a51-acfc-7dfd031eef35-logs\") pod \"glance-default-external-api-0\" (UID: \"5731a76d-bd25-4a51-acfc-7dfd031eef35\") " pod="openstack/glance-default-external-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.014067 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5731a76d-bd25-4a51-acfc-7dfd031eef35-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5731a76d-bd25-4a51-acfc-7dfd031eef35\") " pod="openstack/glance-default-external-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.014106 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrrsc\" (UniqueName: \"kubernetes.io/projected/5731a76d-bd25-4a51-acfc-7dfd031eef35-kube-api-access-wrrsc\") pod \"glance-default-external-api-0\" (UID: 
\"5731a76d-bd25-4a51-acfc-7dfd031eef35\") " pod="openstack/glance-default-external-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.116533 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5731a76d-bd25-4a51-acfc-7dfd031eef35-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5731a76d-bd25-4a51-acfc-7dfd031eef35\") " pod="openstack/glance-default-external-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.116630 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5731a76d-bd25-4a51-acfc-7dfd031eef35-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5731a76d-bd25-4a51-acfc-7dfd031eef35\") " pod="openstack/glance-default-external-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.116675 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqkbm\" (UniqueName: \"kubernetes.io/projected/4d303ae2-764c-42f1-afaa-d099c91b5ac4-kube-api-access-bqkbm\") pod \"glance-default-internal-api-0\" (UID: \"4d303ae2-764c-42f1-afaa-d099c91b5ac4\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.116698 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5731a76d-bd25-4a51-acfc-7dfd031eef35-scripts\") pod \"glance-default-external-api-0\" (UID: \"5731a76d-bd25-4a51-acfc-7dfd031eef35\") " pod="openstack/glance-default-external-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.116729 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5731a76d-bd25-4a51-acfc-7dfd031eef35-config-data\") pod \"glance-default-external-api-0\" (UID: \"5731a76d-bd25-4a51-acfc-7dfd031eef35\") " pod="openstack/glance-default-external-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.116792 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d303ae2-764c-42f1-afaa-d099c91b5ac4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4d303ae2-764c-42f1-afaa-d099c91b5ac4\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.116822 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5731a76d-bd25-4a51-acfc-7dfd031eef35-logs\") pod \"glance-default-external-api-0\" (UID: \"5731a76d-bd25-4a51-acfc-7dfd031eef35\") " pod="openstack/glance-default-external-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.116851 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5731a76d-bd25-4a51-acfc-7dfd031eef35-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5731a76d-bd25-4a51-acfc-7dfd031eef35\") " pod="openstack/glance-default-external-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.116870 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d303ae2-764c-42f1-afaa-d099c91b5ac4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: 
\"4d303ae2-764c-42f1-afaa-d099c91b5ac4\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.116906 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrrsc\" (UniqueName: \"kubernetes.io/projected/5731a76d-bd25-4a51-acfc-7dfd031eef35-kube-api-access-wrrsc\") pod \"glance-default-external-api-0\" (UID: \"5731a76d-bd25-4a51-acfc-7dfd031eef35\") " pod="openstack/glance-default-external-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.116954 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"5731a76d-bd25-4a51-acfc-7dfd031eef35\") " pod="openstack/glance-default-external-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.116990 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"4d303ae2-764c-42f1-afaa-d099c91b5ac4\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.126152 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4d303ae2-764c-42f1-afaa-d099c91b5ac4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4d303ae2-764c-42f1-afaa-d099c91b5ac4\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.126209 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d303ae2-764c-42f1-afaa-d099c91b5ac4-logs\") pod \"glance-default-internal-api-0\" (UID: \"4d303ae2-764c-42f1-afaa-d099c91b5ac4\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.126257 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d303ae2-764c-42f1-afaa-d099c91b5ac4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4d303ae2-764c-42f1-afaa-d099c91b5ac4\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.126286 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d303ae2-764c-42f1-afaa-d099c91b5ac4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4d303ae2-764c-42f1-afaa-d099c91b5ac4\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.127274 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5731a76d-bd25-4a51-acfc-7dfd031eef35-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5731a76d-bd25-4a51-acfc-7dfd031eef35\") " pod="openstack/glance-default-external-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.128320 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"5731a76d-bd25-4a51-acfc-7dfd031eef35\") device mount path 
\"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.129131 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5731a76d-bd25-4a51-acfc-7dfd031eef35-logs\") pod \"glance-default-external-api-0\" (UID: \"5731a76d-bd25-4a51-acfc-7dfd031eef35\") " pod="openstack/glance-default-external-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.139282 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5731a76d-bd25-4a51-acfc-7dfd031eef35-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5731a76d-bd25-4a51-acfc-7dfd031eef35\") " pod="openstack/glance-default-external-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.144788 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5731a76d-bd25-4a51-acfc-7dfd031eef35-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5731a76d-bd25-4a51-acfc-7dfd031eef35\") " pod="openstack/glance-default-external-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.146870 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5731a76d-bd25-4a51-acfc-7dfd031eef35-config-data\") pod \"glance-default-external-api-0\" (UID: \"5731a76d-bd25-4a51-acfc-7dfd031eef35\") " pod="openstack/glance-default-external-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.150587 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5731a76d-bd25-4a51-acfc-7dfd031eef35-scripts\") pod \"glance-default-external-api-0\" (UID: \"5731a76d-bd25-4a51-acfc-7dfd031eef35\") " pod="openstack/glance-default-external-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.154902 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.162006 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrrsc\" (UniqueName: \"kubernetes.io/projected/5731a76d-bd25-4a51-acfc-7dfd031eef35-kube-api-access-wrrsc\") pod \"glance-default-external-api-0\" (UID: \"5731a76d-bd25-4a51-acfc-7dfd031eef35\") " pod="openstack/glance-default-external-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.214515 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7f784b5584-t7x4s"] Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.214771 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7f784b5584-t7x4s" podUID="df6764f5-9f09-4b5a-bc5f-7d212c713ae8" containerName="neutron-api" containerID="cri-o://299510ae2a67239f6106a2c52c9143358ea95db027b4490f1c10b74d9ffaa9c6" gracePeriod=30 Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.215168 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7f784b5584-t7x4s" podUID="df6764f5-9f09-4b5a-bc5f-7d212c713ae8" containerName="neutron-httpd" containerID="cri-o://2c8f2fdc80d2f7dcfdbc3cb555e2560c637c39277353ad397d669d7633a69a74" gracePeriod=30 Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.234228 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4d303ae2-764c-42f1-afaa-d099c91b5ac4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4d303ae2-764c-42f1-afaa-d099c91b5ac4\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.234271 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d303ae2-764c-42f1-afaa-d099c91b5ac4-logs\") pod \"glance-default-internal-api-0\" (UID: \"4d303ae2-764c-42f1-afaa-d099c91b5ac4\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.234296 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d303ae2-764c-42f1-afaa-d099c91b5ac4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4d303ae2-764c-42f1-afaa-d099c91b5ac4\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.234315 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d303ae2-764c-42f1-afaa-d099c91b5ac4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4d303ae2-764c-42f1-afaa-d099c91b5ac4\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.234380 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqkbm\" (UniqueName: \"kubernetes.io/projected/4d303ae2-764c-42f1-afaa-d099c91b5ac4-kube-api-access-bqkbm\") pod \"glance-default-internal-api-0\" (UID: \"4d303ae2-764c-42f1-afaa-d099c91b5ac4\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.234452 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d303ae2-764c-42f1-afaa-d099c91b5ac4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: 
\"4d303ae2-764c-42f1-afaa-d099c91b5ac4\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.234481 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d303ae2-764c-42f1-afaa-d099c91b5ac4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4d303ae2-764c-42f1-afaa-d099c91b5ac4\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.234548 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"4d303ae2-764c-42f1-afaa-d099c91b5ac4\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.234981 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"4d303ae2-764c-42f1-afaa-d099c91b5ac4\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.248644 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4d303ae2-764c-42f1-afaa-d099c91b5ac4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4d303ae2-764c-42f1-afaa-d099c91b5ac4\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.248879 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d303ae2-764c-42f1-afaa-d099c91b5ac4-logs\") pod \"glance-default-internal-api-0\" (UID: \"4d303ae2-764c-42f1-afaa-d099c91b5ac4\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.272616 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d303ae2-764c-42f1-afaa-d099c91b5ac4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4d303ae2-764c-42f1-afaa-d099c91b5ac4\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.280995 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"5731a76d-bd25-4a51-acfc-7dfd031eef35\") " pod="openstack/glance-default-external-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.281052 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d303ae2-764c-42f1-afaa-d099c91b5ac4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4d303ae2-764c-42f1-afaa-d099c91b5ac4\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.281968 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d303ae2-764c-42f1-afaa-d099c91b5ac4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4d303ae2-764c-42f1-afaa-d099c91b5ac4\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.282479 4901 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-tgmtn"] Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.286565 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d303ae2-764c-42f1-afaa-d099c91b5ac4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4d303ae2-764c-42f1-afaa-d099c91b5ac4\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.291818 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqkbm\" (UniqueName: \"kubernetes.io/projected/4d303ae2-764c-42f1-afaa-d099c91b5ac4-kube-api-access-bqkbm\") pod \"glance-default-internal-api-0\" (UID: \"4d303ae2-764c-42f1-afaa-d099c91b5ac4\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.297807 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-2351-account-create-update-q5sln"] Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.314455 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7dcd8c5f77-l2jdd"] Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.324644 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.325287 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5d675845dc-qtbq4"] Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.340125 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-sbl76"] Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.349205 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-680b-account-create-update-fxm6v"] Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.378102 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"4d303ae2-764c-42f1-afaa-d099c91b5ac4\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.378739 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-9ddr2"] Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.401969 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.410323 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-54947664d6-2bv6x"] Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.568847 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9ddr2" event={"ID":"80f838bb-a2da-496c-9338-50c97222d215","Type":"ContainerStarted","Data":"47457ed89ba3a5708b9b8f725cd3f08dd9601a874c1dafe5a6eb2c168ee73015"} Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.576489 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-79dbc4cd68-ltht2" event={"ID":"b47d3039-22e1-42c8-b23f-9c5f6dcb51f6","Type":"ContainerStarted","Data":"fd226cd27e24544524e26c944b7231b2be7d443e2ca4585a6eb74f6b2726df0e"} Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.576543 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-79dbc4cd68-ltht2" event={"ID":"b47d3039-22e1-42c8-b23f-9c5f6dcb51f6","Type":"ContainerStarted","Data":"8f312801a381af4d93a22c07e22fcbab1f152ef6581919cb32ae74b1216ce92a"} Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.579049 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-79dbc4cd68-ltht2" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.605343 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-79dbc4cd68-ltht2" podStartSLOduration=8.605324895999999 podStartE2EDuration="8.605324896s" podCreationTimestamp="2026-02-02 10:57:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:57:17.59905788 +0000 UTC m=+1124.617397976" watchObservedRunningTime="2026-02-02 10:57:17.605324896 +0000 UTC m=+1124.623664992" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.638670 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-54947664d6-2bv6x" event={"ID":"fdf4e65f-ea75-489a-9183-e0a9f290345e","Type":"ContainerStarted","Data":"8b4f7de9b227163c36b2945db793c2a57e2993614c5001d3dad833f2de0b794d"} Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.648730 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6548466b85-x76qz" event={"ID":"5acf34a1-ce29-4301-bd5e-e6792dff572d","Type":"ContainerStarted","Data":"3a7fa64c0e49bbac1413b61170e4a047a249ac795a7782b9ae17a1c0bdb200c9"} Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.654097 4901 generic.go:334] "Generic (PLEG): container finished" podID="df6764f5-9f09-4b5a-bc5f-7d212c713ae8" containerID="2c8f2fdc80d2f7dcfdbc3cb555e2560c637c39277353ad397d669d7633a69a74" exitCode=0 Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.654190 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f784b5584-t7x4s" event={"ID":"df6764f5-9f09-4b5a-bc5f-7d212c713ae8","Type":"ContainerDied","Data":"2c8f2fdc80d2f7dcfdbc3cb555e2560c637c39277353ad397d669d7633a69a74"} Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.657184 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5d675845dc-qtbq4" event={"ID":"d86c5f78-c1ef-4b91-b731-3b85c4fab45a","Type":"ContainerStarted","Data":"92948b8d79084b86a6841347549820d9a90bf324e36e9fcf5c0f17d6de2c2519"} Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.669080 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-db-create-sbl76" event={"ID":"7b7c9bef-3927-44b7-928a-eb9584ec90ed","Type":"ContainerStarted","Data":"cbd9cffa8016ef610aece3744b0ef16ea72e48f8b3510f630456935e2b27801b"} Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.675769 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2351-account-create-update-q5sln" event={"ID":"d859d976-7f57-49ae-88d5-b6f6d641f470","Type":"ContainerStarted","Data":"ed477f32c4b0228ee54a59c9e2ce28001d0ddb9e11b5588a3270813060d39165"} Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.677863 4901 generic.go:334] "Generic (PLEG): container finished" podID="24d314f1-b9e6-4e62-a599-533cf470eea2" containerID="74a3a02c4d45f49079e11017d06fc4aaed58d9249d701467b17779e608a64b8d" exitCode=0 Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.677926 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6237-account-create-update-fdqwt" event={"ID":"24d314f1-b9e6-4e62-a599-533cf470eea2","Type":"ContainerDied","Data":"74a3a02c4d45f49079e11017d06fc4aaed58d9249d701467b17779e608a64b8d"} Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.707893 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ed6baff-0886-4acb-b063-5807ec75d169" path="/var/lib/kubelet/pods/1ed6baff-0886-4acb-b063-5807ec75d169/volumes" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.714519 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e62ea67-c139-4682-8563-441d8b7aeae6" path="/var/lib/kubelet/pods/5e62ea67-c139-4682-8563-441d8b7aeae6/volumes" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.715308 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcb9c059-9f39-4478-9b04-9417b05f4bef" path="/var/lib/kubelet/pods/bcb9c059-9f39-4478-9b04-9417b05f4bef/volumes" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.717487 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2c6c553-9669-4fc0-a72b-9a528764e7a8" path="/var/lib/kubelet/pods/e2c6c553-9669-4fc0-a72b-9a528764e7a8/volumes" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.718296 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7dcd8c5f77-l2jdd" event={"ID":"ed964b8c-458f-4b9f-8363-55627008bc75","Type":"ContainerStarted","Data":"122de86006846086085e45b3ff6c997e08916148e9174d2f1184e44518a81381"} Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.718326 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tgmtn" event={"ID":"96853bfa-70bb-4ce9-9900-618896081d6d","Type":"ContainerStarted","Data":"f0036eab05ea1b3743558bafc8ea5020f56f9ce140b6001159a65cc785cd2a56"} Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.801883 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-680b-account-create-update-fxm6v" event={"ID":"a1ac84d8-3f94-4dc8-ae48-61ab0cbd238e","Type":"ContainerStarted","Data":"b4bab7a22af360290d3581683b194033e19fa20e495ffef93f963d282d2bc3c3"} Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.853024 4901 generic.go:334] "Generic (PLEG): container finished" podID="169d32ef-3d4c-4f19-861a-afbc638d72df" containerID="ae7f807d1b63b3eedbe8e20fffe0122a2213dc96dbc663b8ffbc58a4d4ad4f2e" exitCode=0 Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.853384 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5cf5f57d85-xv6g6" 
event={"ID":"169d32ef-3d4c-4f19-861a-afbc638d72df","Type":"ContainerDied","Data":"ae7f807d1b63b3eedbe8e20fffe0122a2213dc96dbc663b8ffbc58a4d4ad4f2e"} Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.856160 4901 generic.go:334] "Generic (PLEG): container finished" podID="c747aeb2-de75-49ae-b04e-c2e1cd27b77d" containerID="86e46be019fa316f9871af43167766ec6e02b7f1313dfdfc1afcf183f85daa66" exitCode=0 Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.856210 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-55bf54c5c5-ttz4h" event={"ID":"c747aeb2-de75-49ae-b04e-c2e1cd27b77d","Type":"ContainerDied","Data":"86e46be019fa316f9871af43167766ec6e02b7f1313dfdfc1afcf183f85daa66"} Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.860245 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-96f76df57-tdlmx" event={"ID":"4d2b7428-ac02-4aed-8a90-30cc198e4cca","Type":"ContainerStarted","Data":"397d86a535838ca4c9f6a6a6905b532e44646702e33ee736b64fa83d6c73cf74"} Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.860317 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-96f76df57-tdlmx" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.860335 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-96f76df57-tdlmx" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.871080 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-55bf54c5c5-ttz4h" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.904650 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-96f76df57-tdlmx" podStartSLOduration=12.904631717000001 podStartE2EDuration="12.904631717s" podCreationTimestamp="2026-02-02 10:57:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:57:17.890860712 +0000 UTC m=+1124.909200798" watchObservedRunningTime="2026-02-02 10:57:17.904631717 +0000 UTC m=+1124.922971813" Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.965968 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c747aeb2-de75-49ae-b04e-c2e1cd27b77d-combined-ca-bundle\") pod \"c747aeb2-de75-49ae-b04e-c2e1cd27b77d\" (UID: \"c747aeb2-de75-49ae-b04e-c2e1cd27b77d\") " Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.966257 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c747aeb2-de75-49ae-b04e-c2e1cd27b77d-config-data\") pod \"c747aeb2-de75-49ae-b04e-c2e1cd27b77d\" (UID: \"c747aeb2-de75-49ae-b04e-c2e1cd27b77d\") " Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.966299 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c747aeb2-de75-49ae-b04e-c2e1cd27b77d-config-data-custom\") pod \"c747aeb2-de75-49ae-b04e-c2e1cd27b77d\" (UID: \"c747aeb2-de75-49ae-b04e-c2e1cd27b77d\") " Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.966434 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cffg7\" (UniqueName: \"kubernetes.io/projected/c747aeb2-de75-49ae-b04e-c2e1cd27b77d-kube-api-access-cffg7\") pod \"c747aeb2-de75-49ae-b04e-c2e1cd27b77d\" (UID: 
\"c747aeb2-de75-49ae-b04e-c2e1cd27b77d\") " Feb 02 10:57:17 crc kubenswrapper[4901]: I0202 10:57:17.983821 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c747aeb2-de75-49ae-b04e-c2e1cd27b77d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c747aeb2-de75-49ae-b04e-c2e1cd27b77d" (UID: "c747aeb2-de75-49ae-b04e-c2e1cd27b77d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:57:18 crc kubenswrapper[4901]: I0202 10:57:18.008887 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c747aeb2-de75-49ae-b04e-c2e1cd27b77d-kube-api-access-cffg7" (OuterVolumeSpecName: "kube-api-access-cffg7") pod "c747aeb2-de75-49ae-b04e-c2e1cd27b77d" (UID: "c747aeb2-de75-49ae-b04e-c2e1cd27b77d"). InnerVolumeSpecName "kube-api-access-cffg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:57:18 crc kubenswrapper[4901]: I0202 10:57:18.072190 4901 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c747aeb2-de75-49ae-b04e-c2e1cd27b77d-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:18 crc kubenswrapper[4901]: I0202 10:57:18.072308 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cffg7\" (UniqueName: \"kubernetes.io/projected/c747aeb2-de75-49ae-b04e-c2e1cd27b77d-kube-api-access-cffg7\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:18 crc kubenswrapper[4901]: I0202 10:57:18.098513 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c747aeb2-de75-49ae-b04e-c2e1cd27b77d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c747aeb2-de75-49ae-b04e-c2e1cd27b77d" (UID: "c747aeb2-de75-49ae-b04e-c2e1cd27b77d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:57:18 crc kubenswrapper[4901]: I0202 10:57:18.185963 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c747aeb2-de75-49ae-b04e-c2e1cd27b77d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:18 crc kubenswrapper[4901]: I0202 10:57:18.200956 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:57:18 crc kubenswrapper[4901]: I0202 10:57:18.202042 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-5cf5f57d85-xv6g6" Feb 02 10:57:18 crc kubenswrapper[4901]: W0202 10:57:18.251304 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9f5e41c_41a8_48c7_becf_3c621090fec2.slice/crio-6c00f1d68e51fc5afa1ad8ba8785fda94aa0eacec69771c9d042ec25985b3fae WatchSource:0}: Error finding container 6c00f1d68e51fc5afa1ad8ba8785fda94aa0eacec69771c9d042ec25985b3fae: Status 404 returned error can't find the container with id 6c00f1d68e51fc5afa1ad8ba8785fda94aa0eacec69771c9d042ec25985b3fae Feb 02 10:57:18 crc kubenswrapper[4901]: I0202 10:57:18.288024 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/169d32ef-3d4c-4f19-861a-afbc638d72df-config-data-custom\") pod \"169d32ef-3d4c-4f19-861a-afbc638d72df\" (UID: \"169d32ef-3d4c-4f19-861a-afbc638d72df\") " Feb 02 10:57:18 crc kubenswrapper[4901]: I0202 10:57:18.288433 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/169d32ef-3d4c-4f19-861a-afbc638d72df-config-data\") pod \"169d32ef-3d4c-4f19-861a-afbc638d72df\" (UID: \"169d32ef-3d4c-4f19-861a-afbc638d72df\") " Feb 02 10:57:18 crc kubenswrapper[4901]: I0202 10:57:18.288609 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdrhl\" (UniqueName: \"kubernetes.io/projected/169d32ef-3d4c-4f19-861a-afbc638d72df-kube-api-access-wdrhl\") pod \"169d32ef-3d4c-4f19-861a-afbc638d72df\" (UID: \"169d32ef-3d4c-4f19-861a-afbc638d72df\") " Feb 02 10:57:18 crc kubenswrapper[4901]: I0202 10:57:18.288972 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/169d32ef-3d4c-4f19-861a-afbc638d72df-combined-ca-bundle\") pod \"169d32ef-3d4c-4f19-861a-afbc638d72df\" (UID: \"169d32ef-3d4c-4f19-861a-afbc638d72df\") " Feb 02 10:57:18 crc kubenswrapper[4901]: I0202 10:57:18.358409 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/169d32ef-3d4c-4f19-861a-afbc638d72df-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "169d32ef-3d4c-4f19-861a-afbc638d72df" (UID: "169d32ef-3d4c-4f19-861a-afbc638d72df"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:57:18 crc kubenswrapper[4901]: I0202 10:57:18.378688 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/169d32ef-3d4c-4f19-861a-afbc638d72df-kube-api-access-wdrhl" (OuterVolumeSpecName: "kube-api-access-wdrhl") pod "169d32ef-3d4c-4f19-861a-afbc638d72df" (UID: "169d32ef-3d4c-4f19-861a-afbc638d72df"). InnerVolumeSpecName "kube-api-access-wdrhl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:57:18 crc kubenswrapper[4901]: I0202 10:57:18.394925 4901 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/169d32ef-3d4c-4f19-861a-afbc638d72df-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:18 crc kubenswrapper[4901]: I0202 10:57:18.394964 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdrhl\" (UniqueName: \"kubernetes.io/projected/169d32ef-3d4c-4f19-861a-afbc638d72df-kube-api-access-wdrhl\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:18 crc kubenswrapper[4901]: I0202 10:57:18.517291 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 10:57:18 crc kubenswrapper[4901]: W0202 10:57:18.618030 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5731a76d_bd25_4a51_acfc_7dfd031eef35.slice/crio-cc1a377d600f0741767991dd1dcf0f4883a5536454ffa10af4883628e191c9b0 WatchSource:0}: Error finding container cc1a377d600f0741767991dd1dcf0f4883a5536454ffa10af4883628e191c9b0: Status 404 returned error can't find the container with id cc1a377d600f0741767991dd1dcf0f4883a5536454ffa10af4883628e191c9b0 Feb 02 10:57:18 crc kubenswrapper[4901]: I0202 10:57:18.784776 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/169d32ef-3d4c-4f19-861a-afbc638d72df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "169d32ef-3d4c-4f19-861a-afbc638d72df" (UID: "169d32ef-3d4c-4f19-861a-afbc638d72df"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:57:18 crc kubenswrapper[4901]: I0202 10:57:18.816994 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/169d32ef-3d4c-4f19-861a-afbc638d72df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:18 crc kubenswrapper[4901]: I0202 10:57:18.880589 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9f5e41c-41a8-48c7-becf-3c621090fec2","Type":"ContainerStarted","Data":"6c00f1d68e51fc5afa1ad8ba8785fda94aa0eacec69771c9d042ec25985b3fae"} Feb 02 10:57:18 crc kubenswrapper[4901]: I0202 10:57:18.887945 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-96f76df57-tdlmx" event={"ID":"4d2b7428-ac02-4aed-8a90-30cc198e4cca","Type":"ContainerStarted","Data":"b59a8adf605e695b0653bc05ad220d3068846da78ff2d4fc94e2ef09d570ad30"} Feb 02 10:57:18 crc kubenswrapper[4901]: I0202 10:57:18.898824 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5d675845dc-qtbq4" event={"ID":"d86c5f78-c1ef-4b91-b731-3b85c4fab45a","Type":"ContainerStarted","Data":"173ecbb71e16425dace4a364eb613cdf9937b09ad07c3b3f815c474f0a8d0543"} Feb 02 10:57:18 crc kubenswrapper[4901]: I0202 10:57:18.899976 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-5d675845dc-qtbq4" Feb 02 10:57:18 crc kubenswrapper[4901]: I0202 10:57:18.908792 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5cf5f57d85-xv6g6" event={"ID":"169d32ef-3d4c-4f19-861a-afbc638d72df","Type":"ContainerDied","Data":"fd142998a0eca5d50b2ff238658e4325174d7f7413fee5b10c5b16ec59bc644f"} Feb 02 10:57:18 crc kubenswrapper[4901]: I0202 10:57:18.908858 4901 scope.go:117] "RemoveContainer" 
containerID="ae7f807d1b63b3eedbe8e20fffe0122a2213dc96dbc663b8ffbc58a4d4ad4f2e" Feb 02 10:57:18 crc kubenswrapper[4901]: I0202 10:57:18.908922 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5cf5f57d85-xv6g6" Feb 02 10:57:18 crc kubenswrapper[4901]: I0202 10:57:18.914820 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7dcd8c5f77-l2jdd" event={"ID":"ed964b8c-458f-4b9f-8363-55627008bc75","Type":"ContainerStarted","Data":"f20f5c49833432a963b01b803587bc3d501d812a49185a0c664845c5eaa95802"} Feb 02 10:57:18 crc kubenswrapper[4901]: I0202 10:57:18.915271 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-7dcd8c5f77-l2jdd" Feb 02 10:57:18 crc kubenswrapper[4901]: I0202 10:57:18.924899 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5731a76d-bd25-4a51-acfc-7dfd031eef35","Type":"ContainerStarted","Data":"cc1a377d600f0741767991dd1dcf0f4883a5536454ffa10af4883628e191c9b0"} Feb 02 10:57:18 crc kubenswrapper[4901]: I0202 10:57:18.939776 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-5d675845dc-qtbq4" podStartSLOduration=9.939752986 podStartE2EDuration="9.939752986s" podCreationTimestamp="2026-02-02 10:57:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:57:18.926132316 +0000 UTC m=+1125.944472402" watchObservedRunningTime="2026-02-02 10:57:18.939752986 +0000 UTC m=+1125.958093082" Feb 02 10:57:18 crc kubenswrapper[4901]: I0202 10:57:18.942243 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6548466b85-x76qz" event={"ID":"5acf34a1-ce29-4301-bd5e-e6792dff572d","Type":"ContainerStarted","Data":"e7d28f3896a8515754fac4b0226fc1026ebf8c967790e13f4dc692a67734ff51"} Feb 02 10:57:18 crc kubenswrapper[4901]: I0202 10:57:18.943908 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-6548466b85-x76qz" Feb 02 10:57:18 crc kubenswrapper[4901]: I0202 10:57:18.957014 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-55bf54c5c5-ttz4h" event={"ID":"c747aeb2-de75-49ae-b04e-c2e1cd27b77d","Type":"ContainerDied","Data":"78321197b0942735b01b21375c95a36ef6b388732e014334331cc9a7a1296269"} Feb 02 10:57:18 crc kubenswrapper[4901]: I0202 10:57:18.957130 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-55bf54c5c5-ttz4h" Feb 02 10:57:18 crc kubenswrapper[4901]: I0202 10:57:18.962453 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9ddr2" event={"ID":"80f838bb-a2da-496c-9338-50c97222d215","Type":"ContainerStarted","Data":"252fa59c24152159ccf9762035b8bb767d7428a009f8e8be86aef2993df194a1"} Feb 02 10:57:18 crc kubenswrapper[4901]: I0202 10:57:18.970351 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-7dcd8c5f77-l2jdd" podStartSLOduration=7.97033531 podStartE2EDuration="7.97033531s" podCreationTimestamp="2026-02-02 10:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:57:18.947144431 +0000 UTC m=+1125.965484527" watchObservedRunningTime="2026-02-02 10:57:18.97033531 +0000 UTC m=+1125.988675406" Feb 02 10:57:18 crc kubenswrapper[4901]: I0202 10:57:18.992459 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-6548466b85-x76qz" podStartSLOduration=7.992438453 podStartE2EDuration="7.992438453s" podCreationTimestamp="2026-02-02 10:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:57:18.980978716 +0000 UTC m=+1125.999318812" watchObservedRunningTime="2026-02-02 10:57:18.992438453 +0000 UTC m=+1126.010778549" Feb 02 10:57:19 crc kubenswrapper[4901]: I0202 10:57:19.014776 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-9ddr2" podStartSLOduration=9.01473302 podStartE2EDuration="9.01473302s" podCreationTimestamp="2026-02-02 10:57:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:57:19.012394201 +0000 UTC m=+1126.030734297" watchObservedRunningTime="2026-02-02 10:57:19.01473302 +0000 UTC m=+1126.033073116" Feb 02 10:57:19 crc kubenswrapper[4901]: I0202 10:57:19.191453 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c747aeb2-de75-49ae-b04e-c2e1cd27b77d-config-data" (OuterVolumeSpecName: "config-data") pod "c747aeb2-de75-49ae-b04e-c2e1cd27b77d" (UID: "c747aeb2-de75-49ae-b04e-c2e1cd27b77d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:57:19 crc kubenswrapper[4901]: I0202 10:57:19.231361 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c747aeb2-de75-49ae-b04e-c2e1cd27b77d-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:19 crc kubenswrapper[4901]: I0202 10:57:19.267778 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/169d32ef-3d4c-4f19-861a-afbc638d72df-config-data" (OuterVolumeSpecName: "config-data") pod "169d32ef-3d4c-4f19-861a-afbc638d72df" (UID: "169d32ef-3d4c-4f19-861a-afbc638d72df"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:57:19 crc kubenswrapper[4901]: I0202 10:57:19.336549 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/169d32ef-3d4c-4f19-861a-afbc638d72df-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:19 crc kubenswrapper[4901]: I0202 10:57:19.431443 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 10:57:19 crc kubenswrapper[4901]: I0202 10:57:19.541477 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-55bf54c5c5-ttz4h"] Feb 02 10:57:19 crc kubenswrapper[4901]: I0202 10:57:19.582231 4901 scope.go:117] "RemoveContainer" containerID="86e46be019fa316f9871af43167766ec6e02b7f1313dfdfc1afcf183f85daa66" Feb 02 10:57:19 crc kubenswrapper[4901]: I0202 10:57:19.585991 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-55bf54c5c5-ttz4h"] Feb 02 10:57:19 crc kubenswrapper[4901]: I0202 10:57:19.705326 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c747aeb2-de75-49ae-b04e-c2e1cd27b77d" path="/var/lib/kubelet/pods/c747aeb2-de75-49ae-b04e-c2e1cd27b77d/volumes" Feb 02 10:57:19 crc kubenswrapper[4901]: I0202 10:57:19.741208 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6237-account-create-update-fdqwt" Feb 02 10:57:19 crc kubenswrapper[4901]: I0202 10:57:19.744136 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5cf5f57d85-xv6g6"] Feb 02 10:57:19 crc kubenswrapper[4901]: I0202 10:57:19.762437 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-5cf5f57d85-xv6g6"] Feb 02 10:57:19 crc kubenswrapper[4901]: I0202 10:57:19.846421 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24d314f1-b9e6-4e62-a599-533cf470eea2-operator-scripts\") pod \"24d314f1-b9e6-4e62-a599-533cf470eea2\" (UID: \"24d314f1-b9e6-4e62-a599-533cf470eea2\") " Feb 02 10:57:19 crc kubenswrapper[4901]: I0202 10:57:19.846761 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5x6cc\" (UniqueName: \"kubernetes.io/projected/24d314f1-b9e6-4e62-a599-533cf470eea2-kube-api-access-5x6cc\") pod \"24d314f1-b9e6-4e62-a599-533cf470eea2\" (UID: \"24d314f1-b9e6-4e62-a599-533cf470eea2\") " Feb 02 10:57:19 crc kubenswrapper[4901]: I0202 10:57:19.849880 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24d314f1-b9e6-4e62-a599-533cf470eea2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "24d314f1-b9e6-4e62-a599-533cf470eea2" (UID: "24d314f1-b9e6-4e62-a599-533cf470eea2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:57:19 crc kubenswrapper[4901]: I0202 10:57:19.858779 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24d314f1-b9e6-4e62-a599-533cf470eea2-kube-api-access-5x6cc" (OuterVolumeSpecName: "kube-api-access-5x6cc") pod "24d314f1-b9e6-4e62-a599-533cf470eea2" (UID: "24d314f1-b9e6-4e62-a599-533cf470eea2"). InnerVolumeSpecName "kube-api-access-5x6cc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:57:19 crc kubenswrapper[4901]: I0202 10:57:19.949769 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24d314f1-b9e6-4e62-a599-533cf470eea2-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:19 crc kubenswrapper[4901]: I0202 10:57:19.949799 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5x6cc\" (UniqueName: \"kubernetes.io/projected/24d314f1-b9e6-4e62-a599-533cf470eea2-kube-api-access-5x6cc\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:20 crc kubenswrapper[4901]: I0202 10:57:20.013120 4901 generic.go:334] "Generic (PLEG): container finished" podID="d86c5f78-c1ef-4b91-b731-3b85c4fab45a" containerID="173ecbb71e16425dace4a364eb613cdf9937b09ad07c3b3f815c474f0a8d0543" exitCode=1 Feb 02 10:57:20 crc kubenswrapper[4901]: I0202 10:57:20.013206 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5d675845dc-qtbq4" event={"ID":"d86c5f78-c1ef-4b91-b731-3b85c4fab45a","Type":"ContainerDied","Data":"173ecbb71e16425dace4a364eb613cdf9937b09ad07c3b3f815c474f0a8d0543"} Feb 02 10:57:20 crc kubenswrapper[4901]: I0202 10:57:20.015851 4901 scope.go:117] "RemoveContainer" containerID="173ecbb71e16425dace4a364eb613cdf9937b09ad07c3b3f815c474f0a8d0543" Feb 02 10:57:20 crc kubenswrapper[4901]: I0202 10:57:20.024846 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6237-account-create-update-fdqwt" event={"ID":"24d314f1-b9e6-4e62-a599-533cf470eea2","Type":"ContainerDied","Data":"9ad02b862d3dbb0bc1a115819634cf969a04266cdc0966595d8186ba85588656"} Feb 02 10:57:20 crc kubenswrapper[4901]: I0202 10:57:20.024882 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ad02b862d3dbb0bc1a115819634cf969a04266cdc0966595d8186ba85588656" Feb 02 10:57:20 crc kubenswrapper[4901]: I0202 10:57:20.024954 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-6237-account-create-update-fdqwt" Feb 02 10:57:20 crc kubenswrapper[4901]: I0202 10:57:20.033795 4901 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-5d675845dc-qtbq4" Feb 02 10:57:20 crc kubenswrapper[4901]: I0202 10:57:20.037464 4901 generic.go:334] "Generic (PLEG): container finished" podID="d859d976-7f57-49ae-88d5-b6f6d641f470" containerID="ed65b107cc9d437e10c49c78175628db331a431bc3fbf2c78686746dd9b4bcd5" exitCode=0 Feb 02 10:57:20 crc kubenswrapper[4901]: I0202 10:57:20.037639 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2351-account-create-update-q5sln" event={"ID":"d859d976-7f57-49ae-88d5-b6f6d641f470","Type":"ContainerDied","Data":"ed65b107cc9d437e10c49c78175628db331a431bc3fbf2c78686746dd9b4bcd5"} Feb 02 10:57:20 crc kubenswrapper[4901]: I0202 10:57:20.074485 4901 generic.go:334] "Generic (PLEG): container finished" podID="80f838bb-a2da-496c-9338-50c97222d215" containerID="252fa59c24152159ccf9762035b8bb767d7428a009f8e8be86aef2993df194a1" exitCode=0 Feb 02 10:57:20 crc kubenswrapper[4901]: I0202 10:57:20.074937 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9ddr2" event={"ID":"80f838bb-a2da-496c-9338-50c97222d215","Type":"ContainerDied","Data":"252fa59c24152159ccf9762035b8bb767d7428a009f8e8be86aef2993df194a1"} Feb 02 10:57:20 crc kubenswrapper[4901]: I0202 10:57:20.092734 4901 generic.go:334] "Generic (PLEG): container finished" podID="96853bfa-70bb-4ce9-9900-618896081d6d" containerID="a3707cc47d224b422f8095a177de8a5f9ee619c50d20851c0754e43d40ce0a4e" exitCode=0 Feb 02 10:57:20 crc kubenswrapper[4901]: I0202 10:57:20.092847 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tgmtn" event={"ID":"96853bfa-70bb-4ce9-9900-618896081d6d","Type":"ContainerDied","Data":"a3707cc47d224b422f8095a177de8a5f9ee619c50d20851c0754e43d40ce0a4e"} Feb 02 10:57:20 crc kubenswrapper[4901]: I0202 10:57:20.104298 4901 generic.go:334] "Generic (PLEG): container finished" podID="7b7c9bef-3927-44b7-928a-eb9584ec90ed" containerID="24da6466b7882c96be9d5883619b05a6b9bf9a76130936323ed843e7e44b2e14" exitCode=0 Feb 02 10:57:20 crc kubenswrapper[4901]: I0202 10:57:20.104382 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-sbl76" event={"ID":"7b7c9bef-3927-44b7-928a-eb9584ec90ed","Type":"ContainerDied","Data":"24da6466b7882c96be9d5883619b05a6b9bf9a76130936323ed843e7e44b2e14"} Feb 02 10:57:20 crc kubenswrapper[4901]: I0202 10:57:20.115594 4901 generic.go:334] "Generic (PLEG): container finished" podID="fdf4e65f-ea75-489a-9183-e0a9f290345e" containerID="6e587a1f28bedfe4eaad33842d0a9e3bd12c49375f764ba8cb5b25b41d8189a7" exitCode=1 Feb 02 10:57:20 crc kubenswrapper[4901]: I0202 10:57:20.116331 4901 scope.go:117] "RemoveContainer" containerID="6e587a1f28bedfe4eaad33842d0a9e3bd12c49375f764ba8cb5b25b41d8189a7" Feb 02 10:57:20 crc kubenswrapper[4901]: I0202 10:57:20.116844 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-54947664d6-2bv6x" event={"ID":"fdf4e65f-ea75-489a-9183-e0a9f290345e","Type":"ContainerDied","Data":"6e587a1f28bedfe4eaad33842d0a9e3bd12c49375f764ba8cb5b25b41d8189a7"} Feb 02 10:57:20 crc kubenswrapper[4901]: I0202 10:57:20.129105 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"d9f5e41c-41a8-48c7-becf-3c621090fec2","Type":"ContainerStarted","Data":"fa9fc042c0e550fed3d6ac593a9f5637942cc659c391f62fdf644d0fc9b6339f"} Feb 02 10:57:20 crc kubenswrapper[4901]: I0202 10:57:20.138342 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4d303ae2-764c-42f1-afaa-d099c91b5ac4","Type":"ContainerStarted","Data":"00bfc44d7c6761bdbcdce474d479ed2ef91ff9de3e96150a760c20a1d3dc691c"} Feb 02 10:57:20 crc kubenswrapper[4901]: I0202 10:57:20.208393 4901 generic.go:334] "Generic (PLEG): container finished" podID="a1ac84d8-3f94-4dc8-ae48-61ab0cbd238e" containerID="3e42b46e5f7ae441b51e74662c769f4b33bf1d42c810cbadf0dcff123b843238" exitCode=0 Feb 02 10:57:20 crc kubenswrapper[4901]: I0202 10:57:20.208742 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-680b-account-create-update-fxm6v" event={"ID":"a1ac84d8-3f94-4dc8-ae48-61ab0cbd238e","Type":"ContainerDied","Data":"3e42b46e5f7ae441b51e74662c769f4b33bf1d42c810cbadf0dcff123b843238"} Feb 02 10:57:21 crc kubenswrapper[4901]: I0202 10:57:21.247798 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5731a76d-bd25-4a51-acfc-7dfd031eef35","Type":"ContainerStarted","Data":"ca0c935df19ddabf87837e93cc3591883db07606370eacdde4dc796517db5f1f"} Feb 02 10:57:21 crc kubenswrapper[4901]: I0202 10:57:21.252396 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4d303ae2-764c-42f1-afaa-d099c91b5ac4","Type":"ContainerStarted","Data":"3ad83b1e5e174872eb3078c78dcfbc5042dfeacd5bcc5c996c191d440bcee81b"} Feb 02 10:57:21 crc kubenswrapper[4901]: I0202 10:57:21.265262 4901 generic.go:334] "Generic (PLEG): container finished" podID="d86c5f78-c1ef-4b91-b731-3b85c4fab45a" containerID="2e942f5f4656497155ce266bee1565c165fd07167a35b6565aa0bd402e76e1ab" exitCode=1 Feb 02 10:57:21 crc kubenswrapper[4901]: I0202 10:57:21.265341 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5d675845dc-qtbq4" event={"ID":"d86c5f78-c1ef-4b91-b731-3b85c4fab45a","Type":"ContainerDied","Data":"2e942f5f4656497155ce266bee1565c165fd07167a35b6565aa0bd402e76e1ab"} Feb 02 10:57:21 crc kubenswrapper[4901]: I0202 10:57:21.265406 4901 scope.go:117] "RemoveContainer" containerID="173ecbb71e16425dace4a364eb613cdf9937b09ad07c3b3f815c474f0a8d0543" Feb 02 10:57:21 crc kubenswrapper[4901]: I0202 10:57:21.266134 4901 scope.go:117] "RemoveContainer" containerID="2e942f5f4656497155ce266bee1565c165fd07167a35b6565aa0bd402e76e1ab" Feb 02 10:57:21 crc kubenswrapper[4901]: E0202 10:57:21.266331 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-5d675845dc-qtbq4_openstack(d86c5f78-c1ef-4b91-b731-3b85c4fab45a)\"" pod="openstack/heat-cfnapi-5d675845dc-qtbq4" podUID="d86c5f78-c1ef-4b91-b731-3b85c4fab45a" Feb 02 10:57:21 crc kubenswrapper[4901]: I0202 10:57:21.289994 4901 generic.go:334] "Generic (PLEG): container finished" podID="fdf4e65f-ea75-489a-9183-e0a9f290345e" containerID="d13f5ad375873084238af091b4a0132499b19492f45e8db13e63420e375a668e" exitCode=1 Feb 02 10:57:21 crc kubenswrapper[4901]: I0202 10:57:21.290117 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-54947664d6-2bv6x" 
event={"ID":"fdf4e65f-ea75-489a-9183-e0a9f290345e","Type":"ContainerDied","Data":"d13f5ad375873084238af091b4a0132499b19492f45e8db13e63420e375a668e"} Feb 02 10:57:21 crc kubenswrapper[4901]: I0202 10:57:21.290792 4901 scope.go:117] "RemoveContainer" containerID="d13f5ad375873084238af091b4a0132499b19492f45e8db13e63420e375a668e" Feb 02 10:57:21 crc kubenswrapper[4901]: E0202 10:57:21.291012 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-54947664d6-2bv6x_openstack(fdf4e65f-ea75-489a-9183-e0a9f290345e)\"" pod="openstack/heat-api-54947664d6-2bv6x" podUID="fdf4e65f-ea75-489a-9183-e0a9f290345e" Feb 02 10:57:21 crc kubenswrapper[4901]: I0202 10:57:21.311934 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9f5e41c-41a8-48c7-becf-3c621090fec2","Type":"ContainerStarted","Data":"30df9816b14bad092dad59de8c4e1d42dda41e82c0bfd37d000951f695f6f193"} Feb 02 10:57:21 crc kubenswrapper[4901]: I0202 10:57:21.329618 4901 scope.go:117] "RemoveContainer" containerID="6e587a1f28bedfe4eaad33842d0a9e3bd12c49375f764ba8cb5b25b41d8189a7" Feb 02 10:57:21 crc kubenswrapper[4901]: I0202 10:57:21.733317 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="169d32ef-3d4c-4f19-861a-afbc638d72df" path="/var/lib/kubelet/pods/169d32ef-3d4c-4f19-861a-afbc638d72df/volumes" Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.169208 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-sbl76" Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.259301 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lnsx\" (UniqueName: \"kubernetes.io/projected/7b7c9bef-3927-44b7-928a-eb9584ec90ed-kube-api-access-6lnsx\") pod \"7b7c9bef-3927-44b7-928a-eb9584ec90ed\" (UID: \"7b7c9bef-3927-44b7-928a-eb9584ec90ed\") " Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.259416 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b7c9bef-3927-44b7-928a-eb9584ec90ed-operator-scripts\") pod \"7b7c9bef-3927-44b7-928a-eb9584ec90ed\" (UID: \"7b7c9bef-3927-44b7-928a-eb9584ec90ed\") " Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.261008 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b7c9bef-3927-44b7-928a-eb9584ec90ed-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7b7c9bef-3927-44b7-928a-eb9584ec90ed" (UID: "7b7c9bef-3927-44b7-928a-eb9584ec90ed"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.283784 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b7c9bef-3927-44b7-928a-eb9584ec90ed-kube-api-access-6lnsx" (OuterVolumeSpecName: "kube-api-access-6lnsx") pod "7b7c9bef-3927-44b7-928a-eb9584ec90ed" (UID: "7b7c9bef-3927-44b7-928a-eb9584ec90ed"). InnerVolumeSpecName "kube-api-access-6lnsx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.362448 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lnsx\" (UniqueName: \"kubernetes.io/projected/7b7c9bef-3927-44b7-928a-eb9584ec90ed-kube-api-access-6lnsx\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.362476 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b7c9bef-3927-44b7-928a-eb9584ec90ed-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.364427 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9f5e41c-41a8-48c7-becf-3c621090fec2","Type":"ContainerStarted","Data":"536495a3c4602f8e894018503e2b09d10315a123f7ec8a0996b1981dc90ea86b"} Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.382961 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-tgmtn" Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.400039 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2351-account-create-update-q5sln" Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.400217 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5731a76d-bd25-4a51-acfc-7dfd031eef35","Type":"ContainerStarted","Data":"b92443d330fa2dad512510b89e83ff87dcc812de627ff3e4c41543fe77702f78"} Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.412840 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-680b-account-create-update-fxm6v" Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.414744 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4d303ae2-764c-42f1-afaa-d099c91b5ac4","Type":"ContainerStarted","Data":"73ea0c9e74b5707f616f1156b857adf0e8f0cb4c690d078084d16aac06f8df7e"} Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.419511 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9ddr2" Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.432362 4901 scope.go:117] "RemoveContainer" containerID="2e942f5f4656497155ce266bee1565c165fd07167a35b6565aa0bd402e76e1ab" Feb 02 10:57:22 crc kubenswrapper[4901]: E0202 10:57:22.432619 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-5d675845dc-qtbq4_openstack(d86c5f78-c1ef-4b91-b731-3b85c4fab45a)\"" pod="openstack/heat-cfnapi-5d675845dc-qtbq4" podUID="d86c5f78-c1ef-4b91-b731-3b85c4fab45a" Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.453969 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-sbl76" Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.455073 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-sbl76" event={"ID":"7b7c9bef-3927-44b7-928a-eb9584ec90ed","Type":"ContainerDied","Data":"cbd9cffa8016ef610aece3744b0ef16ea72e48f8b3510f630456935e2b27801b"} Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.455130 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbd9cffa8016ef610aece3744b0ef16ea72e48f8b3510f630456935e2b27801b" Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.465441 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87wfb\" (UniqueName: \"kubernetes.io/projected/96853bfa-70bb-4ce9-9900-618896081d6d-kube-api-access-87wfb\") pod \"96853bfa-70bb-4ce9-9900-618896081d6d\" (UID: \"96853bfa-70bb-4ce9-9900-618896081d6d\") " Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.465637 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96853bfa-70bb-4ce9-9900-618896081d6d-operator-scripts\") pod \"96853bfa-70bb-4ce9-9900-618896081d6d\" (UID: \"96853bfa-70bb-4ce9-9900-618896081d6d\") " Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.466252 4901 scope.go:117] "RemoveContainer" containerID="d13f5ad375873084238af091b4a0132499b19492f45e8db13e63420e375a668e" Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.466373 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96853bfa-70bb-4ce9-9900-618896081d6d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "96853bfa-70bb-4ce9-9900-618896081d6d" (UID: "96853bfa-70bb-4ce9-9900-618896081d6d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.481786 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96853bfa-70bb-4ce9-9900-618896081d6d-kube-api-access-87wfb" (OuterVolumeSpecName: "kube-api-access-87wfb") pod "96853bfa-70bb-4ce9-9900-618896081d6d" (UID: "96853bfa-70bb-4ce9-9900-618896081d6d"). InnerVolumeSpecName "kube-api-access-87wfb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.502435 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.5024067819999996 podStartE2EDuration="6.502406782s" podCreationTimestamp="2026-02-02 10:57:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:57:22.466396952 +0000 UTC m=+1129.484737048" watchObservedRunningTime="2026-02-02 10:57:22.502406782 +0000 UTC m=+1129.520746878" Feb 02 10:57:22 crc kubenswrapper[4901]: E0202 10:57:22.466500 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-54947664d6-2bv6x_openstack(fdf4e65f-ea75-489a-9183-e0a9f290345e)\"" pod="openstack/heat-api-54947664d6-2bv6x" podUID="fdf4e65f-ea75-489a-9183-e0a9f290345e" Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.570367 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fr7vn\" (UniqueName: \"kubernetes.io/projected/d859d976-7f57-49ae-88d5-b6f6d641f470-kube-api-access-fr7vn\") pod \"d859d976-7f57-49ae-88d5-b6f6d641f470\" (UID: \"d859d976-7f57-49ae-88d5-b6f6d641f470\") " Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.570419 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80f838bb-a2da-496c-9338-50c97222d215-operator-scripts\") pod \"80f838bb-a2da-496c-9338-50c97222d215\" (UID: \"80f838bb-a2da-496c-9338-50c97222d215\") " Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.570455 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d859d976-7f57-49ae-88d5-b6f6d641f470-operator-scripts\") pod \"d859d976-7f57-49ae-88d5-b6f6d641f470\" (UID: \"d859d976-7f57-49ae-88d5-b6f6d641f470\") " Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.570520 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1ac84d8-3f94-4dc8-ae48-61ab0cbd238e-operator-scripts\") pod \"a1ac84d8-3f94-4dc8-ae48-61ab0cbd238e\" (UID: \"a1ac84d8-3f94-4dc8-ae48-61ab0cbd238e\") " Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.570679 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfqkv\" (UniqueName: \"kubernetes.io/projected/80f838bb-a2da-496c-9338-50c97222d215-kube-api-access-rfqkv\") pod \"80f838bb-a2da-496c-9338-50c97222d215\" (UID: \"80f838bb-a2da-496c-9338-50c97222d215\") " Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.570789 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsm8g\" (UniqueName: \"kubernetes.io/projected/a1ac84d8-3f94-4dc8-ae48-61ab0cbd238e-kube-api-access-bsm8g\") pod \"a1ac84d8-3f94-4dc8-ae48-61ab0cbd238e\" (UID: \"a1ac84d8-3f94-4dc8-ae48-61ab0cbd238e\") " Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.571331 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96853bfa-70bb-4ce9-9900-618896081d6d-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 
10:57:22.571344 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87wfb\" (UniqueName: \"kubernetes.io/projected/96853bfa-70bb-4ce9-9900-618896081d6d-kube-api-access-87wfb\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.573498 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d859d976-7f57-49ae-88d5-b6f6d641f470-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d859d976-7f57-49ae-88d5-b6f6d641f470" (UID: "d859d976-7f57-49ae-88d5-b6f6d641f470"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.574797 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1ac84d8-3f94-4dc8-ae48-61ab0cbd238e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a1ac84d8-3f94-4dc8-ae48-61ab0cbd238e" (UID: "a1ac84d8-3f94-4dc8-ae48-61ab0cbd238e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.574955 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80f838bb-a2da-496c-9338-50c97222d215-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "80f838bb-a2da-496c-9338-50c97222d215" (UID: "80f838bb-a2da-496c-9338-50c97222d215"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.578184 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1ac84d8-3f94-4dc8-ae48-61ab0cbd238e-kube-api-access-bsm8g" (OuterVolumeSpecName: "kube-api-access-bsm8g") pod "a1ac84d8-3f94-4dc8-ae48-61ab0cbd238e" (UID: "a1ac84d8-3f94-4dc8-ae48-61ab0cbd238e"). InnerVolumeSpecName "kube-api-access-bsm8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.580081 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d859d976-7f57-49ae-88d5-b6f6d641f470-kube-api-access-fr7vn" (OuterVolumeSpecName: "kube-api-access-fr7vn") pod "d859d976-7f57-49ae-88d5-b6f6d641f470" (UID: "d859d976-7f57-49ae-88d5-b6f6d641f470"). InnerVolumeSpecName "kube-api-access-fr7vn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.580146 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80f838bb-a2da-496c-9338-50c97222d215-kube-api-access-rfqkv" (OuterVolumeSpecName: "kube-api-access-rfqkv") pod "80f838bb-a2da-496c-9338-50c97222d215" (UID: "80f838bb-a2da-496c-9338-50c97222d215"). InnerVolumeSpecName "kube-api-access-rfqkv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.636386 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.63636357 podStartE2EDuration="6.63636357s" podCreationTimestamp="2026-02-02 10:57:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:57:22.627227081 +0000 UTC m=+1129.645567177" watchObservedRunningTime="2026-02-02 10:57:22.63636357 +0000 UTC m=+1129.654703666" Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.673509 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsm8g\" (UniqueName: \"kubernetes.io/projected/a1ac84d8-3f94-4dc8-ae48-61ab0cbd238e-kube-api-access-bsm8g\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.673546 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fr7vn\" (UniqueName: \"kubernetes.io/projected/d859d976-7f57-49ae-88d5-b6f6d641f470-kube-api-access-fr7vn\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.673556 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80f838bb-a2da-496c-9338-50c97222d215-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.673567 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d859d976-7f57-49ae-88d5-b6f6d641f470-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.673586 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1ac84d8-3f94-4dc8-ae48-61ab0cbd238e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:22 crc kubenswrapper[4901]: I0202 10:57:22.673614 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfqkv\" (UniqueName: \"kubernetes.io/projected/80f838bb-a2da-496c-9338-50c97222d215-kube-api-access-rfqkv\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:23 crc kubenswrapper[4901]: I0202 10:57:23.241074 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-758cb7689c-5d5jx" Feb 02 10:57:23 crc kubenswrapper[4901]: I0202 10:57:23.492358 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2351-account-create-update-q5sln" event={"ID":"d859d976-7f57-49ae-88d5-b6f6d641f470","Type":"ContainerDied","Data":"ed477f32c4b0228ee54a59c9e2ce28001d0ddb9e11b5588a3270813060d39165"} Feb 02 10:57:23 crc kubenswrapper[4901]: I0202 10:57:23.492395 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed477f32c4b0228ee54a59c9e2ce28001d0ddb9e11b5588a3270813060d39165" Feb 02 10:57:23 crc kubenswrapper[4901]: I0202 10:57:23.492467 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-2351-account-create-update-q5sln" Feb 02 10:57:23 crc kubenswrapper[4901]: I0202 10:57:23.498309 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9ddr2" event={"ID":"80f838bb-a2da-496c-9338-50c97222d215","Type":"ContainerDied","Data":"47457ed89ba3a5708b9b8f725cd3f08dd9601a874c1dafe5a6eb2c168ee73015"} Feb 02 10:57:23 crc kubenswrapper[4901]: I0202 10:57:23.498342 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47457ed89ba3a5708b9b8f725cd3f08dd9601a874c1dafe5a6eb2c168ee73015" Feb 02 10:57:23 crc kubenswrapper[4901]: I0202 10:57:23.498412 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9ddr2" Feb 02 10:57:23 crc kubenswrapper[4901]: I0202 10:57:23.513828 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-680b-account-create-update-fxm6v" event={"ID":"a1ac84d8-3f94-4dc8-ae48-61ab0cbd238e","Type":"ContainerDied","Data":"b4bab7a22af360290d3581683b194033e19fa20e495ffef93f963d282d2bc3c3"} Feb 02 10:57:23 crc kubenswrapper[4901]: I0202 10:57:23.513893 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4bab7a22af360290d3581683b194033e19fa20e495ffef93f963d282d2bc3c3" Feb 02 10:57:23 crc kubenswrapper[4901]: I0202 10:57:23.513902 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-680b-account-create-update-fxm6v" Feb 02 10:57:23 crc kubenswrapper[4901]: I0202 10:57:23.516593 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-tgmtn" Feb 02 10:57:23 crc kubenswrapper[4901]: I0202 10:57:23.517109 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tgmtn" event={"ID":"96853bfa-70bb-4ce9-9900-618896081d6d","Type":"ContainerDied","Data":"f0036eab05ea1b3743558bafc8ea5020f56f9ce140b6001159a65cc785cd2a56"} Feb 02 10:57:23 crc kubenswrapper[4901]: I0202 10:57:23.517139 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0036eab05ea1b3743558bafc8ea5020f56f9ce140b6001159a65cc785cd2a56" Feb 02 10:57:23 crc kubenswrapper[4901]: I0202 10:57:23.793248 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:57:24 crc kubenswrapper[4901]: I0202 10:57:24.938281 4901 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-54947664d6-2bv6x" Feb 02 10:57:24 crc kubenswrapper[4901]: I0202 10:57:24.938680 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-54947664d6-2bv6x" Feb 02 10:57:24 crc kubenswrapper[4901]: I0202 10:57:24.939409 4901 scope.go:117] "RemoveContainer" containerID="d13f5ad375873084238af091b4a0132499b19492f45e8db13e63420e375a668e" Feb 02 10:57:24 crc kubenswrapper[4901]: E0202 10:57:24.939674 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-54947664d6-2bv6x_openstack(fdf4e65f-ea75-489a-9183-e0a9f290345e)\"" pod="openstack/heat-api-54947664d6-2bv6x" podUID="fdf4e65f-ea75-489a-9183-e0a9f290345e" Feb 02 10:57:25 crc kubenswrapper[4901]: I0202 10:57:25.033755 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-5d675845dc-qtbq4" Feb 02 10:57:25 
crc kubenswrapper[4901]: I0202 10:57:25.034354 4901 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-5d675845dc-qtbq4" Feb 02 10:57:25 crc kubenswrapper[4901]: I0202 10:57:25.035690 4901 scope.go:117] "RemoveContainer" containerID="2e942f5f4656497155ce266bee1565c165fd07167a35b6565aa0bd402e76e1ab" Feb 02 10:57:25 crc kubenswrapper[4901]: E0202 10:57:25.036077 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-5d675845dc-qtbq4_openstack(d86c5f78-c1ef-4b91-b731-3b85c4fab45a)\"" pod="openstack/heat-cfnapi-5d675845dc-qtbq4" podUID="d86c5f78-c1ef-4b91-b731-3b85c4fab45a" Feb 02 10:57:25 crc kubenswrapper[4901]: I0202 10:57:25.578488 4901 scope.go:117] "RemoveContainer" containerID="2e942f5f4656497155ce266bee1565c165fd07167a35b6565aa0bd402e76e1ab" Feb 02 10:57:25 crc kubenswrapper[4901]: E0202 10:57:25.578788 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-5d675845dc-qtbq4_openstack(d86c5f78-c1ef-4b91-b731-3b85c4fab45a)\"" pod="openstack/heat-cfnapi-5d675845dc-qtbq4" podUID="d86c5f78-c1ef-4b91-b731-3b85c4fab45a" Feb 02 10:57:25 crc kubenswrapper[4901]: I0202 10:57:25.579327 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d9f5e41c-41a8-48c7-becf-3c621090fec2" containerName="ceilometer-central-agent" containerID="cri-o://fa9fc042c0e550fed3d6ac593a9f5637942cc659c391f62fdf644d0fc9b6339f" gracePeriod=30 Feb 02 10:57:25 crc kubenswrapper[4901]: I0202 10:57:25.579424 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9f5e41c-41a8-48c7-becf-3c621090fec2","Type":"ContainerStarted","Data":"0c9848cd87ff4f412d6e42e6aaa610b1f16bfd5c00f2746a80d26d14c71ba33b"} Feb 02 10:57:25 crc kubenswrapper[4901]: I0202 10:57:25.579501 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 02 10:57:25 crc kubenswrapper[4901]: I0202 10:57:25.581974 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d9f5e41c-41a8-48c7-becf-3c621090fec2" containerName="sg-core" containerID="cri-o://536495a3c4602f8e894018503e2b09d10315a123f7ec8a0996b1981dc90ea86b" gracePeriod=30 Feb 02 10:57:25 crc kubenswrapper[4901]: I0202 10:57:25.582089 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d9f5e41c-41a8-48c7-becf-3c621090fec2" containerName="proxy-httpd" containerID="cri-o://0c9848cd87ff4f412d6e42e6aaa610b1f16bfd5c00f2746a80d26d14c71ba33b" gracePeriod=30 Feb 02 10:57:25 crc kubenswrapper[4901]: I0202 10:57:25.582028 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d9f5e41c-41a8-48c7-becf-3c621090fec2" containerName="ceilometer-notification-agent" containerID="cri-o://30df9816b14bad092dad59de8c4e1d42dda41e82c0bfd37d000951f695f6f193" gracePeriod=30 Feb 02 10:57:25 crc kubenswrapper[4901]: I0202 10:57:25.612515 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.339650379 podStartE2EDuration="9.612492177s" podCreationTimestamp="2026-02-02 10:57:16 +0000 UTC" firstStartedPulling="2026-02-02 
10:57:18.260184282 +0000 UTC m=+1125.278524378" lastFinishedPulling="2026-02-02 10:57:24.53302608 +0000 UTC m=+1131.551366176" observedRunningTime="2026-02-02 10:57:25.605594575 +0000 UTC m=+1132.623934671" watchObservedRunningTime="2026-02-02 10:57:25.612492177 +0000 UTC m=+1132.630832273" Feb 02 10:57:25 crc kubenswrapper[4901]: I0202 10:57:25.820991 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-96f76df57-tdlmx" Feb 02 10:57:25 crc kubenswrapper[4901]: I0202 10:57:25.835093 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-96f76df57-tdlmx" Feb 02 10:57:26 crc kubenswrapper[4901]: I0202 10:57:26.271650 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-tkq95"] Feb 02 10:57:26 crc kubenswrapper[4901]: E0202 10:57:26.272383 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c747aeb2-de75-49ae-b04e-c2e1cd27b77d" containerName="heat-cfnapi" Feb 02 10:57:26 crc kubenswrapper[4901]: I0202 10:57:26.272398 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="c747aeb2-de75-49ae-b04e-c2e1cd27b77d" containerName="heat-cfnapi" Feb 02 10:57:26 crc kubenswrapper[4901]: E0202 10:57:26.272413 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="169d32ef-3d4c-4f19-861a-afbc638d72df" containerName="heat-api" Feb 02 10:57:26 crc kubenswrapper[4901]: I0202 10:57:26.272419 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="169d32ef-3d4c-4f19-861a-afbc638d72df" containerName="heat-api" Feb 02 10:57:26 crc kubenswrapper[4901]: E0202 10:57:26.272434 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1ac84d8-3f94-4dc8-ae48-61ab0cbd238e" containerName="mariadb-account-create-update" Feb 02 10:57:26 crc kubenswrapper[4901]: I0202 10:57:26.272441 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1ac84d8-3f94-4dc8-ae48-61ab0cbd238e" containerName="mariadb-account-create-update" Feb 02 10:57:26 crc kubenswrapper[4901]: E0202 10:57:26.272451 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24d314f1-b9e6-4e62-a599-533cf470eea2" containerName="mariadb-account-create-update" Feb 02 10:57:26 crc kubenswrapper[4901]: I0202 10:57:26.272457 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="24d314f1-b9e6-4e62-a599-533cf470eea2" containerName="mariadb-account-create-update" Feb 02 10:57:26 crc kubenswrapper[4901]: E0202 10:57:26.272471 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b7c9bef-3927-44b7-928a-eb9584ec90ed" containerName="mariadb-database-create" Feb 02 10:57:26 crc kubenswrapper[4901]: I0202 10:57:26.272477 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b7c9bef-3927-44b7-928a-eb9584ec90ed" containerName="mariadb-database-create" Feb 02 10:57:26 crc kubenswrapper[4901]: E0202 10:57:26.272490 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96853bfa-70bb-4ce9-9900-618896081d6d" containerName="mariadb-database-create" Feb 02 10:57:26 crc kubenswrapper[4901]: I0202 10:57:26.272496 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="96853bfa-70bb-4ce9-9900-618896081d6d" containerName="mariadb-database-create" Feb 02 10:57:26 crc kubenswrapper[4901]: E0202 10:57:26.272504 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d859d976-7f57-49ae-88d5-b6f6d641f470" containerName="mariadb-account-create-update" Feb 02 10:57:26 crc kubenswrapper[4901]: I0202 10:57:26.272509 
Feb 02 10:57:26 crc kubenswrapper[4901]: E0202 10:57:26.272526 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80f838bb-a2da-496c-9338-50c97222d215" containerName="mariadb-database-create"
Feb 02 10:57:26 crc kubenswrapper[4901]: I0202 10:57:26.272531 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="80f838bb-a2da-496c-9338-50c97222d215" containerName="mariadb-database-create"
Feb 02 10:57:26 crc kubenswrapper[4901]: I0202 10:57:26.272713 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1ac84d8-3f94-4dc8-ae48-61ab0cbd238e" containerName="mariadb-account-create-update"
Feb 02 10:57:26 crc kubenswrapper[4901]: I0202 10:57:26.272732 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="169d32ef-3d4c-4f19-861a-afbc638d72df" containerName="heat-api"
Feb 02 10:57:26 crc kubenswrapper[4901]: I0202 10:57:26.272743 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="24d314f1-b9e6-4e62-a599-533cf470eea2" containerName="mariadb-account-create-update"
Feb 02 10:57:26 crc kubenswrapper[4901]: I0202 10:57:26.272753 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="80f838bb-a2da-496c-9338-50c97222d215" containerName="mariadb-database-create"
Feb 02 10:57:26 crc kubenswrapper[4901]: I0202 10:57:26.272764 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="96853bfa-70bb-4ce9-9900-618896081d6d" containerName="mariadb-database-create"
Feb 02 10:57:26 crc kubenswrapper[4901]: I0202 10:57:26.272778 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b7c9bef-3927-44b7-928a-eb9584ec90ed" containerName="mariadb-database-create"
Feb 02 10:57:26 crc kubenswrapper[4901]: I0202 10:57:26.272790 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="c747aeb2-de75-49ae-b04e-c2e1cd27b77d" containerName="heat-cfnapi"
Feb 02 10:57:26 crc kubenswrapper[4901]: I0202 10:57:26.272802 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="d859d976-7f57-49ae-88d5-b6f6d641f470" containerName="mariadb-account-create-update"
Feb 02 10:57:26 crc kubenswrapper[4901]: I0202 10:57:26.273389 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-tkq95"
Feb 02 10:57:26 crc kubenswrapper[4901]: I0202 10:57:26.276129 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Feb 02 10:57:26 crc kubenswrapper[4901]: I0202 10:57:26.276363 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-4tffs"
Feb 02 10:57:26 crc kubenswrapper[4901]: I0202 10:57:26.276881 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Feb 02 10:57:26 crc kubenswrapper[4901]: I0202 10:57:26.300392 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-tkq95"]
Feb 02 10:57:26 crc kubenswrapper[4901]: I0202 10:57:26.379371 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8b136f1-ea46-4ebc-ade6-03f996a089ce-config-data\") pod \"nova-cell0-conductor-db-sync-tkq95\" (UID: \"a8b136f1-ea46-4ebc-ade6-03f996a089ce\") " pod="openstack/nova-cell0-conductor-db-sync-tkq95"
Feb 02 10:57:26 crc kubenswrapper[4901]: I0202 10:57:26.379428 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8b136f1-ea46-4ebc-ade6-03f996a089ce-scripts\") pod \"nova-cell0-conductor-db-sync-tkq95\" (UID: \"a8b136f1-ea46-4ebc-ade6-03f996a089ce\") " pod="openstack/nova-cell0-conductor-db-sync-tkq95"
Feb 02 10:57:26 crc kubenswrapper[4901]: I0202 10:57:26.379481 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkrx4\" (UniqueName: \"kubernetes.io/projected/a8b136f1-ea46-4ebc-ade6-03f996a089ce-kube-api-access-kkrx4\") pod \"nova-cell0-conductor-db-sync-tkq95\" (UID: \"a8b136f1-ea46-4ebc-ade6-03f996a089ce\") " pod="openstack/nova-cell0-conductor-db-sync-tkq95"
Feb 02 10:57:26 crc kubenswrapper[4901]: I0202 10:57:26.379557 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8b136f1-ea46-4ebc-ade6-03f996a089ce-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-tkq95\" (UID: \"a8b136f1-ea46-4ebc-ade6-03f996a089ce\") " pod="openstack/nova-cell0-conductor-db-sync-tkq95"
Feb 02 10:57:26 crc kubenswrapper[4901]: I0202 10:57:26.481211 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8b136f1-ea46-4ebc-ade6-03f996a089ce-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-tkq95\" (UID: \"a8b136f1-ea46-4ebc-ade6-03f996a089ce\") " pod="openstack/nova-cell0-conductor-db-sync-tkq95"
Feb 02 10:57:26 crc kubenswrapper[4901]: I0202 10:57:26.481298 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8b136f1-ea46-4ebc-ade6-03f996a089ce-config-data\") pod \"nova-cell0-conductor-db-sync-tkq95\" (UID: \"a8b136f1-ea46-4ebc-ade6-03f996a089ce\") " pod="openstack/nova-cell0-conductor-db-sync-tkq95"
Feb 02 10:57:26 crc kubenswrapper[4901]: I0202 10:57:26.481364 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8b136f1-ea46-4ebc-ade6-03f996a089ce-scripts\") pod \"nova-cell0-conductor-db-sync-tkq95\" (UID: \"a8b136f1-ea46-4ebc-ade6-03f996a089ce\") " pod="openstack/nova-cell0-conductor-db-sync-tkq95"
Feb 02 10:57:26 crc kubenswrapper[4901]: I0202 10:57:26.481431 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkrx4\" (UniqueName: \"kubernetes.io/projected/a8b136f1-ea46-4ebc-ade6-03f996a089ce-kube-api-access-kkrx4\") pod \"nova-cell0-conductor-db-sync-tkq95\" (UID: \"a8b136f1-ea46-4ebc-ade6-03f996a089ce\") " pod="openstack/nova-cell0-conductor-db-sync-tkq95"
Feb 02 10:57:26 crc kubenswrapper[4901]: I0202 10:57:26.489289 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8b136f1-ea46-4ebc-ade6-03f996a089ce-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-tkq95\" (UID: \"a8b136f1-ea46-4ebc-ade6-03f996a089ce\") " pod="openstack/nova-cell0-conductor-db-sync-tkq95"
Feb 02 10:57:26 crc kubenswrapper[4901]: I0202 10:57:26.489711 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8b136f1-ea46-4ebc-ade6-03f996a089ce-config-data\") pod \"nova-cell0-conductor-db-sync-tkq95\" (UID: \"a8b136f1-ea46-4ebc-ade6-03f996a089ce\") " pod="openstack/nova-cell0-conductor-db-sync-tkq95"
Feb 02 10:57:26 crc kubenswrapper[4901]: I0202 10:57:26.490769 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8b136f1-ea46-4ebc-ade6-03f996a089ce-scripts\") pod \"nova-cell0-conductor-db-sync-tkq95\" (UID: \"a8b136f1-ea46-4ebc-ade6-03f996a089ce\") " pod="openstack/nova-cell0-conductor-db-sync-tkq95"
Feb 02 10:57:26 crc kubenswrapper[4901]: I0202 10:57:26.510463 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkrx4\" (UniqueName: \"kubernetes.io/projected/a8b136f1-ea46-4ebc-ade6-03f996a089ce-kube-api-access-kkrx4\") pod \"nova-cell0-conductor-db-sync-tkq95\" (UID: \"a8b136f1-ea46-4ebc-ade6-03f996a089ce\") " pod="openstack/nova-cell0-conductor-db-sync-tkq95"
Feb 02 10:57:26 crc kubenswrapper[4901]: I0202 10:57:26.589909 4901 generic.go:334] "Generic (PLEG): container finished" podID="d9f5e41c-41a8-48c7-becf-3c621090fec2" containerID="0c9848cd87ff4f412d6e42e6aaa610b1f16bfd5c00f2746a80d26d14c71ba33b" exitCode=0
Feb 02 10:57:26 crc kubenswrapper[4901]: I0202 10:57:26.589944 4901 generic.go:334] "Generic (PLEG): container finished" podID="d9f5e41c-41a8-48c7-becf-3c621090fec2" containerID="536495a3c4602f8e894018503e2b09d10315a123f7ec8a0996b1981dc90ea86b" exitCode=2
Feb 02 10:57:26 crc kubenswrapper[4901]: I0202 10:57:26.589952 4901 generic.go:334] "Generic (PLEG): container finished" podID="d9f5e41c-41a8-48c7-becf-3c621090fec2" containerID="30df9816b14bad092dad59de8c4e1d42dda41e82c0bfd37d000951f695f6f193" exitCode=0
Feb 02 10:57:26 crc kubenswrapper[4901]: I0202 10:57:26.589996 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9f5e41c-41a8-48c7-becf-3c621090fec2","Type":"ContainerDied","Data":"0c9848cd87ff4f412d6e42e6aaa610b1f16bfd5c00f2746a80d26d14c71ba33b"}
Feb 02 10:57:26 crc kubenswrapper[4901]: I0202 10:57:26.590055 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9f5e41c-41a8-48c7-becf-3c621090fec2","Type":"ContainerDied","Data":"536495a3c4602f8e894018503e2b09d10315a123f7ec8a0996b1981dc90ea86b"}
Feb 02 10:57:26 crc kubenswrapper[4901]: I0202 10:57:26.590069 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9f5e41c-41a8-48c7-becf-3c621090fec2","Type":"ContainerDied","Data":"30df9816b14bad092dad59de8c4e1d42dda41e82c0bfd37d000951f695f6f193"}
Feb 02 10:57:26 crc kubenswrapper[4901]: I0202 10:57:26.590229 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-tkq95"
Feb 02 10:57:27 crc kubenswrapper[4901]: I0202 10:57:27.157344 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-tkq95"]
Feb 02 10:57:27 crc kubenswrapper[4901]: W0202 10:57:27.166798 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8b136f1_ea46_4ebc_ade6_03f996a089ce.slice/crio-238977665af7eed6862da491dd0cbb58d5f999516059e9ede779e8d7d8914242 WatchSource:0}: Error finding container 238977665af7eed6862da491dd0cbb58d5f999516059e9ede779e8d7d8914242: Status 404 returned error can't find the container with id 238977665af7eed6862da491dd0cbb58d5f999516059e9ede779e8d7d8914242
Feb 02 10:57:27 crc kubenswrapper[4901]: I0202 10:57:27.325445 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 02 10:57:27 crc kubenswrapper[4901]: I0202 10:57:27.325509 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 02 10:57:27 crc kubenswrapper[4901]: I0202 10:57:27.362696 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 02 10:57:27 crc kubenswrapper[4901]: I0202 10:57:27.392596 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 02 10:57:27 crc kubenswrapper[4901]: I0202 10:57:27.402793 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 02 10:57:27 crc kubenswrapper[4901]: I0202 10:57:27.402848 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 02 10:57:27 crc kubenswrapper[4901]: I0202 10:57:27.467866 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 02 10:57:27 crc kubenswrapper[4901]: I0202 10:57:27.481443 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 02 10:57:27 crc kubenswrapper[4901]: I0202 10:57:27.605991 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-tkq95" event={"ID":"a8b136f1-ea46-4ebc-ade6-03f996a089ce","Type":"ContainerStarted","Data":"238977665af7eed6862da491dd0cbb58d5f999516059e9ede779e8d7d8914242"}
Feb 02 10:57:27 crc kubenswrapper[4901]: I0202 10:57:27.606590 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 02 10:57:27 crc kubenswrapper[4901]: I0202 10:57:27.606617 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 02 10:57:27 crc kubenswrapper[4901]: I0202 10:57:27.606626 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 02 10:57:27 crc kubenswrapper[4901]: I0202 10:57:27.606638 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 02 10:57:28 crc kubenswrapper[4901]: I0202 10:57:28.433668 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-7dcd8c5f77-l2jdd"
Feb 02 10:57:28 crc kubenswrapper[4901]: I0202 10:57:28.518242 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-5d675845dc-qtbq4"]
Feb 02 10:57:28 crc kubenswrapper[4901]: I0202 10:57:28.710702 4901 generic.go:334] "Generic (PLEG): container finished" podID="df6764f5-9f09-4b5a-bc5f-7d212c713ae8" containerID="299510ae2a67239f6106a2c52c9143358ea95db027b4490f1c10b74d9ffaa9c6" exitCode=0
Feb 02 10:57:28 crc kubenswrapper[4901]: I0202 10:57:28.711704 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f784b5584-t7x4s" event={"ID":"df6764f5-9f09-4b5a-bc5f-7d212c713ae8","Type":"ContainerDied","Data":"299510ae2a67239f6106a2c52c9143358ea95db027b4490f1c10b74d9ffaa9c6"}
Feb 02 10:57:28 crc kubenswrapper[4901]: I0202 10:57:28.712915 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-6548466b85-x76qz"
Feb 02 10:57:28 crc kubenswrapper[4901]: I0202 10:57:28.788588 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-54947664d6-2bv6x"]
Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.096753 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7f784b5584-t7x4s"
Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.152613 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df6764f5-9f09-4b5a-bc5f-7d212c713ae8-combined-ca-bundle\") pod \"df6764f5-9f09-4b5a-bc5f-7d212c713ae8\" (UID: \"df6764f5-9f09-4b5a-bc5f-7d212c713ae8\") "
Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.152659 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/df6764f5-9f09-4b5a-bc5f-7d212c713ae8-config\") pod \"df6764f5-9f09-4b5a-bc5f-7d212c713ae8\" (UID: \"df6764f5-9f09-4b5a-bc5f-7d212c713ae8\") "
Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.152783 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/df6764f5-9f09-4b5a-bc5f-7d212c713ae8-ovndb-tls-certs\") pod \"df6764f5-9f09-4b5a-bc5f-7d212c713ae8\" (UID: \"df6764f5-9f09-4b5a-bc5f-7d212c713ae8\") "
Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.152807 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsmn4\" (UniqueName: \"kubernetes.io/projected/df6764f5-9f09-4b5a-bc5f-7d212c713ae8-kube-api-access-jsmn4\") pod \"df6764f5-9f09-4b5a-bc5f-7d212c713ae8\" (UID: \"df6764f5-9f09-4b5a-bc5f-7d212c713ae8\") "
Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.152855 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/df6764f5-9f09-4b5a-bc5f-7d212c713ae8-httpd-config\") pod \"df6764f5-9f09-4b5a-bc5f-7d212c713ae8\" (UID: \"df6764f5-9f09-4b5a-bc5f-7d212c713ae8\") "
Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.175494 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df6764f5-9f09-4b5a-bc5f-7d212c713ae8-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "df6764f5-9f09-4b5a-bc5f-7d212c713ae8" (UID: "df6764f5-9f09-4b5a-bc5f-7d212c713ae8"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
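
[annotation] The glance entries above show probe ordering during container start: the startup probe flaps unhealthy -> started, and readiness is logged with status="" until it produces its first result, because readiness probes are held back until the startup probe succeeds. A hypothetical probe pair of that shape; path, port, and thresholds are assumptions, not read from this log:

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
        "k8s.io/apimachinery/pkg/util/intstr"
    )

    func glanceProbes() (startup, readiness *corev1.Probe) {
        get := corev1.ProbeHandler{
            HTTPGet: &corev1.HTTPGetAction{Path: "/healthcheck", Port: intstr.FromInt(9292)},
        }
        // Startup probe: retried repeatedly; other probes stay disabled
        // until it reports success once.
        startup = &corev1.Probe{ProbeHandler: get, PeriodSeconds: 3, FailureThreshold: 30}
        // Readiness probe: first run happens only after startup succeeds,
        // which is why the kubelet logs the status="" transitions first.
        readiness = &corev1.Probe{ProbeHandler: get, PeriodSeconds: 5}
        return startup, readiness
    }

    func main() {
        s, r := glanceProbes()
        fmt.Println(s.FailureThreshold, r.PeriodSeconds)
    }
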
"df6764f5-9f09-4b5a-bc5f-7d212c713ae8"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.175706 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df6764f5-9f09-4b5a-bc5f-7d212c713ae8-kube-api-access-jsmn4" (OuterVolumeSpecName: "kube-api-access-jsmn4") pod "df6764f5-9f09-4b5a-bc5f-7d212c713ae8" (UID: "df6764f5-9f09-4b5a-bc5f-7d212c713ae8"). InnerVolumeSpecName "kube-api-access-jsmn4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.257090 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsmn4\" (UniqueName: \"kubernetes.io/projected/df6764f5-9f09-4b5a-bc5f-7d212c713ae8-kube-api-access-jsmn4\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.257141 4901 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/df6764f5-9f09-4b5a-bc5f-7d212c713ae8-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.269677 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df6764f5-9f09-4b5a-bc5f-7d212c713ae8-config" (OuterVolumeSpecName: "config") pod "df6764f5-9f09-4b5a-bc5f-7d212c713ae8" (UID: "df6764f5-9f09-4b5a-bc5f-7d212c713ae8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.280176 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df6764f5-9f09-4b5a-bc5f-7d212c713ae8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df6764f5-9f09-4b5a-bc5f-7d212c713ae8" (UID: "df6764f5-9f09-4b5a-bc5f-7d212c713ae8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.361242 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df6764f5-9f09-4b5a-bc5f-7d212c713ae8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.361277 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/df6764f5-9f09-4b5a-bc5f-7d212c713ae8-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.392741 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df6764f5-9f09-4b5a-bc5f-7d212c713ae8-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "df6764f5-9f09-4b5a-bc5f-7d212c713ae8" (UID: "df6764f5-9f09-4b5a-bc5f-7d212c713ae8"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.458312 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5d675845dc-qtbq4" Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.465160 4901 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/df6764f5-9f09-4b5a-bc5f-7d212c713ae8-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.467934 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-54947664d6-2bv6x" Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.567790 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdf4e65f-ea75-489a-9183-e0a9f290345e-combined-ca-bundle\") pod \"fdf4e65f-ea75-489a-9183-e0a9f290345e\" (UID: \"fdf4e65f-ea75-489a-9183-e0a9f290345e\") " Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.568052 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jl8tl\" (UniqueName: \"kubernetes.io/projected/d86c5f78-c1ef-4b91-b731-3b85c4fab45a-kube-api-access-jl8tl\") pod \"d86c5f78-c1ef-4b91-b731-3b85c4fab45a\" (UID: \"d86c5f78-c1ef-4b91-b731-3b85c4fab45a\") " Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.568189 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjwsq\" (UniqueName: \"kubernetes.io/projected/fdf4e65f-ea75-489a-9183-e0a9f290345e-kube-api-access-vjwsq\") pod \"fdf4e65f-ea75-489a-9183-e0a9f290345e\" (UID: \"fdf4e65f-ea75-489a-9183-e0a9f290345e\") " Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.568260 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fdf4e65f-ea75-489a-9183-e0a9f290345e-config-data-custom\") pod \"fdf4e65f-ea75-489a-9183-e0a9f290345e\" (UID: \"fdf4e65f-ea75-489a-9183-e0a9f290345e\") " Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.568414 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d86c5f78-c1ef-4b91-b731-3b85c4fab45a-config-data\") pod \"d86c5f78-c1ef-4b91-b731-3b85c4fab45a\" (UID: \"d86c5f78-c1ef-4b91-b731-3b85c4fab45a\") " Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.568804 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d86c5f78-c1ef-4b91-b731-3b85c4fab45a-combined-ca-bundle\") pod \"d86c5f78-c1ef-4b91-b731-3b85c4fab45a\" (UID: \"d86c5f78-c1ef-4b91-b731-3b85c4fab45a\") " Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.568878 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d86c5f78-c1ef-4b91-b731-3b85c4fab45a-config-data-custom\") pod \"d86c5f78-c1ef-4b91-b731-3b85c4fab45a\" (UID: \"d86c5f78-c1ef-4b91-b731-3b85c4fab45a\") " Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.569045 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdf4e65f-ea75-489a-9183-e0a9f290345e-config-data\") pod \"fdf4e65f-ea75-489a-9183-e0a9f290345e\" (UID: \"fdf4e65f-ea75-489a-9183-e0a9f290345e\") " Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.571805 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdf4e65f-ea75-489a-9183-e0a9f290345e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "fdf4e65f-ea75-489a-9183-e0a9f290345e" (UID: "fdf4e65f-ea75-489a-9183-e0a9f290345e"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.574214 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d86c5f78-c1ef-4b91-b731-3b85c4fab45a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d86c5f78-c1ef-4b91-b731-3b85c4fab45a" (UID: "d86c5f78-c1ef-4b91-b731-3b85c4fab45a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.579880 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdf4e65f-ea75-489a-9183-e0a9f290345e-kube-api-access-vjwsq" (OuterVolumeSpecName: "kube-api-access-vjwsq") pod "fdf4e65f-ea75-489a-9183-e0a9f290345e" (UID: "fdf4e65f-ea75-489a-9183-e0a9f290345e"). InnerVolumeSpecName "kube-api-access-vjwsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.589708 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d86c5f78-c1ef-4b91-b731-3b85c4fab45a-kube-api-access-jl8tl" (OuterVolumeSpecName: "kube-api-access-jl8tl") pod "d86c5f78-c1ef-4b91-b731-3b85c4fab45a" (UID: "d86c5f78-c1ef-4b91-b731-3b85c4fab45a"). InnerVolumeSpecName "kube-api-access-jl8tl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.624970 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d86c5f78-c1ef-4b91-b731-3b85c4fab45a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d86c5f78-c1ef-4b91-b731-3b85c4fab45a" (UID: "d86c5f78-c1ef-4b91-b731-3b85c4fab45a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.653017 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d86c5f78-c1ef-4b91-b731-3b85c4fab45a-config-data" (OuterVolumeSpecName: "config-data") pod "d86c5f78-c1ef-4b91-b731-3b85c4fab45a" (UID: "d86c5f78-c1ef-4b91-b731-3b85c4fab45a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.655684 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdf4e65f-ea75-489a-9183-e0a9f290345e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fdf4e65f-ea75-489a-9183-e0a9f290345e" (UID: "fdf4e65f-ea75-489a-9183-e0a9f290345e"). InnerVolumeSpecName "combined-ca-bundle". 
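
[annotation] Pod teardown above is a strict per-volume sequence: "UnmountVolume started" (reconciler) -> "UnmountVolume.TearDown succeeded" (operation generator) -> "Volume detached ... DevicePath \"\"" (state update). A small stdin filter for auditing that every started unmount reaches TearDown; this is our helper for reading such journals, not kubelet code:

    package main

    import (
        "bufio"
        "fmt"
        "os"
        "regexp"
    )

    // Matches the TearDown lines in this journal, e.g.
    //   UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/<uid>-config" (OuterVolumeSpecName: "config") pod "<uid>" ...
    var tearDown = regexp.MustCompile(`UnmountVolume\.TearDown succeeded for volume "[^"]+" \(OuterVolumeSpecName: "([^"]+)"\) pod "([^"]+)"`)

    func main() {
        sc := bufio.NewScanner(os.Stdin)
        sc.Buffer(make([]byte, 1<<20), 1<<20) // journal lines can be very long
        for sc.Scan() {
            if m := tearDown.FindStringSubmatch(sc.Text()); m != nil {
                fmt.Printf("pod %s: volume %q torn down\n", m[2], m[1])
            }
        }
    }

Fed with the kubelet unit's journal output, it prints one line per completed unmount, which can then be diffed against the "UnmountVolume started" set.
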
Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.674943 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdf4e65f-ea75-489a-9183-e0a9f290345e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.674995 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jl8tl\" (UniqueName: \"kubernetes.io/projected/d86c5f78-c1ef-4b91-b731-3b85c4fab45a-kube-api-access-jl8tl\") on node \"crc\" DevicePath \"\""
Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.675010 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjwsq\" (UniqueName: \"kubernetes.io/projected/fdf4e65f-ea75-489a-9183-e0a9f290345e-kube-api-access-vjwsq\") on node \"crc\" DevicePath \"\""
Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.675020 4901 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fdf4e65f-ea75-489a-9183-e0a9f290345e-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.675035 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d86c5f78-c1ef-4b91-b731-3b85c4fab45a-config-data\") on node \"crc\" DevicePath \"\""
Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.675045 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d86c5f78-c1ef-4b91-b731-3b85c4fab45a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.675056 4901 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d86c5f78-c1ef-4b91-b731-3b85c4fab45a-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.680396 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdf4e65f-ea75-489a-9183-e0a9f290345e-config-data" (OuterVolumeSpecName: "config-data") pod "fdf4e65f-ea75-489a-9183-e0a9f290345e" (UID: "fdf4e65f-ea75-489a-9183-e0a9f290345e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.728617 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-54947664d6-2bv6x"
Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.728742 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-54947664d6-2bv6x" event={"ID":"fdf4e65f-ea75-489a-9183-e0a9f290345e","Type":"ContainerDied","Data":"8b4f7de9b227163c36b2945db793c2a57e2993614c5001d3dad833f2de0b794d"}
Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.728808 4901 scope.go:117] "RemoveContainer" containerID="d13f5ad375873084238af091b4a0132499b19492f45e8db13e63420e375a668e"
Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.736171 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f784b5584-t7x4s" event={"ID":"df6764f5-9f09-4b5a-bc5f-7d212c713ae8","Type":"ContainerDied","Data":"d79d0b09b88c431e3717b77251bca08446d1af35921816de8c3521a006a2bfc9"}
Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.736199 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7f784b5584-t7x4s"
Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.753996 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5d675845dc-qtbq4"
Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.755179 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5d675845dc-qtbq4" event={"ID":"d86c5f78-c1ef-4b91-b731-3b85c4fab45a","Type":"ContainerDied","Data":"92948b8d79084b86a6841347549820d9a90bf324e36e9fcf5c0f17d6de2c2519"}
Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.759812 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-54947664d6-2bv6x"]
Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.777123 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdf4e65f-ea75-489a-9183-e0a9f290345e-config-data\") on node \"crc\" DevicePath \"\""
Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.778412 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-54947664d6-2bv6x"]
Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.786553 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7f784b5584-t7x4s"]
Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.798584 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7f784b5584-t7x4s"]
Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.799016 4901 scope.go:117] "RemoveContainer" containerID="2c8f2fdc80d2f7dcfdbc3cb555e2560c637c39277353ad397d669d7633a69a74"
Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.807159 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-5d675845dc-qtbq4"]
Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.815490 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-5d675845dc-qtbq4"]
Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.841276 4901 scope.go:117] "RemoveContainer" containerID="299510ae2a67239f6106a2c52c9143358ea95db027b4490f1c10b74d9ffaa9c6"
Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.895138 4901 scope.go:117] "RemoveContainer" containerID="2e942f5f4656497155ce266bee1565c165fd07167a35b6565aa0bd402e76e1ab"
Feb 02 10:57:29 crc kubenswrapper[4901]: I0202 10:57:29.960150 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-79dbc4cd68-ltht2"
Feb 02 10:57:30 crc kubenswrapper[4901]: I0202 10:57:30.045032 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-758cb7689c-5d5jx"]
Feb 02 10:57:30 crc kubenswrapper[4901]: I0202 10:57:30.045491 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-758cb7689c-5d5jx" podUID="9729bd04-c205-4f62-b74c-92df193ad13e" containerName="heat-engine" containerID="cri-o://64f52b2d773ad79fa8c0f163e2f84dfc92b3b8ca9a150a759620f148c4fa1561" gracePeriod=60
Feb 02 10:57:30 crc kubenswrapper[4901]: I0202 10:57:30.330479 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 02 10:57:30 crc kubenswrapper[4901]: I0202 10:57:30.330697 4901 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 02 10:57:30 crc kubenswrapper[4901]: I0202 10:57:30.487214 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 02 10:57:30 crc kubenswrapper[4901]: I0202 10:57:30.487664 4901 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 02 10:57:30 crc kubenswrapper[4901]: I0202 10:57:30.491276 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 02 10:57:30 crc kubenswrapper[4901]: I0202 10:57:30.794424 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 02 10:57:31 crc kubenswrapper[4901]: I0202 10:57:31.690409 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d86c5f78-c1ef-4b91-b731-3b85c4fab45a" path="/var/lib/kubelet/pods/d86c5f78-c1ef-4b91-b731-3b85c4fab45a/volumes"
Feb 02 10:57:31 crc kubenswrapper[4901]: I0202 10:57:31.691000 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df6764f5-9f09-4b5a-bc5f-7d212c713ae8" path="/var/lib/kubelet/pods/df6764f5-9f09-4b5a-bc5f-7d212c713ae8/volumes"
Feb 02 10:57:31 crc kubenswrapper[4901]: I0202 10:57:31.691562 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdf4e65f-ea75-489a-9183-e0a9f290345e" path="/var/lib/kubelet/pods/fdf4e65f-ea75-489a-9183-e0a9f290345e/volumes"
Feb 02 10:57:32 crc kubenswrapper[4901]: I0202 10:57:32.824745 4901 generic.go:334] "Generic (PLEG): container finished" podID="d9f5e41c-41a8-48c7-becf-3c621090fec2" containerID="fa9fc042c0e550fed3d6ac593a9f5637942cc659c391f62fdf644d0fc9b6339f" exitCode=0
Feb 02 10:57:32 crc kubenswrapper[4901]: I0202 10:57:32.825086 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9f5e41c-41a8-48c7-becf-3c621090fec2","Type":"ContainerDied","Data":"fa9fc042c0e550fed3d6ac593a9f5637942cc659c391f62fdf644d0fc9b6339f"}
Feb 02 10:57:33 crc kubenswrapper[4901]: E0202 10:57:33.185701 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="64f52b2d773ad79fa8c0f163e2f84dfc92b3b8ca9a150a759620f148c4fa1561" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"]
Feb 02 10:57:33 crc kubenswrapper[4901]: E0202 10:57:33.187051 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="64f52b2d773ad79fa8c0f163e2f84dfc92b3b8ca9a150a759620f148c4fa1561" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"]
Feb 02 10:57:33 crc kubenswrapper[4901]: E0202 10:57:33.188258 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="64f52b2d773ad79fa8c0f163e2f84dfc92b3b8ca9a150a759620f148c4fa1561" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"]
Feb 02 10:57:33 crc kubenswrapper[4901]: E0202 10:57:33.188301 4901 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-758cb7689c-5d5jx" podUID="9729bd04-c205-4f62-b74c-92df193ad13e" containerName="heat-engine"
Feb 02 10:57:38 crc kubenswrapper[4901]: I0202 10:57:38.748274 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
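
[annotation] The four ExecSync errors above are a readiness probe racing container shutdown: heat-engine was given a 60s grace period at 10:57:30, and once the container is stopping CRI-O refuses to register new exec PIDs, so every probe attempt fails until the pod is gone. The probe command is verbatim from the log; the surrounding definition is a reconstruction (period and thresholds assumed):

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
    )

    // Readiness probe of the shape that produced the ExecSync calls above:
    // pgrep for a heat-engine process (the "-r DRST" run-state filter is
    // copied from the logged cmd, not something we chose).
    func heatEngineReadiness() *corev1.Probe {
        return &corev1.Probe{
            ProbeHandler: corev1.ProbeHandler{
                Exec: &corev1.ExecAction{
                    Command: []string{"/usr/bin/pgrep", "-r", "DRST", "heat-engine"},
                },
            },
            PeriodSeconds: 5, // assumption
        }
    }

    func main() {
        fmt.Println(heatEngineReadiness().Exec.Command)
    }
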
Feb 02 10:57:38 crc kubenswrapper[4901]: I0202 10:57:38.834776 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9f5e41c-41a8-48c7-becf-3c621090fec2-log-httpd\") pod \"d9f5e41c-41a8-48c7-becf-3c621090fec2\" (UID: \"d9f5e41c-41a8-48c7-becf-3c621090fec2\") "
Feb 02 10:57:38 crc kubenswrapper[4901]: I0202 10:57:38.834862 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9f5e41c-41a8-48c7-becf-3c621090fec2-combined-ca-bundle\") pod \"d9f5e41c-41a8-48c7-becf-3c621090fec2\" (UID: \"d9f5e41c-41a8-48c7-becf-3c621090fec2\") "
Feb 02 10:57:38 crc kubenswrapper[4901]: I0202 10:57:38.834917 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5k7x\" (UniqueName: \"kubernetes.io/projected/d9f5e41c-41a8-48c7-becf-3c621090fec2-kube-api-access-f5k7x\") pod \"d9f5e41c-41a8-48c7-becf-3c621090fec2\" (UID: \"d9f5e41c-41a8-48c7-becf-3c621090fec2\") "
Feb 02 10:57:38 crc kubenswrapper[4901]: I0202 10:57:38.834945 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9f5e41c-41a8-48c7-becf-3c621090fec2-scripts\") pod \"d9f5e41c-41a8-48c7-becf-3c621090fec2\" (UID: \"d9f5e41c-41a8-48c7-becf-3c621090fec2\") "
Feb 02 10:57:38 crc kubenswrapper[4901]: I0202 10:57:38.835091 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9f5e41c-41a8-48c7-becf-3c621090fec2-run-httpd\") pod \"d9f5e41c-41a8-48c7-becf-3c621090fec2\" (UID: \"d9f5e41c-41a8-48c7-becf-3c621090fec2\") "
Feb 02 10:57:38 crc kubenswrapper[4901]: I0202 10:57:38.835147 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d9f5e41c-41a8-48c7-becf-3c621090fec2-sg-core-conf-yaml\") pod \"d9f5e41c-41a8-48c7-becf-3c621090fec2\" (UID: \"d9f5e41c-41a8-48c7-becf-3c621090fec2\") "
Feb 02 10:57:38 crc kubenswrapper[4901]: I0202 10:57:38.835166 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9f5e41c-41a8-48c7-becf-3c621090fec2-config-data\") pod \"d9f5e41c-41a8-48c7-becf-3c621090fec2\" (UID: \"d9f5e41c-41a8-48c7-becf-3c621090fec2\") "
Feb 02 10:57:38 crc kubenswrapper[4901]: I0202 10:57:38.835354 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9f5e41c-41a8-48c7-becf-3c621090fec2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d9f5e41c-41a8-48c7-becf-3c621090fec2" (UID: "d9f5e41c-41a8-48c7-becf-3c621090fec2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 10:57:38 crc kubenswrapper[4901]: I0202 10:57:38.836228 4901 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9f5e41c-41a8-48c7-becf-3c621090fec2-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 02 10:57:38 crc kubenswrapper[4901]: I0202 10:57:38.843070 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9f5e41c-41a8-48c7-becf-3c621090fec2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d9f5e41c-41a8-48c7-becf-3c621090fec2" (UID: "d9f5e41c-41a8-48c7-becf-3c621090fec2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 10:57:38 crc kubenswrapper[4901]: I0202 10:57:38.850700 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9f5e41c-41a8-48c7-becf-3c621090fec2-scripts" (OuterVolumeSpecName: "scripts") pod "d9f5e41c-41a8-48c7-becf-3c621090fec2" (UID: "d9f5e41c-41a8-48c7-becf-3c621090fec2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:57:38 crc kubenswrapper[4901]: I0202 10:57:38.856935 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9f5e41c-41a8-48c7-becf-3c621090fec2-kube-api-access-f5k7x" (OuterVolumeSpecName: "kube-api-access-f5k7x") pod "d9f5e41c-41a8-48c7-becf-3c621090fec2" (UID: "d9f5e41c-41a8-48c7-becf-3c621090fec2"). InnerVolumeSpecName "kube-api-access-f5k7x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:57:38 crc kubenswrapper[4901]: I0202 10:57:38.903117 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9f5e41c-41a8-48c7-becf-3c621090fec2","Type":"ContainerDied","Data":"6c00f1d68e51fc5afa1ad8ba8785fda94aa0eacec69771c9d042ec25985b3fae"}
Feb 02 10:57:38 crc kubenswrapper[4901]: I0202 10:57:38.903185 4901 scope.go:117] "RemoveContainer" containerID="0c9848cd87ff4f412d6e42e6aaa610b1f16bfd5c00f2746a80d26d14c71ba33b"
Feb 02 10:57:38 crc kubenswrapper[4901]: I0202 10:57:38.903198 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 02 10:57:38 crc kubenswrapper[4901]: I0202 10:57:38.938470 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5k7x\" (UniqueName: \"kubernetes.io/projected/d9f5e41c-41a8-48c7-becf-3c621090fec2-kube-api-access-f5k7x\") on node \"crc\" DevicePath \"\""
Feb 02 10:57:38 crc kubenswrapper[4901]: I0202 10:57:38.938519 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9f5e41c-41a8-48c7-becf-3c621090fec2-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 10:57:38 crc kubenswrapper[4901]: I0202 10:57:38.938533 4901 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9f5e41c-41a8-48c7-becf-3c621090fec2-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 02 10:57:38 crc kubenswrapper[4901]: I0202 10:57:38.967195 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9f5e41c-41a8-48c7-becf-3c621090fec2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d9f5e41c-41a8-48c7-becf-3c621090fec2" (UID: "d9f5e41c-41a8-48c7-becf-3c621090fec2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.011734 4901 scope.go:117] "RemoveContainer" containerID="536495a3c4602f8e894018503e2b09d10315a123f7ec8a0996b1981dc90ea86b"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.035023 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9f5e41c-41a8-48c7-becf-3c621090fec2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d9f5e41c-41a8-48c7-becf-3c621090fec2" (UID: "d9f5e41c-41a8-48c7-becf-3c621090fec2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.039671 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9f5e41c-41a8-48c7-becf-3c621090fec2-config-data" (OuterVolumeSpecName: "config-data") pod "d9f5e41c-41a8-48c7-becf-3c621090fec2" (UID: "d9f5e41c-41a8-48c7-becf-3c621090fec2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.041009 4901 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d9f5e41c-41a8-48c7-becf-3c621090fec2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.041044 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9f5e41c-41a8-48c7-becf-3c621090fec2-config-data\") on node \"crc\" DevicePath \"\""
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.041058 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9f5e41c-41a8-48c7-becf-3c621090fec2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.052548 4901 scope.go:117] "RemoveContainer" containerID="30df9816b14bad092dad59de8c4e1d42dda41e82c0bfd37d000951f695f6f193"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.089728 4901 scope.go:117] "RemoveContainer" containerID="fa9fc042c0e550fed3d6ac593a9f5637942cc659c391f62fdf644d0fc9b6339f"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.245000 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.271703 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.291731 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 02 10:57:39 crc kubenswrapper[4901]: E0202 10:57:39.292198 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9f5e41c-41a8-48c7-becf-3c621090fec2" containerName="sg-core"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.292212 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9f5e41c-41a8-48c7-becf-3c621090fec2" containerName="sg-core"
Feb 02 10:57:39 crc kubenswrapper[4901]: E0202 10:57:39.292242 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdf4e65f-ea75-489a-9183-e0a9f290345e" containerName="heat-api"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.292248 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdf4e65f-ea75-489a-9183-e0a9f290345e" containerName="heat-api"
Feb 02 10:57:39 crc kubenswrapper[4901]: E0202 10:57:39.292264 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdf4e65f-ea75-489a-9183-e0a9f290345e" containerName="heat-api"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.292270 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdf4e65f-ea75-489a-9183-e0a9f290345e" containerName="heat-api"
Feb 02 10:57:39 crc kubenswrapper[4901]: E0202 10:57:39.292280 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d86c5f78-c1ef-4b91-b731-3b85c4fab45a" containerName="heat-cfnapi"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.292288 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="d86c5f78-c1ef-4b91-b731-3b85c4fab45a" containerName="heat-cfnapi"
Feb 02 10:57:39 crc kubenswrapper[4901]: E0202 10:57:39.292300 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9f5e41c-41a8-48c7-becf-3c621090fec2" containerName="proxy-httpd"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.292306 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9f5e41c-41a8-48c7-becf-3c621090fec2" containerName="proxy-httpd"
Feb 02 10:57:39 crc kubenswrapper[4901]: E0202 10:57:39.292314 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9f5e41c-41a8-48c7-becf-3c621090fec2" containerName="ceilometer-notification-agent"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.292320 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9f5e41c-41a8-48c7-becf-3c621090fec2" containerName="ceilometer-notification-agent"
Feb 02 10:57:39 crc kubenswrapper[4901]: E0202 10:57:39.292328 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d86c5f78-c1ef-4b91-b731-3b85c4fab45a" containerName="heat-cfnapi"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.292334 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="d86c5f78-c1ef-4b91-b731-3b85c4fab45a" containerName="heat-cfnapi"
Feb 02 10:57:39 crc kubenswrapper[4901]: E0202 10:57:39.292345 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9f5e41c-41a8-48c7-becf-3c621090fec2" containerName="ceilometer-central-agent"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.292352 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9f5e41c-41a8-48c7-becf-3c621090fec2" containerName="ceilometer-central-agent"
Feb 02 10:57:39 crc kubenswrapper[4901]: E0202 10:57:39.292364 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df6764f5-9f09-4b5a-bc5f-7d212c713ae8" containerName="neutron-httpd"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.292370 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="df6764f5-9f09-4b5a-bc5f-7d212c713ae8" containerName="neutron-httpd"
Feb 02 10:57:39 crc kubenswrapper[4901]: E0202 10:57:39.292386 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df6764f5-9f09-4b5a-bc5f-7d212c713ae8" containerName="neutron-api"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.292393 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="df6764f5-9f09-4b5a-bc5f-7d212c713ae8" containerName="neutron-api"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.292557 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9f5e41c-41a8-48c7-becf-3c621090fec2" containerName="sg-core"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.292588 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdf4e65f-ea75-489a-9183-e0a9f290345e" containerName="heat-api"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.292599 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9f5e41c-41a8-48c7-becf-3c621090fec2" containerName="proxy-httpd"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.292611 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9f5e41c-41a8-48c7-becf-3c621090fec2" containerName="ceilometer-central-agent"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.292621 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="d86c5f78-c1ef-4b91-b731-3b85c4fab45a" containerName="heat-cfnapi"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.292631 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdf4e65f-ea75-489a-9183-e0a9f290345e" containerName="heat-api"
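
[annotation] The cpu_manager/state_mem and memory_manager entries around this point are checkpoint hygiene: ceilometer-0 was deleted and re-added under a new UID, so the kubelet purges the CPU and memory assignments recorded for the old containers before admitting the new pod. The CPU manager keeps this state in a checkpoint file, /var/lib/kubelet/cpu_manager_state; a minimal reader sketch (field set trimmed, key names assumed from the upstream checkpoint format; the default "none" policy records no per-container sets):

    package main

    import (
        "encoding/json"
        "fmt"
        "os"
    )

    type cpuManagerState struct {
        PolicyName    string                       `json:"policyName"`
        DefaultCPUSet string                       `json:"defaultCpuSet"`
        Entries       map[string]map[string]string `json:"entries,omitempty"` // podUID -> container -> cpuset
    }

    func main() {
        raw, err := os.ReadFile("/var/lib/kubelet/cpu_manager_state")
        if err != nil {
            fmt.Fprintln(os.Stderr, err)
            return
        }
        var st cpuManagerState
        if err := json.Unmarshal(raw, &st); err != nil {
            fmt.Fprintln(os.Stderr, err)
            return
        }
        fmt.Printf("policy=%s defaultCPUSet=%q assignments=%d\n", st.PolicyName, st.DefaultCPUSet, len(st.Entries))
    }
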
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.292640 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="d86c5f78-c1ef-4b91-b731-3b85c4fab45a" containerName="heat-cfnapi"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.292654 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="df6764f5-9f09-4b5a-bc5f-7d212c713ae8" containerName="neutron-api"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.292664 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="df6764f5-9f09-4b5a-bc5f-7d212c713ae8" containerName="neutron-httpd"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.292674 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9f5e41c-41a8-48c7-becf-3c621090fec2" containerName="ceilometer-notification-agent"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.294296 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.304198 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.304549 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.310047 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.449097 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a4362e1-62a8-4e64-8b18-3d251360de00-scripts\") pod \"ceilometer-0\" (UID: \"3a4362e1-62a8-4e64-8b18-3d251360de00\") " pod="openstack/ceilometer-0"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.449154 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3a4362e1-62a8-4e64-8b18-3d251360de00-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3a4362e1-62a8-4e64-8b18-3d251360de00\") " pod="openstack/ceilometer-0"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.449181 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a4362e1-62a8-4e64-8b18-3d251360de00-run-httpd\") pod \"ceilometer-0\" (UID: \"3a4362e1-62a8-4e64-8b18-3d251360de00\") " pod="openstack/ceilometer-0"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.449210 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a4362e1-62a8-4e64-8b18-3d251360de00-config-data\") pod \"ceilometer-0\" (UID: \"3a4362e1-62a8-4e64-8b18-3d251360de00\") " pod="openstack/ceilometer-0"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.449236 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw8wx\" (UniqueName: \"kubernetes.io/projected/3a4362e1-62a8-4e64-8b18-3d251360de00-kube-api-access-gw8wx\") pod \"ceilometer-0\" (UID: \"3a4362e1-62a8-4e64-8b18-3d251360de00\") " pod="openstack/ceilometer-0"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.449310 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a4362e1-62a8-4e64-8b18-3d251360de00-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3a4362e1-62a8-4e64-8b18-3d251360de00\") " pod="openstack/ceilometer-0"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.449355 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a4362e1-62a8-4e64-8b18-3d251360de00-log-httpd\") pod \"ceilometer-0\" (UID: \"3a4362e1-62a8-4e64-8b18-3d251360de00\") " pod="openstack/ceilometer-0"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.551360 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gw8wx\" (UniqueName: \"kubernetes.io/projected/3a4362e1-62a8-4e64-8b18-3d251360de00-kube-api-access-gw8wx\") pod \"ceilometer-0\" (UID: \"3a4362e1-62a8-4e64-8b18-3d251360de00\") " pod="openstack/ceilometer-0"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.551476 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a4362e1-62a8-4e64-8b18-3d251360de00-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3a4362e1-62a8-4e64-8b18-3d251360de00\") " pod="openstack/ceilometer-0"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.551525 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a4362e1-62a8-4e64-8b18-3d251360de00-log-httpd\") pod \"ceilometer-0\" (UID: \"3a4362e1-62a8-4e64-8b18-3d251360de00\") " pod="openstack/ceilometer-0"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.551592 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a4362e1-62a8-4e64-8b18-3d251360de00-scripts\") pod \"ceilometer-0\" (UID: \"3a4362e1-62a8-4e64-8b18-3d251360de00\") " pod="openstack/ceilometer-0"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.551608 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3a4362e1-62a8-4e64-8b18-3d251360de00-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3a4362e1-62a8-4e64-8b18-3d251360de00\") " pod="openstack/ceilometer-0"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.551628 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a4362e1-62a8-4e64-8b18-3d251360de00-run-httpd\") pod \"ceilometer-0\" (UID: \"3a4362e1-62a8-4e64-8b18-3d251360de00\") " pod="openstack/ceilometer-0"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.551651 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a4362e1-62a8-4e64-8b18-3d251360de00-config-data\") pod \"ceilometer-0\" (UID: \"3a4362e1-62a8-4e64-8b18-3d251360de00\") " pod="openstack/ceilometer-0"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.552292 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a4362e1-62a8-4e64-8b18-3d251360de00-run-httpd\") pod \"ceilometer-0\" (UID: \"3a4362e1-62a8-4e64-8b18-3d251360de00\") " pod="openstack/ceilometer-0"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.552601 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a4362e1-62a8-4e64-8b18-3d251360de00-log-httpd\") pod \"ceilometer-0\" (UID: \"3a4362e1-62a8-4e64-8b18-3d251360de00\") " pod="openstack/ceilometer-0"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.556586 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a4362e1-62a8-4e64-8b18-3d251360de00-scripts\") pod \"ceilometer-0\" (UID: \"3a4362e1-62a8-4e64-8b18-3d251360de00\") " pod="openstack/ceilometer-0"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.557517 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3a4362e1-62a8-4e64-8b18-3d251360de00-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3a4362e1-62a8-4e64-8b18-3d251360de00\") " pod="openstack/ceilometer-0"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.568012 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a4362e1-62a8-4e64-8b18-3d251360de00-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3a4362e1-62a8-4e64-8b18-3d251360de00\") " pod="openstack/ceilometer-0"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.568189 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a4362e1-62a8-4e64-8b18-3d251360de00-config-data\") pod \"ceilometer-0\" (UID: \"3a4362e1-62a8-4e64-8b18-3d251360de00\") " pod="openstack/ceilometer-0"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.571019 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw8wx\" (UniqueName: \"kubernetes.io/projected/3a4362e1-62a8-4e64-8b18-3d251360de00-kube-api-access-gw8wx\") pod \"ceilometer-0\" (UID: \"3a4362e1-62a8-4e64-8b18-3d251360de00\") " pod="openstack/ceilometer-0"
Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.623768 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.695139 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9f5e41c-41a8-48c7-becf-3c621090fec2" path="/var/lib/kubelet/pods/d9f5e41c-41a8-48c7-becf-3c621090fec2/volumes" Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.934637 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-tkq95" event={"ID":"a8b136f1-ea46-4ebc-ade6-03f996a089ce","Type":"ContainerStarted","Data":"a2a7d78b63e56ffd1ba178e49403ee5ef5281be5b212d55719d95fa6b95e1047"} Feb 02 10:57:39 crc kubenswrapper[4901]: I0202 10:57:39.953642 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-tkq95" podStartSLOduration=2.53236501 podStartE2EDuration="13.953625914s" podCreationTimestamp="2026-02-02 10:57:26 +0000 UTC" firstStartedPulling="2026-02-02 10:57:27.171435258 +0000 UTC m=+1134.189775354" lastFinishedPulling="2026-02-02 10:57:38.592696162 +0000 UTC m=+1145.611036258" observedRunningTime="2026-02-02 10:57:39.950464704 +0000 UTC m=+1146.968804800" watchObservedRunningTime="2026-02-02 10:57:39.953625914 +0000 UTC m=+1146.971966010" Feb 02 10:57:40 crc kubenswrapper[4901]: I0202 10:57:40.111093 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:57:40 crc kubenswrapper[4901]: I0202 10:57:40.948778 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a4362e1-62a8-4e64-8b18-3d251360de00","Type":"ContainerStarted","Data":"10b2f585291c76ac994f2b7b42c64b19893b4959d7a983a8e5c77ee847e978b3"} Feb 02 10:57:40 crc kubenswrapper[4901]: I0202 10:57:40.949180 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a4362e1-62a8-4e64-8b18-3d251360de00","Type":"ContainerStarted","Data":"bc72430319032e3a7038d82296742639b49adecb85c39ab73cd3f89a2266ddda"} Feb 02 10:57:41 crc kubenswrapper[4901]: I0202 10:57:41.061173 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:57:41 crc kubenswrapper[4901]: I0202 10:57:41.974759 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a4362e1-62a8-4e64-8b18-3d251360de00","Type":"ContainerStarted","Data":"abba16204cde7fdb1c3cab757f25baa00f6068ba1443586f80e59c15e060d5cf"} Feb 02 10:57:42 crc kubenswrapper[4901]: I0202 10:57:42.508512 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-758cb7689c-5d5jx" Feb 02 10:57:42 crc kubenswrapper[4901]: I0202 10:57:42.630054 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9729bd04-c205-4f62-b74c-92df193ad13e-combined-ca-bundle\") pod \"9729bd04-c205-4f62-b74c-92df193ad13e\" (UID: \"9729bd04-c205-4f62-b74c-92df193ad13e\") " Feb 02 10:57:42 crc kubenswrapper[4901]: I0202 10:57:42.630291 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9729bd04-c205-4f62-b74c-92df193ad13e-config-data\") pod \"9729bd04-c205-4f62-b74c-92df193ad13e\" (UID: \"9729bd04-c205-4f62-b74c-92df193ad13e\") " Feb 02 10:57:42 crc kubenswrapper[4901]: I0202 10:57:42.630342 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7qnr\" (UniqueName: \"kubernetes.io/projected/9729bd04-c205-4f62-b74c-92df193ad13e-kube-api-access-p7qnr\") pod \"9729bd04-c205-4f62-b74c-92df193ad13e\" (UID: \"9729bd04-c205-4f62-b74c-92df193ad13e\") " Feb 02 10:57:42 crc kubenswrapper[4901]: I0202 10:57:42.630411 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9729bd04-c205-4f62-b74c-92df193ad13e-config-data-custom\") pod \"9729bd04-c205-4f62-b74c-92df193ad13e\" (UID: \"9729bd04-c205-4f62-b74c-92df193ad13e\") " Feb 02 10:57:42 crc kubenswrapper[4901]: I0202 10:57:42.640319 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9729bd04-c205-4f62-b74c-92df193ad13e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9729bd04-c205-4f62-b74c-92df193ad13e" (UID: "9729bd04-c205-4f62-b74c-92df193ad13e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:57:42 crc kubenswrapper[4901]: I0202 10:57:42.643767 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9729bd04-c205-4f62-b74c-92df193ad13e-kube-api-access-p7qnr" (OuterVolumeSpecName: "kube-api-access-p7qnr") pod "9729bd04-c205-4f62-b74c-92df193ad13e" (UID: "9729bd04-c205-4f62-b74c-92df193ad13e"). InnerVolumeSpecName "kube-api-access-p7qnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:57:42 crc kubenswrapper[4901]: I0202 10:57:42.681848 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9729bd04-c205-4f62-b74c-92df193ad13e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9729bd04-c205-4f62-b74c-92df193ad13e" (UID: "9729bd04-c205-4f62-b74c-92df193ad13e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:57:42 crc kubenswrapper[4901]: I0202 10:57:42.703595 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9729bd04-c205-4f62-b74c-92df193ad13e-config-data" (OuterVolumeSpecName: "config-data") pod "9729bd04-c205-4f62-b74c-92df193ad13e" (UID: "9729bd04-c205-4f62-b74c-92df193ad13e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:57:42 crc kubenswrapper[4901]: I0202 10:57:42.733709 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9729bd04-c205-4f62-b74c-92df193ad13e-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:42 crc kubenswrapper[4901]: I0202 10:57:42.733748 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7qnr\" (UniqueName: \"kubernetes.io/projected/9729bd04-c205-4f62-b74c-92df193ad13e-kube-api-access-p7qnr\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:42 crc kubenswrapper[4901]: I0202 10:57:42.733764 4901 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9729bd04-c205-4f62-b74c-92df193ad13e-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:42 crc kubenswrapper[4901]: I0202 10:57:42.733776 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9729bd04-c205-4f62-b74c-92df193ad13e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:42 crc kubenswrapper[4901]: I0202 10:57:42.986971 4901 generic.go:334] "Generic (PLEG): container finished" podID="9729bd04-c205-4f62-b74c-92df193ad13e" containerID="64f52b2d773ad79fa8c0f163e2f84dfc92b3b8ca9a150a759620f148c4fa1561" exitCode=0 Feb 02 10:57:42 crc kubenswrapper[4901]: I0202 10:57:42.987049 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-758cb7689c-5d5jx" event={"ID":"9729bd04-c205-4f62-b74c-92df193ad13e","Type":"ContainerDied","Data":"64f52b2d773ad79fa8c0f163e2f84dfc92b3b8ca9a150a759620f148c4fa1561"} Feb 02 10:57:42 crc kubenswrapper[4901]: I0202 10:57:42.987094 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-758cb7689c-5d5jx" event={"ID":"9729bd04-c205-4f62-b74c-92df193ad13e","Type":"ContainerDied","Data":"87bec3ee032cb978e46d257e4f32d4f408e321ee45f4db3e888e49e9bf7663de"} Feb 02 10:57:42 crc kubenswrapper[4901]: I0202 10:57:42.987117 4901 scope.go:117] "RemoveContainer" containerID="64f52b2d773ad79fa8c0f163e2f84dfc92b3b8ca9a150a759620f148c4fa1561" Feb 02 10:57:42 crc kubenswrapper[4901]: I0202 10:57:42.987266 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-758cb7689c-5d5jx" Feb 02 10:57:43 crc kubenswrapper[4901]: I0202 10:57:43.016531 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a4362e1-62a8-4e64-8b18-3d251360de00","Type":"ContainerStarted","Data":"87efee1744d627043127ba1085f0e9a6aec17f8da4af7c436089029947f3d949"} Feb 02 10:57:43 crc kubenswrapper[4901]: I0202 10:57:43.031991 4901 scope.go:117] "RemoveContainer" containerID="64f52b2d773ad79fa8c0f163e2f84dfc92b3b8ca9a150a759620f148c4fa1561" Feb 02 10:57:43 crc kubenswrapper[4901]: E0202 10:57:43.035253 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64f52b2d773ad79fa8c0f163e2f84dfc92b3b8ca9a150a759620f148c4fa1561\": container with ID starting with 64f52b2d773ad79fa8c0f163e2f84dfc92b3b8ca9a150a759620f148c4fa1561 not found: ID does not exist" containerID="64f52b2d773ad79fa8c0f163e2f84dfc92b3b8ca9a150a759620f148c4fa1561" Feb 02 10:57:43 crc kubenswrapper[4901]: I0202 10:57:43.035307 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64f52b2d773ad79fa8c0f163e2f84dfc92b3b8ca9a150a759620f148c4fa1561"} err="failed to get container status \"64f52b2d773ad79fa8c0f163e2f84dfc92b3b8ca9a150a759620f148c4fa1561\": rpc error: code = NotFound desc = could not find container \"64f52b2d773ad79fa8c0f163e2f84dfc92b3b8ca9a150a759620f148c4fa1561\": container with ID starting with 64f52b2d773ad79fa8c0f163e2f84dfc92b3b8ca9a150a759620f148c4fa1561 not found: ID does not exist" Feb 02 10:57:43 crc kubenswrapper[4901]: I0202 10:57:43.048905 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-758cb7689c-5d5jx"] Feb 02 10:57:43 crc kubenswrapper[4901]: I0202 10:57:43.057335 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-758cb7689c-5d5jx"] Feb 02 10:57:43 crc kubenswrapper[4901]: I0202 10:57:43.701360 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9729bd04-c205-4f62-b74c-92df193ad13e" path="/var/lib/kubelet/pods/9729bd04-c205-4f62-b74c-92df193ad13e/volumes" Feb 02 10:57:45 crc kubenswrapper[4901]: I0202 10:57:45.041298 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a4362e1-62a8-4e64-8b18-3d251360de00","Type":"ContainerStarted","Data":"2f9655006ab9de664db75fa658bb5324241c0ecc880249bac4d8a019f3dc7053"} Feb 02 10:57:45 crc kubenswrapper[4901]: I0202 10:57:45.041602 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3a4362e1-62a8-4e64-8b18-3d251360de00" containerName="ceilometer-central-agent" containerID="cri-o://10b2f585291c76ac994f2b7b42c64b19893b4959d7a983a8e5c77ee847e978b3" gracePeriod=30 Feb 02 10:57:45 crc kubenswrapper[4901]: I0202 10:57:45.041842 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3a4362e1-62a8-4e64-8b18-3d251360de00" containerName="sg-core" containerID="cri-o://87efee1744d627043127ba1085f0e9a6aec17f8da4af7c436089029947f3d949" gracePeriod=30 Feb 02 10:57:45 crc kubenswrapper[4901]: I0202 10:57:45.041857 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3a4362e1-62a8-4e64-8b18-3d251360de00" containerName="proxy-httpd" containerID="cri-o://2f9655006ab9de664db75fa658bb5324241c0ecc880249bac4d8a019f3dc7053" gracePeriod=30 Feb 02 10:57:45 crc 
kubenswrapper[4901]: I0202 10:57:45.042118 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 02 10:57:45 crc kubenswrapper[4901]: I0202 10:57:45.041842 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3a4362e1-62a8-4e64-8b18-3d251360de00" containerName="ceilometer-notification-agent" containerID="cri-o://abba16204cde7fdb1c3cab757f25baa00f6068ba1443586f80e59c15e060d5cf" gracePeriod=30 Feb 02 10:57:45 crc kubenswrapper[4901]: I0202 10:57:45.080208 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.602710016 podStartE2EDuration="6.080184384s" podCreationTimestamp="2026-02-02 10:57:39 +0000 UTC" firstStartedPulling="2026-02-02 10:57:40.138070293 +0000 UTC m=+1147.156410389" lastFinishedPulling="2026-02-02 10:57:44.615544661 +0000 UTC m=+1151.633884757" observedRunningTime="2026-02-02 10:57:45.066729768 +0000 UTC m=+1152.085069874" watchObservedRunningTime="2026-02-02 10:57:45.080184384 +0000 UTC m=+1152.098524480" Feb 02 10:57:46 crc kubenswrapper[4901]: I0202 10:57:46.059463 4901 generic.go:334] "Generic (PLEG): container finished" podID="3a4362e1-62a8-4e64-8b18-3d251360de00" containerID="2f9655006ab9de664db75fa658bb5324241c0ecc880249bac4d8a019f3dc7053" exitCode=0 Feb 02 10:57:46 crc kubenswrapper[4901]: I0202 10:57:46.059512 4901 generic.go:334] "Generic (PLEG): container finished" podID="3a4362e1-62a8-4e64-8b18-3d251360de00" containerID="87efee1744d627043127ba1085f0e9a6aec17f8da4af7c436089029947f3d949" exitCode=2 Feb 02 10:57:46 crc kubenswrapper[4901]: I0202 10:57:46.059525 4901 generic.go:334] "Generic (PLEG): container finished" podID="3a4362e1-62a8-4e64-8b18-3d251360de00" containerID="abba16204cde7fdb1c3cab757f25baa00f6068ba1443586f80e59c15e060d5cf" exitCode=0 Feb 02 10:57:46 crc kubenswrapper[4901]: I0202 10:57:46.059573 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a4362e1-62a8-4e64-8b18-3d251360de00","Type":"ContainerDied","Data":"2f9655006ab9de664db75fa658bb5324241c0ecc880249bac4d8a019f3dc7053"} Feb 02 10:57:46 crc kubenswrapper[4901]: I0202 10:57:46.059634 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a4362e1-62a8-4e64-8b18-3d251360de00","Type":"ContainerDied","Data":"87efee1744d627043127ba1085f0e9a6aec17f8da4af7c436089029947f3d949"} Feb 02 10:57:46 crc kubenswrapper[4901]: I0202 10:57:46.059653 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a4362e1-62a8-4e64-8b18-3d251360de00","Type":"ContainerDied","Data":"abba16204cde7fdb1c3cab757f25baa00f6068ba1443586f80e59c15e060d5cf"} Feb 02 10:57:51 crc kubenswrapper[4901]: I0202 10:57:51.138281 4901 generic.go:334] "Generic (PLEG): container finished" podID="a8b136f1-ea46-4ebc-ade6-03f996a089ce" containerID="a2a7d78b63e56ffd1ba178e49403ee5ef5281be5b212d55719d95fa6b95e1047" exitCode=0 Feb 02 10:57:51 crc kubenswrapper[4901]: I0202 10:57:51.138362 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-tkq95" event={"ID":"a8b136f1-ea46-4ebc-ade6-03f996a089ce","Type":"ContainerDied","Data":"a2a7d78b63e56ffd1ba178e49403ee5ef5281be5b212d55719d95fa6b95e1047"} Feb 02 10:57:52 crc kubenswrapper[4901]: I0202 10:57:52.533552 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-tkq95" Feb 02 10:57:52 crc kubenswrapper[4901]: I0202 10:57:52.659393 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8b136f1-ea46-4ebc-ade6-03f996a089ce-config-data\") pod \"a8b136f1-ea46-4ebc-ade6-03f996a089ce\" (UID: \"a8b136f1-ea46-4ebc-ade6-03f996a089ce\") " Feb 02 10:57:52 crc kubenswrapper[4901]: I0202 10:57:52.659525 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8b136f1-ea46-4ebc-ade6-03f996a089ce-scripts\") pod \"a8b136f1-ea46-4ebc-ade6-03f996a089ce\" (UID: \"a8b136f1-ea46-4ebc-ade6-03f996a089ce\") " Feb 02 10:57:52 crc kubenswrapper[4901]: I0202 10:57:52.659615 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8b136f1-ea46-4ebc-ade6-03f996a089ce-combined-ca-bundle\") pod \"a8b136f1-ea46-4ebc-ade6-03f996a089ce\" (UID: \"a8b136f1-ea46-4ebc-ade6-03f996a089ce\") " Feb 02 10:57:52 crc kubenswrapper[4901]: I0202 10:57:52.659734 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkrx4\" (UniqueName: \"kubernetes.io/projected/a8b136f1-ea46-4ebc-ade6-03f996a089ce-kube-api-access-kkrx4\") pod \"a8b136f1-ea46-4ebc-ade6-03f996a089ce\" (UID: \"a8b136f1-ea46-4ebc-ade6-03f996a089ce\") " Feb 02 10:57:52 crc kubenswrapper[4901]: I0202 10:57:52.667805 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8b136f1-ea46-4ebc-ade6-03f996a089ce-scripts" (OuterVolumeSpecName: "scripts") pod "a8b136f1-ea46-4ebc-ade6-03f996a089ce" (UID: "a8b136f1-ea46-4ebc-ade6-03f996a089ce"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:57:52 crc kubenswrapper[4901]: I0202 10:57:52.669029 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8b136f1-ea46-4ebc-ade6-03f996a089ce-kube-api-access-kkrx4" (OuterVolumeSpecName: "kube-api-access-kkrx4") pod "a8b136f1-ea46-4ebc-ade6-03f996a089ce" (UID: "a8b136f1-ea46-4ebc-ade6-03f996a089ce"). InnerVolumeSpecName "kube-api-access-kkrx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:57:52 crc kubenswrapper[4901]: I0202 10:57:52.695659 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8b136f1-ea46-4ebc-ade6-03f996a089ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a8b136f1-ea46-4ebc-ade6-03f996a089ce" (UID: "a8b136f1-ea46-4ebc-ade6-03f996a089ce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:57:52 crc kubenswrapper[4901]: I0202 10:57:52.696025 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8b136f1-ea46-4ebc-ade6-03f996a089ce-config-data" (OuterVolumeSpecName: "config-data") pod "a8b136f1-ea46-4ebc-ade6-03f996a089ce" (UID: "a8b136f1-ea46-4ebc-ade6-03f996a089ce"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:57:52 crc kubenswrapper[4901]: I0202 10:57:52.761784 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8b136f1-ea46-4ebc-ade6-03f996a089ce-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:52 crc kubenswrapper[4901]: I0202 10:57:52.762274 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8b136f1-ea46-4ebc-ade6-03f996a089ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:52 crc kubenswrapper[4901]: I0202 10:57:52.762286 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkrx4\" (UniqueName: \"kubernetes.io/projected/a8b136f1-ea46-4ebc-ade6-03f996a089ce-kube-api-access-kkrx4\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:52 crc kubenswrapper[4901]: I0202 10:57:52.762295 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8b136f1-ea46-4ebc-ade6-03f996a089ce-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.183332 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-tkq95" Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.183531 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-tkq95" event={"ID":"a8b136f1-ea46-4ebc-ade6-03f996a089ce","Type":"ContainerDied","Data":"238977665af7eed6862da491dd0cbb58d5f999516059e9ede779e8d7d8914242"} Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.183587 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="238977665af7eed6862da491dd0cbb58d5f999516059e9ede779e8d7d8914242" Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.192383 4901 generic.go:334] "Generic (PLEG): container finished" podID="3a4362e1-62a8-4e64-8b18-3d251360de00" containerID="10b2f585291c76ac994f2b7b42c64b19893b4959d7a983a8e5c77ee847e978b3" exitCode=0 Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.192428 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a4362e1-62a8-4e64-8b18-3d251360de00","Type":"ContainerDied","Data":"10b2f585291c76ac994f2b7b42c64b19893b4959d7a983a8e5c77ee847e978b3"} Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.339740 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 02 10:57:53 crc kubenswrapper[4901]: E0202 10:57:53.340134 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9729bd04-c205-4f62-b74c-92df193ad13e" containerName="heat-engine" Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.340148 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="9729bd04-c205-4f62-b74c-92df193ad13e" containerName="heat-engine" Feb 02 10:57:53 crc kubenswrapper[4901]: E0202 10:57:53.340181 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8b136f1-ea46-4ebc-ade6-03f996a089ce" containerName="nova-cell0-conductor-db-sync" Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.340188 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8b136f1-ea46-4ebc-ade6-03f996a089ce" containerName="nova-cell0-conductor-db-sync" Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.340354 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8b136f1-ea46-4ebc-ade6-03f996a089ce" 
containerName="nova-cell0-conductor-db-sync" Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.340375 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="9729bd04-c205-4f62-b74c-92df193ad13e" containerName="heat-engine" Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.340971 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.347463 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-4tffs" Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.350247 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.375708 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.432735 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.477966 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktlj7\" (UniqueName: \"kubernetes.io/projected/5869a9a7-8b2b-443e-be13-d92f4db643f0-kube-api-access-ktlj7\") pod \"nova-cell0-conductor-0\" (UID: \"5869a9a7-8b2b-443e-be13-d92f4db643f0\") " pod="openstack/nova-cell0-conductor-0" Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.478304 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5869a9a7-8b2b-443e-be13-d92f4db643f0-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5869a9a7-8b2b-443e-be13-d92f4db643f0\") " pod="openstack/nova-cell0-conductor-0" Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.478441 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5869a9a7-8b2b-443e-be13-d92f4db643f0-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5869a9a7-8b2b-443e-be13-d92f4db643f0\") " pod="openstack/nova-cell0-conductor-0" Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.580187 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a4362e1-62a8-4e64-8b18-3d251360de00-combined-ca-bundle\") pod \"3a4362e1-62a8-4e64-8b18-3d251360de00\" (UID: \"3a4362e1-62a8-4e64-8b18-3d251360de00\") " Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.580275 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a4362e1-62a8-4e64-8b18-3d251360de00-run-httpd\") pod \"3a4362e1-62a8-4e64-8b18-3d251360de00\" (UID: \"3a4362e1-62a8-4e64-8b18-3d251360de00\") " Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.580369 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a4362e1-62a8-4e64-8b18-3d251360de00-scripts\") pod \"3a4362e1-62a8-4e64-8b18-3d251360de00\" (UID: \"3a4362e1-62a8-4e64-8b18-3d251360de00\") " Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.580399 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gw8wx\" (UniqueName: 
\"kubernetes.io/projected/3a4362e1-62a8-4e64-8b18-3d251360de00-kube-api-access-gw8wx\") pod \"3a4362e1-62a8-4e64-8b18-3d251360de00\" (UID: \"3a4362e1-62a8-4e64-8b18-3d251360de00\") " Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.580445 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a4362e1-62a8-4e64-8b18-3d251360de00-log-httpd\") pod \"3a4362e1-62a8-4e64-8b18-3d251360de00\" (UID: \"3a4362e1-62a8-4e64-8b18-3d251360de00\") " Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.580507 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3a4362e1-62a8-4e64-8b18-3d251360de00-sg-core-conf-yaml\") pod \"3a4362e1-62a8-4e64-8b18-3d251360de00\" (UID: \"3a4362e1-62a8-4e64-8b18-3d251360de00\") " Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.580537 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a4362e1-62a8-4e64-8b18-3d251360de00-config-data\") pod \"3a4362e1-62a8-4e64-8b18-3d251360de00\" (UID: \"3a4362e1-62a8-4e64-8b18-3d251360de00\") " Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.580877 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktlj7\" (UniqueName: \"kubernetes.io/projected/5869a9a7-8b2b-443e-be13-d92f4db643f0-kube-api-access-ktlj7\") pod \"nova-cell0-conductor-0\" (UID: \"5869a9a7-8b2b-443e-be13-d92f4db643f0\") " pod="openstack/nova-cell0-conductor-0" Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.580958 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5869a9a7-8b2b-443e-be13-d92f4db643f0-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5869a9a7-8b2b-443e-be13-d92f4db643f0\") " pod="openstack/nova-cell0-conductor-0" Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.580987 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5869a9a7-8b2b-443e-be13-d92f4db643f0-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5869a9a7-8b2b-443e-be13-d92f4db643f0\") " pod="openstack/nova-cell0-conductor-0" Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.581024 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a4362e1-62a8-4e64-8b18-3d251360de00-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3a4362e1-62a8-4e64-8b18-3d251360de00" (UID: "3a4362e1-62a8-4e64-8b18-3d251360de00"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.581753 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a4362e1-62a8-4e64-8b18-3d251360de00-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3a4362e1-62a8-4e64-8b18-3d251360de00" (UID: "3a4362e1-62a8-4e64-8b18-3d251360de00"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.588015 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a4362e1-62a8-4e64-8b18-3d251360de00-kube-api-access-gw8wx" (OuterVolumeSpecName: "kube-api-access-gw8wx") pod "3a4362e1-62a8-4e64-8b18-3d251360de00" (UID: "3a4362e1-62a8-4e64-8b18-3d251360de00"). InnerVolumeSpecName "kube-api-access-gw8wx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.588176 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5869a9a7-8b2b-443e-be13-d92f4db643f0-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5869a9a7-8b2b-443e-be13-d92f4db643f0\") " pod="openstack/nova-cell0-conductor-0" Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.588434 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a4362e1-62a8-4e64-8b18-3d251360de00-scripts" (OuterVolumeSpecName: "scripts") pod "3a4362e1-62a8-4e64-8b18-3d251360de00" (UID: "3a4362e1-62a8-4e64-8b18-3d251360de00"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.597281 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5869a9a7-8b2b-443e-be13-d92f4db643f0-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5869a9a7-8b2b-443e-be13-d92f4db643f0\") " pod="openstack/nova-cell0-conductor-0" Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.606225 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktlj7\" (UniqueName: \"kubernetes.io/projected/5869a9a7-8b2b-443e-be13-d92f4db643f0-kube-api-access-ktlj7\") pod \"nova-cell0-conductor-0\" (UID: \"5869a9a7-8b2b-443e-be13-d92f4db643f0\") " pod="openstack/nova-cell0-conductor-0" Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.617815 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a4362e1-62a8-4e64-8b18-3d251360de00-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3a4362e1-62a8-4e64-8b18-3d251360de00" (UID: "3a4362e1-62a8-4e64-8b18-3d251360de00"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.658836 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.685083 4901 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3a4362e1-62a8-4e64-8b18-3d251360de00-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.685136 4901 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a4362e1-62a8-4e64-8b18-3d251360de00-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.685150 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a4362e1-62a8-4e64-8b18-3d251360de00-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.685165 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gw8wx\" (UniqueName: \"kubernetes.io/projected/3a4362e1-62a8-4e64-8b18-3d251360de00-kube-api-access-gw8wx\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.685183 4901 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a4362e1-62a8-4e64-8b18-3d251360de00-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.685180 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a4362e1-62a8-4e64-8b18-3d251360de00-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3a4362e1-62a8-4e64-8b18-3d251360de00" (UID: "3a4362e1-62a8-4e64-8b18-3d251360de00"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.741772 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a4362e1-62a8-4e64-8b18-3d251360de00-config-data" (OuterVolumeSpecName: "config-data") pod "3a4362e1-62a8-4e64-8b18-3d251360de00" (UID: "3a4362e1-62a8-4e64-8b18-3d251360de00"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.791666 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a4362e1-62a8-4e64-8b18-3d251360de00-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:53 crc kubenswrapper[4901]: I0202 10:57:53.791707 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a4362e1-62a8-4e64-8b18-3d251360de00-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.114500 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.204326 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"5869a9a7-8b2b-443e-be13-d92f4db643f0","Type":"ContainerStarted","Data":"f5c980f80d27f3da248c5f44bea8e87343c368d493de22e91d263da2bc5a4a84"} Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.209739 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a4362e1-62a8-4e64-8b18-3d251360de00","Type":"ContainerDied","Data":"bc72430319032e3a7038d82296742639b49adecb85c39ab73cd3f89a2266ddda"} Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.209791 4901 scope.go:117] "RemoveContainer" containerID="2f9655006ab9de664db75fa658bb5324241c0ecc880249bac4d8a019f3dc7053" Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.210011 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.235105 4901 scope.go:117] "RemoveContainer" containerID="87efee1744d627043127ba1085f0e9a6aec17f8da4af7c436089029947f3d949" Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.258233 4901 scope.go:117] "RemoveContainer" containerID="abba16204cde7fdb1c3cab757f25baa00f6068ba1443586f80e59c15e060d5cf" Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.275325 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.289606 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.301116 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.301368 4901 scope.go:117] "RemoveContainer" containerID="10b2f585291c76ac994f2b7b42c64b19893b4959d7a983a8e5c77ee847e978b3" Feb 02 10:57:54 crc kubenswrapper[4901]: E0202 10:57:54.301538 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a4362e1-62a8-4e64-8b18-3d251360de00" containerName="ceilometer-notification-agent" Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.301571 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a4362e1-62a8-4e64-8b18-3d251360de00" containerName="ceilometer-notification-agent" Feb 02 10:57:54 crc kubenswrapper[4901]: E0202 10:57:54.301591 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a4362e1-62a8-4e64-8b18-3d251360de00" containerName="sg-core" Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.301597 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a4362e1-62a8-4e64-8b18-3d251360de00" containerName="sg-core" Feb 02 10:57:54 crc kubenswrapper[4901]: E0202 10:57:54.301608 4901 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a4362e1-62a8-4e64-8b18-3d251360de00" containerName="ceilometer-central-agent" Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.301615 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a4362e1-62a8-4e64-8b18-3d251360de00" containerName="ceilometer-central-agent" Feb 02 10:57:54 crc kubenswrapper[4901]: E0202 10:57:54.301626 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a4362e1-62a8-4e64-8b18-3d251360de00" containerName="proxy-httpd" Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.301632 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a4362e1-62a8-4e64-8b18-3d251360de00" containerName="proxy-httpd" Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.301820 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a4362e1-62a8-4e64-8b18-3d251360de00" containerName="sg-core" Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.301838 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a4362e1-62a8-4e64-8b18-3d251360de00" containerName="proxy-httpd" Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.301851 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a4362e1-62a8-4e64-8b18-3d251360de00" containerName="ceilometer-central-agent" Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.301859 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a4362e1-62a8-4e64-8b18-3d251360de00" containerName="ceilometer-notification-agent" Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.309895 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.312915 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.322939 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.350859 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.402039 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a-run-httpd\") pod \"ceilometer-0\" (UID: \"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a\") " pod="openstack/ceilometer-0" Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.402128 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a\") " pod="openstack/ceilometer-0" Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.402153 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn846\" (UniqueName: \"kubernetes.io/projected/1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a-kube-api-access-xn846\") pod \"ceilometer-0\" (UID: \"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a\") " pod="openstack/ceilometer-0" Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.402201 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a-scripts\") pod \"ceilometer-0\" (UID: \"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a\") " pod="openstack/ceilometer-0" Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.402262 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a\") " pod="openstack/ceilometer-0" Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.402280 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a-log-httpd\") pod \"ceilometer-0\" (UID: \"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a\") " pod="openstack/ceilometer-0" Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.402302 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a-config-data\") pod \"ceilometer-0\" (UID: \"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a\") " pod="openstack/ceilometer-0" Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.504092 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a-run-httpd\") pod \"ceilometer-0\" (UID: \"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a\") " pod="openstack/ceilometer-0" Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.504159 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a\") " pod="openstack/ceilometer-0" Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.504192 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn846\" (UniqueName: \"kubernetes.io/projected/1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a-kube-api-access-xn846\") pod \"ceilometer-0\" (UID: \"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a\") " pod="openstack/ceilometer-0" Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.504254 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a-scripts\") pod \"ceilometer-0\" (UID: \"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a\") " pod="openstack/ceilometer-0" Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.504333 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a\") " pod="openstack/ceilometer-0" Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.504356 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a-log-httpd\") pod \"ceilometer-0\" (UID: \"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a\") " pod="openstack/ceilometer-0" Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.504385 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a-config-data\") pod \"ceilometer-0\" (UID: \"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a\") " pod="openstack/ceilometer-0" Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.506404 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a-run-httpd\") pod \"ceilometer-0\" (UID: \"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a\") " pod="openstack/ceilometer-0" Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.506768 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a-log-httpd\") pod \"ceilometer-0\" (UID: \"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a\") " pod="openstack/ceilometer-0" Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.511132 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a\") " pod="openstack/ceilometer-0" Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.512771 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a\") " pod="openstack/ceilometer-0" Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.513550 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a-scripts\") pod \"ceilometer-0\" (UID: \"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a\") " pod="openstack/ceilometer-0" Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.514818 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a-config-data\") pod \"ceilometer-0\" (UID: \"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a\") " pod="openstack/ceilometer-0" Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.525995 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn846\" (UniqueName: \"kubernetes.io/projected/1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a-kube-api-access-xn846\") pod \"ceilometer-0\" (UID: \"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a\") " pod="openstack/ceilometer-0" Feb 02 10:57:54 crc kubenswrapper[4901]: I0202 10:57:54.669525 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:57:55 crc kubenswrapper[4901]: I0202 10:57:55.079537 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 02 10:57:55 crc kubenswrapper[4901]: I0202 10:57:55.140003 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:57:55 crc kubenswrapper[4901]: W0202 10:57:55.141226 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d9eeed6_b7b9_48f1_b961_7dd7bbcb861a.slice/crio-f5f6271bdcdc83db346acddc14434b7cfc0b0b172a4e3d1025e48d6cfd10a776 WatchSource:0}: Error finding container f5f6271bdcdc83db346acddc14434b7cfc0b0b172a4e3d1025e48d6cfd10a776: Status 404 returned error can't find the container with id f5f6271bdcdc83db346acddc14434b7cfc0b0b172a4e3d1025e48d6cfd10a776 Feb 02 10:57:55 crc kubenswrapper[4901]: I0202 10:57:55.219545 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"5869a9a7-8b2b-443e-be13-d92f4db643f0","Type":"ContainerStarted","Data":"c31496d630db2686222e67e58cb8c108958fb7f2a14aca41bffc205bb7cd4c16"} Feb 02 10:57:55 crc kubenswrapper[4901]: I0202 10:57:55.220346 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 02 10:57:55 crc kubenswrapper[4901]: I0202 10:57:55.221623 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a","Type":"ContainerStarted","Data":"f5f6271bdcdc83db346acddc14434b7cfc0b0b172a4e3d1025e48d6cfd10a776"} Feb 02 10:57:55 crc kubenswrapper[4901]: I0202 10:57:55.247846 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.247814828 podStartE2EDuration="2.247814828s" podCreationTimestamp="2026-02-02 10:57:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:57:55.236051194 +0000 UTC m=+1162.254391310" watchObservedRunningTime="2026-02-02 10:57:55.247814828 +0000 UTC m=+1162.266154964" Feb 02 10:57:55 crc kubenswrapper[4901]: I0202 10:57:55.693688 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a4362e1-62a8-4e64-8b18-3d251360de00" path="/var/lib/kubelet/pods/3a4362e1-62a8-4e64-8b18-3d251360de00/volumes" Feb 02 10:57:56 crc kubenswrapper[4901]: I0202 10:57:56.190076 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:57:56 crc kubenswrapper[4901]: I0202 10:57:56.236014 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a","Type":"ContainerStarted","Data":"5f23416f544e921ba709eae967c71e664a801d70a7a3d3da33530a7c7c506df9"} Feb 02 10:57:56 crc kubenswrapper[4901]: I0202 10:57:56.236151 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="5869a9a7-8b2b-443e-be13-d92f4db643f0" containerName="nova-cell0-conductor-conductor" containerID="cri-o://c31496d630db2686222e67e58cb8c108958fb7f2a14aca41bffc205bb7cd4c16" gracePeriod=30 Feb 02 10:57:57 crc kubenswrapper[4901]: I0202 10:57:57.248638 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a","Type":"ContainerStarted","Data":"75efefc3ad064b3639494fa4cc1958d7e5dd934f0f2a80ca0f39d1631ba86c4f"} Feb 02 10:57:58 crc kubenswrapper[4901]: I0202 10:57:58.263753 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a","Type":"ContainerStarted","Data":"516e167901769e398dac8690d8c3e25c683f9e64148d3c8697aadaa34c6f17d9"} Feb 02 10:58:00 crc kubenswrapper[4901]: I0202 10:58:00.286144 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a","Type":"ContainerStarted","Data":"c6bf5b021c2c64ca47005fe5e32c02d6f99926e3db8969f841245949a6bc4b37"} Feb 02 10:58:00 crc kubenswrapper[4901]: I0202 10:58:00.287079 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 02 10:58:00 crc kubenswrapper[4901]: I0202 10:58:00.286435 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a" containerName="sg-core" containerID="cri-o://516e167901769e398dac8690d8c3e25c683f9e64148d3c8697aadaa34c6f17d9" gracePeriod=30 Feb 02 10:58:00 crc kubenswrapper[4901]: I0202 10:58:00.286432 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a" containerName="ceilometer-notification-agent" containerID="cri-o://75efefc3ad064b3639494fa4cc1958d7e5dd934f0f2a80ca0f39d1631ba86c4f" gracePeriod=30 Feb 02 10:58:00 crc kubenswrapper[4901]: I0202 10:58:00.286474 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a" containerName="ceilometer-central-agent" containerID="cri-o://5f23416f544e921ba709eae967c71e664a801d70a7a3d3da33530a7c7c506df9" gracePeriod=30 Feb 02 10:58:00 crc kubenswrapper[4901]: I0202 10:58:00.286421 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a" containerName="proxy-httpd" containerID="cri-o://c6bf5b021c2c64ca47005fe5e32c02d6f99926e3db8969f841245949a6bc4b37" gracePeriod=30 Feb 02 10:58:00 crc kubenswrapper[4901]: I0202 10:58:00.325454 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.382863602 podStartE2EDuration="6.325433916s" podCreationTimestamp="2026-02-02 10:57:54 +0000 UTC" firstStartedPulling="2026-02-02 10:57:55.144783592 +0000 UTC m=+1162.163123698" lastFinishedPulling="2026-02-02 10:57:59.087353886 +0000 UTC m=+1166.105694012" observedRunningTime="2026-02-02 10:58:00.315919832 +0000 UTC m=+1167.334259928" watchObservedRunningTime="2026-02-02 10:58:00.325433916 +0000 UTC m=+1167.343774012" Feb 02 10:58:01 crc kubenswrapper[4901]: I0202 10:58:01.297974 4901 generic.go:334] "Generic (PLEG): container finished" podID="1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a" containerID="c6bf5b021c2c64ca47005fe5e32c02d6f99926e3db8969f841245949a6bc4b37" exitCode=0 Feb 02 10:58:01 crc kubenswrapper[4901]: I0202 10:58:01.298529 4901 generic.go:334] "Generic (PLEG): container finished" podID="1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a" containerID="516e167901769e398dac8690d8c3e25c683f9e64148d3c8697aadaa34c6f17d9" exitCode=2 Feb 02 10:58:01 crc kubenswrapper[4901]: I0202 10:58:01.298039 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a","Type":"ContainerDied","Data":"c6bf5b021c2c64ca47005fe5e32c02d6f99926e3db8969f841245949a6bc4b37"} Feb 02 10:58:01 crc kubenswrapper[4901]: I0202 10:58:01.298607 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a","Type":"ContainerDied","Data":"516e167901769e398dac8690d8c3e25c683f9e64148d3c8697aadaa34c6f17d9"} Feb 02 10:58:01 crc kubenswrapper[4901]: I0202 10:58:01.298623 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a","Type":"ContainerDied","Data":"75efefc3ad064b3639494fa4cc1958d7e5dd934f0f2a80ca0f39d1631ba86c4f"} Feb 02 10:58:01 crc kubenswrapper[4901]: I0202 10:58:01.298546 4901 generic.go:334] "Generic (PLEG): container finished" podID="1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a" containerID="75efefc3ad064b3639494fa4cc1958d7e5dd934f0f2a80ca0f39d1631ba86c4f" exitCode=0 Feb 02 10:58:03 crc kubenswrapper[4901]: E0202 10:58:03.661789 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c31496d630db2686222e67e58cb8c108958fb7f2a14aca41bffc205bb7cd4c16" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 02 10:58:03 crc kubenswrapper[4901]: E0202 10:58:03.664242 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c31496d630db2686222e67e58cb8c108958fb7f2a14aca41bffc205bb7cd4c16" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 02 10:58:03 crc kubenswrapper[4901]: E0202 10:58:03.667421 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c31496d630db2686222e67e58cb8c108958fb7f2a14aca41bffc205bb7cd4c16" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 02 10:58:03 crc kubenswrapper[4901]: E0202 10:58:03.667497 4901 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="5869a9a7-8b2b-443e-be13-d92f4db643f0" containerName="nova-cell0-conductor-conductor" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.163932 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.205817 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a-run-httpd\") pod \"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a\" (UID: \"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a\") " Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.205993 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a-config-data\") pod \"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a\" (UID: \"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a\") " Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.206020 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a-sg-core-conf-yaml\") pod \"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a\" (UID: \"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a\") " Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.206146 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a-combined-ca-bundle\") pod \"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a\" (UID: \"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a\") " Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.206179 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a-log-httpd\") pod \"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a\" (UID: \"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a\") " Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.206228 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a-scripts\") pod \"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a\" (UID: \"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a\") " Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.206265 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a" (UID: "1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.206613 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a" (UID: "1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.207178 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn846\" (UniqueName: \"kubernetes.io/projected/1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a-kube-api-access-xn846\") pod \"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a\" (UID: \"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a\") " Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.207808 4901 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.207830 4901 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.214872 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a-scripts" (OuterVolumeSpecName: "scripts") pod "1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a" (UID: "1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.216242 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a-kube-api-access-xn846" (OuterVolumeSpecName: "kube-api-access-xn846") pod "1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a" (UID: "1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a"). InnerVolumeSpecName "kube-api-access-xn846". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.276620 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a" (UID: "1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.305294 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a" (UID: "1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.309772 4901 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.309821 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.309833 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.309843 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xn846\" (UniqueName: \"kubernetes.io/projected/1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a-kube-api-access-xn846\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.335999 4901 generic.go:334] "Generic (PLEG): container finished" podID="1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a" containerID="5f23416f544e921ba709eae967c71e664a801d70a7a3d3da33530a7c7c506df9" exitCode=0 Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.336061 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a","Type":"ContainerDied","Data":"5f23416f544e921ba709eae967c71e664a801d70a7a3d3da33530a7c7c506df9"} Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.336099 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a","Type":"ContainerDied","Data":"f5f6271bdcdc83db346acddc14434b7cfc0b0b172a4e3d1025e48d6cfd10a776"} Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.336122 4901 scope.go:117] "RemoveContainer" containerID="c6bf5b021c2c64ca47005fe5e32c02d6f99926e3db8969f841245949a6bc4b37" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.336292 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.354601 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a-config-data" (OuterVolumeSpecName: "config-data") pod "1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a" (UID: "1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.396005 4901 scope.go:117] "RemoveContainer" containerID="516e167901769e398dac8690d8c3e25c683f9e64148d3c8697aadaa34c6f17d9" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.411323 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.425458 4901 scope.go:117] "RemoveContainer" containerID="75efefc3ad064b3639494fa4cc1958d7e5dd934f0f2a80ca0f39d1631ba86c4f" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.452701 4901 scope.go:117] "RemoveContainer" containerID="5f23416f544e921ba709eae967c71e664a801d70a7a3d3da33530a7c7c506df9" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.484032 4901 scope.go:117] "RemoveContainer" containerID="c6bf5b021c2c64ca47005fe5e32c02d6f99926e3db8969f841245949a6bc4b37" Feb 02 10:58:04 crc kubenswrapper[4901]: E0202 10:58:04.484497 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6bf5b021c2c64ca47005fe5e32c02d6f99926e3db8969f841245949a6bc4b37\": container with ID starting with c6bf5b021c2c64ca47005fe5e32c02d6f99926e3db8969f841245949a6bc4b37 not found: ID does not exist" containerID="c6bf5b021c2c64ca47005fe5e32c02d6f99926e3db8969f841245949a6bc4b37" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.484534 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6bf5b021c2c64ca47005fe5e32c02d6f99926e3db8969f841245949a6bc4b37"} err="failed to get container status \"c6bf5b021c2c64ca47005fe5e32c02d6f99926e3db8969f841245949a6bc4b37\": rpc error: code = NotFound desc = could not find container \"c6bf5b021c2c64ca47005fe5e32c02d6f99926e3db8969f841245949a6bc4b37\": container with ID starting with c6bf5b021c2c64ca47005fe5e32c02d6f99926e3db8969f841245949a6bc4b37 not found: ID does not exist" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.484576 4901 scope.go:117] "RemoveContainer" containerID="516e167901769e398dac8690d8c3e25c683f9e64148d3c8697aadaa34c6f17d9" Feb 02 10:58:04 crc kubenswrapper[4901]: E0202 10:58:04.484843 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"516e167901769e398dac8690d8c3e25c683f9e64148d3c8697aadaa34c6f17d9\": container with ID starting with 516e167901769e398dac8690d8c3e25c683f9e64148d3c8697aadaa34c6f17d9 not found: ID does not exist" containerID="516e167901769e398dac8690d8c3e25c683f9e64148d3c8697aadaa34c6f17d9" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.484870 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"516e167901769e398dac8690d8c3e25c683f9e64148d3c8697aadaa34c6f17d9"} err="failed to get container status \"516e167901769e398dac8690d8c3e25c683f9e64148d3c8697aadaa34c6f17d9\": rpc error: code = NotFound desc = could not find container \"516e167901769e398dac8690d8c3e25c683f9e64148d3c8697aadaa34c6f17d9\": container with ID starting with 516e167901769e398dac8690d8c3e25c683f9e64148d3c8697aadaa34c6f17d9 not found: ID does not exist" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.484886 4901 scope.go:117] "RemoveContainer" containerID="75efefc3ad064b3639494fa4cc1958d7e5dd934f0f2a80ca0f39d1631ba86c4f" Feb 02 10:58:04 crc kubenswrapper[4901]: E0202 
10:58:04.485243 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75efefc3ad064b3639494fa4cc1958d7e5dd934f0f2a80ca0f39d1631ba86c4f\": container with ID starting with 75efefc3ad064b3639494fa4cc1958d7e5dd934f0f2a80ca0f39d1631ba86c4f not found: ID does not exist" containerID="75efefc3ad064b3639494fa4cc1958d7e5dd934f0f2a80ca0f39d1631ba86c4f" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.485268 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75efefc3ad064b3639494fa4cc1958d7e5dd934f0f2a80ca0f39d1631ba86c4f"} err="failed to get container status \"75efefc3ad064b3639494fa4cc1958d7e5dd934f0f2a80ca0f39d1631ba86c4f\": rpc error: code = NotFound desc = could not find container \"75efefc3ad064b3639494fa4cc1958d7e5dd934f0f2a80ca0f39d1631ba86c4f\": container with ID starting with 75efefc3ad064b3639494fa4cc1958d7e5dd934f0f2a80ca0f39d1631ba86c4f not found: ID does not exist" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.485284 4901 scope.go:117] "RemoveContainer" containerID="5f23416f544e921ba709eae967c71e664a801d70a7a3d3da33530a7c7c506df9" Feb 02 10:58:04 crc kubenswrapper[4901]: E0202 10:58:04.485681 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f23416f544e921ba709eae967c71e664a801d70a7a3d3da33530a7c7c506df9\": container with ID starting with 5f23416f544e921ba709eae967c71e664a801d70a7a3d3da33530a7c7c506df9 not found: ID does not exist" containerID="5f23416f544e921ba709eae967c71e664a801d70a7a3d3da33530a7c7c506df9" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.485745 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f23416f544e921ba709eae967c71e664a801d70a7a3d3da33530a7c7c506df9"} err="failed to get container status \"5f23416f544e921ba709eae967c71e664a801d70a7a3d3da33530a7c7c506df9\": rpc error: code = NotFound desc = could not find container \"5f23416f544e921ba709eae967c71e664a801d70a7a3d3da33530a7c7c506df9\": container with ID starting with 5f23416f544e921ba709eae967c71e664a801d70a7a3d3da33530a7c7c506df9 not found: ID does not exist" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.676291 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.688479 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.705674 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:58:04 crc kubenswrapper[4901]: E0202 10:58:04.706263 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a" containerName="ceilometer-central-agent" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.706287 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a" containerName="ceilometer-central-agent" Feb 02 10:58:04 crc kubenswrapper[4901]: E0202 10:58:04.706303 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a" containerName="ceilometer-notification-agent" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.706310 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a" containerName="ceilometer-notification-agent" Feb 02 10:58:04 crc 
kubenswrapper[4901]: E0202 10:58:04.706332 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a" containerName="sg-core" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.706339 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a" containerName="sg-core" Feb 02 10:58:04 crc kubenswrapper[4901]: E0202 10:58:04.706362 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a" containerName="proxy-httpd" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.706371 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a" containerName="proxy-httpd" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.706554 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a" containerName="ceilometer-central-agent" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.706592 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a" containerName="ceilometer-notification-agent" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.706608 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a" containerName="sg-core" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.706619 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a" containerName="proxy-httpd" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.708515 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.714302 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.714693 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.728493 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.820341 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/52c8aa9b-cedf-433f-8e40-5156c9fe53a2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"52c8aa9b-cedf-433f-8e40-5156c9fe53a2\") " pod="openstack/ceilometer-0" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.820425 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lzcp\" (UniqueName: \"kubernetes.io/projected/52c8aa9b-cedf-433f-8e40-5156c9fe53a2-kube-api-access-2lzcp\") pod \"ceilometer-0\" (UID: \"52c8aa9b-cedf-433f-8e40-5156c9fe53a2\") " pod="openstack/ceilometer-0" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.820464 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52c8aa9b-cedf-433f-8e40-5156c9fe53a2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"52c8aa9b-cedf-433f-8e40-5156c9fe53a2\") " pod="openstack/ceilometer-0" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.820499 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/52c8aa9b-cedf-433f-8e40-5156c9fe53a2-scripts\") pod \"ceilometer-0\" (UID: \"52c8aa9b-cedf-433f-8e40-5156c9fe53a2\") " pod="openstack/ceilometer-0" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.820596 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52c8aa9b-cedf-433f-8e40-5156c9fe53a2-run-httpd\") pod \"ceilometer-0\" (UID: \"52c8aa9b-cedf-433f-8e40-5156c9fe53a2\") " pod="openstack/ceilometer-0" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.820628 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52c8aa9b-cedf-433f-8e40-5156c9fe53a2-log-httpd\") pod \"ceilometer-0\" (UID: \"52c8aa9b-cedf-433f-8e40-5156c9fe53a2\") " pod="openstack/ceilometer-0" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.820703 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52c8aa9b-cedf-433f-8e40-5156c9fe53a2-config-data\") pod \"ceilometer-0\" (UID: \"52c8aa9b-cedf-433f-8e40-5156c9fe53a2\") " pod="openstack/ceilometer-0" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.922367 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52c8aa9b-cedf-433f-8e40-5156c9fe53a2-config-data\") pod \"ceilometer-0\" (UID: \"52c8aa9b-cedf-433f-8e40-5156c9fe53a2\") " pod="openstack/ceilometer-0" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.922495 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/52c8aa9b-cedf-433f-8e40-5156c9fe53a2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"52c8aa9b-cedf-433f-8e40-5156c9fe53a2\") " pod="openstack/ceilometer-0" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.922551 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lzcp\" (UniqueName: \"kubernetes.io/projected/52c8aa9b-cedf-433f-8e40-5156c9fe53a2-kube-api-access-2lzcp\") pod \"ceilometer-0\" (UID: \"52c8aa9b-cedf-433f-8e40-5156c9fe53a2\") " pod="openstack/ceilometer-0" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.922615 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52c8aa9b-cedf-433f-8e40-5156c9fe53a2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"52c8aa9b-cedf-433f-8e40-5156c9fe53a2\") " pod="openstack/ceilometer-0" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.922653 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52c8aa9b-cedf-433f-8e40-5156c9fe53a2-scripts\") pod \"ceilometer-0\" (UID: \"52c8aa9b-cedf-433f-8e40-5156c9fe53a2\") " pod="openstack/ceilometer-0" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.922723 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52c8aa9b-cedf-433f-8e40-5156c9fe53a2-run-httpd\") pod \"ceilometer-0\" (UID: \"52c8aa9b-cedf-433f-8e40-5156c9fe53a2\") " pod="openstack/ceilometer-0" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.922750 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52c8aa9b-cedf-433f-8e40-5156c9fe53a2-log-httpd\") pod \"ceilometer-0\" (UID: \"52c8aa9b-cedf-433f-8e40-5156c9fe53a2\") " pod="openstack/ceilometer-0" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.923361 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52c8aa9b-cedf-433f-8e40-5156c9fe53a2-log-httpd\") pod \"ceilometer-0\" (UID: \"52c8aa9b-cedf-433f-8e40-5156c9fe53a2\") " pod="openstack/ceilometer-0" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.924892 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52c8aa9b-cedf-433f-8e40-5156c9fe53a2-run-httpd\") pod \"ceilometer-0\" (UID: \"52c8aa9b-cedf-433f-8e40-5156c9fe53a2\") " pod="openstack/ceilometer-0" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.927831 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52c8aa9b-cedf-433f-8e40-5156c9fe53a2-scripts\") pod \"ceilometer-0\" (UID: \"52c8aa9b-cedf-433f-8e40-5156c9fe53a2\") " pod="openstack/ceilometer-0" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.928832 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52c8aa9b-cedf-433f-8e40-5156c9fe53a2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"52c8aa9b-cedf-433f-8e40-5156c9fe53a2\") " pod="openstack/ceilometer-0" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.928880 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/52c8aa9b-cedf-433f-8e40-5156c9fe53a2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"52c8aa9b-cedf-433f-8e40-5156c9fe53a2\") " pod="openstack/ceilometer-0" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.941727 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52c8aa9b-cedf-433f-8e40-5156c9fe53a2-config-data\") pod \"ceilometer-0\" (UID: \"52c8aa9b-cedf-433f-8e40-5156c9fe53a2\") " pod="openstack/ceilometer-0" Feb 02 10:58:04 crc kubenswrapper[4901]: I0202 10:58:04.945977 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lzcp\" (UniqueName: \"kubernetes.io/projected/52c8aa9b-cedf-433f-8e40-5156c9fe53a2-kube-api-access-2lzcp\") pod \"ceilometer-0\" (UID: \"52c8aa9b-cedf-433f-8e40-5156c9fe53a2\") " pod="openstack/ceilometer-0" Feb 02 10:58:05 crc kubenswrapper[4901]: I0202 10:58:05.063501 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:58:05 crc kubenswrapper[4901]: I0202 10:58:05.553292 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:58:05 crc kubenswrapper[4901]: I0202 10:58:05.687428 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a" path="/var/lib/kubelet/pods/1d9eeed6-b7b9-48f1-b961-7dd7bbcb861a/volumes" Feb 02 10:58:06 crc kubenswrapper[4901]: I0202 10:58:06.363187 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52c8aa9b-cedf-433f-8e40-5156c9fe53a2","Type":"ContainerStarted","Data":"886da255e8b6a73f747ab2cd28e7758e6839557ec3753c9b84d80ad75f5d8a3b"} Feb 02 10:58:06 crc kubenswrapper[4901]: I0202 10:58:06.363532 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52c8aa9b-cedf-433f-8e40-5156c9fe53a2","Type":"ContainerStarted","Data":"38d4656acfb71a8c5b6328ca70a709ae56d900969f32816ce85ebc28540846ea"} Feb 02 10:58:07 crc kubenswrapper[4901]: I0202 10:58:07.373113 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52c8aa9b-cedf-433f-8e40-5156c9fe53a2","Type":"ContainerStarted","Data":"9132fc15a1f5f25c06a7af5e0d584903a35a1a3103f4534782d25d59ebfd52ce"} Feb 02 10:58:08 crc kubenswrapper[4901]: I0202 10:58:08.383963 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52c8aa9b-cedf-433f-8e40-5156c9fe53a2","Type":"ContainerStarted","Data":"b8c9b9f60a3ed74256dc7356d2cd948b416caa009181ebedb62391c03cd127bd"} Feb 02 10:58:08 crc kubenswrapper[4901]: E0202 10:58:08.661598 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c31496d630db2686222e67e58cb8c108958fb7f2a14aca41bffc205bb7cd4c16" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 02 10:58:08 crc kubenswrapper[4901]: E0202 10:58:08.663000 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c31496d630db2686222e67e58cb8c108958fb7f2a14aca41bffc205bb7cd4c16" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 02 10:58:08 crc kubenswrapper[4901]: E0202 10:58:08.664725 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c31496d630db2686222e67e58cb8c108958fb7f2a14aca41bffc205bb7cd4c16" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 02 10:58:08 crc kubenswrapper[4901]: E0202 10:58:08.664912 4901 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="5869a9a7-8b2b-443e-be13-d92f4db643f0" containerName="nova-cell0-conductor-conductor" Feb 02 10:58:11 crc kubenswrapper[4901]: I0202 10:58:11.414609 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52c8aa9b-cedf-433f-8e40-5156c9fe53a2","Type":"ContainerStarted","Data":"20248b62952ce61096e356fe301f645e3907350faadaeb442d1b5a275bcc5778"} Feb 02 10:58:11 crc 
kubenswrapper[4901]: I0202 10:58:11.415270 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 02 10:58:11 crc kubenswrapper[4901]: I0202 10:58:11.446438 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.2160698500000002 podStartE2EDuration="7.446409073s" podCreationTimestamp="2026-02-02 10:58:04 +0000 UTC" firstStartedPulling="2026-02-02 10:58:05.558715199 +0000 UTC m=+1172.577055295" lastFinishedPulling="2026-02-02 10:58:10.789054422 +0000 UTC m=+1177.807394518" observedRunningTime="2026-02-02 10:58:11.440945009 +0000 UTC m=+1178.459285105" watchObservedRunningTime="2026-02-02 10:58:11.446409073 +0000 UTC m=+1178.464749199" Feb 02 10:58:13 crc kubenswrapper[4901]: E0202 10:58:13.662010 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c31496d630db2686222e67e58cb8c108958fb7f2a14aca41bffc205bb7cd4c16" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 02 10:58:13 crc kubenswrapper[4901]: E0202 10:58:13.664366 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c31496d630db2686222e67e58cb8c108958fb7f2a14aca41bffc205bb7cd4c16" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 02 10:58:13 crc kubenswrapper[4901]: E0202 10:58:13.667225 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c31496d630db2686222e67e58cb8c108958fb7f2a14aca41bffc205bb7cd4c16" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 02 10:58:13 crc kubenswrapper[4901]: E0202 10:58:13.667300 4901 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="5869a9a7-8b2b-443e-be13-d92f4db643f0" containerName="nova-cell0-conductor-conductor" Feb 02 10:58:18 crc kubenswrapper[4901]: E0202 10:58:18.661772 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c31496d630db2686222e67e58cb8c108958fb7f2a14aca41bffc205bb7cd4c16" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 02 10:58:18 crc kubenswrapper[4901]: E0202 10:58:18.664006 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c31496d630db2686222e67e58cb8c108958fb7f2a14aca41bffc205bb7cd4c16" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 02 10:58:18 crc kubenswrapper[4901]: E0202 10:58:18.665783 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c31496d630db2686222e67e58cb8c108958fb7f2a14aca41bffc205bb7cd4c16" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 02 10:58:18 crc kubenswrapper[4901]: E0202 
10:58:18.665830 4901 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="5869a9a7-8b2b-443e-be13-d92f4db643f0" containerName="nova-cell0-conductor-conductor" Feb 02 10:58:23 crc kubenswrapper[4901]: E0202 10:58:23.660937 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c31496d630db2686222e67e58cb8c108958fb7f2a14aca41bffc205bb7cd4c16" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 02 10:58:23 crc kubenswrapper[4901]: E0202 10:58:23.664628 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c31496d630db2686222e67e58cb8c108958fb7f2a14aca41bffc205bb7cd4c16" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 02 10:58:23 crc kubenswrapper[4901]: E0202 10:58:23.666328 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c31496d630db2686222e67e58cb8c108958fb7f2a14aca41bffc205bb7cd4c16" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 02 10:58:23 crc kubenswrapper[4901]: E0202 10:58:23.666469 4901 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="5869a9a7-8b2b-443e-be13-d92f4db643f0" containerName="nova-cell0-conductor-conductor" Feb 02 10:58:26 crc kubenswrapper[4901]: I0202 10:58:26.592649 4901 generic.go:334] "Generic (PLEG): container finished" podID="5869a9a7-8b2b-443e-be13-d92f4db643f0" containerID="c31496d630db2686222e67e58cb8c108958fb7f2a14aca41bffc205bb7cd4c16" exitCode=137 Feb 02 10:58:26 crc kubenswrapper[4901]: I0202 10:58:26.592731 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"5869a9a7-8b2b-443e-be13-d92f4db643f0","Type":"ContainerDied","Data":"c31496d630db2686222e67e58cb8c108958fb7f2a14aca41bffc205bb7cd4c16"} Feb 02 10:58:26 crc kubenswrapper[4901]: I0202 10:58:26.710675 4901 util.go:48] "No ready sandbox for pod can be found. 
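
The exitCode=137 recorded for the nova-cell0-conductor container below is the usual 128+signal encoding: 137 - 128 = 9, i.e. SIGKILL. The container had been "stopping" since at least 10:58:03 (every probe above failed with that message) and was evidently force-killed once its termination grace period ran out, in contrast to the ceilometer containers earlier, which exited on their own with code 0 (and sg-core with 2). A tiny decoder for the convention:

    package main

    import "fmt"

    // describeExit decodes the common container/shell convention where an
    // exit code above 128 means "terminated by signal (code - 128)".
    func describeExit(code int) string {
    	if code > 128 {
    		return fmt.Sprintf("killed by signal %d", code-128)
    	}
    	return fmt.Sprintf("exited with status %d", code)
    }

    func main() {
    	fmt.Println(describeExit(137)) // "killed by signal 9"; signal 9 is SIGKILL
    	fmt.Println(describeExit(0))   // "exited with status 0"
    }
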
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 02 10:58:26 crc kubenswrapper[4901]: I0202 10:58:26.771102 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktlj7\" (UniqueName: \"kubernetes.io/projected/5869a9a7-8b2b-443e-be13-d92f4db643f0-kube-api-access-ktlj7\") pod \"5869a9a7-8b2b-443e-be13-d92f4db643f0\" (UID: \"5869a9a7-8b2b-443e-be13-d92f4db643f0\") " Feb 02 10:58:26 crc kubenswrapper[4901]: I0202 10:58:26.771207 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5869a9a7-8b2b-443e-be13-d92f4db643f0-config-data\") pod \"5869a9a7-8b2b-443e-be13-d92f4db643f0\" (UID: \"5869a9a7-8b2b-443e-be13-d92f4db643f0\") " Feb 02 10:58:26 crc kubenswrapper[4901]: I0202 10:58:26.771328 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5869a9a7-8b2b-443e-be13-d92f4db643f0-combined-ca-bundle\") pod \"5869a9a7-8b2b-443e-be13-d92f4db643f0\" (UID: \"5869a9a7-8b2b-443e-be13-d92f4db643f0\") " Feb 02 10:58:26 crc kubenswrapper[4901]: I0202 10:58:26.780865 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5869a9a7-8b2b-443e-be13-d92f4db643f0-kube-api-access-ktlj7" (OuterVolumeSpecName: "kube-api-access-ktlj7") pod "5869a9a7-8b2b-443e-be13-d92f4db643f0" (UID: "5869a9a7-8b2b-443e-be13-d92f4db643f0"). InnerVolumeSpecName "kube-api-access-ktlj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:58:26 crc kubenswrapper[4901]: I0202 10:58:26.804011 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5869a9a7-8b2b-443e-be13-d92f4db643f0-config-data" (OuterVolumeSpecName: "config-data") pod "5869a9a7-8b2b-443e-be13-d92f4db643f0" (UID: "5869a9a7-8b2b-443e-be13-d92f4db643f0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:58:26 crc kubenswrapper[4901]: I0202 10:58:26.810102 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5869a9a7-8b2b-443e-be13-d92f4db643f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5869a9a7-8b2b-443e-be13-d92f4db643f0" (UID: "5869a9a7-8b2b-443e-be13-d92f4db643f0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:58:26 crc kubenswrapper[4901]: I0202 10:58:26.874273 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5869a9a7-8b2b-443e-be13-d92f4db643f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:26 crc kubenswrapper[4901]: I0202 10:58:26.874303 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktlj7\" (UniqueName: \"kubernetes.io/projected/5869a9a7-8b2b-443e-be13-d92f4db643f0-kube-api-access-ktlj7\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:26 crc kubenswrapper[4901]: I0202 10:58:26.874316 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5869a9a7-8b2b-443e-be13-d92f4db643f0-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:27 crc kubenswrapper[4901]: I0202 10:58:27.609824 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"5869a9a7-8b2b-443e-be13-d92f4db643f0","Type":"ContainerDied","Data":"f5c980f80d27f3da248c5f44bea8e87343c368d493de22e91d263da2bc5a4a84"} Feb 02 10:58:27 crc kubenswrapper[4901]: I0202 10:58:27.609913 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 02 10:58:27 crc kubenswrapper[4901]: I0202 10:58:27.610295 4901 scope.go:117] "RemoveContainer" containerID="c31496d630db2686222e67e58cb8c108958fb7f2a14aca41bffc205bb7cd4c16" Feb 02 10:58:27 crc kubenswrapper[4901]: I0202 10:58:27.659710 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 02 10:58:27 crc kubenswrapper[4901]: I0202 10:58:27.701492 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 02 10:58:27 crc kubenswrapper[4901]: I0202 10:58:27.719879 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 02 10:58:27 crc kubenswrapper[4901]: E0202 10:58:27.720895 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5869a9a7-8b2b-443e-be13-d92f4db643f0" containerName="nova-cell0-conductor-conductor" Feb 02 10:58:27 crc kubenswrapper[4901]: I0202 10:58:27.720921 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="5869a9a7-8b2b-443e-be13-d92f4db643f0" containerName="nova-cell0-conductor-conductor" Feb 02 10:58:27 crc kubenswrapper[4901]: I0202 10:58:27.721201 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="5869a9a7-8b2b-443e-be13-d92f4db643f0" containerName="nova-cell0-conductor-conductor" Feb 02 10:58:27 crc kubenswrapper[4901]: I0202 10:58:27.722282 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 02 10:58:27 crc kubenswrapper[4901]: I0202 10:58:27.732123 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-4tffs" Feb 02 10:58:27 crc kubenswrapper[4901]: I0202 10:58:27.732642 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 02 10:58:27 crc kubenswrapper[4901]: I0202 10:58:27.735553 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 02 10:58:27 crc kubenswrapper[4901]: I0202 10:58:27.801620 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/256480df-da06-439c-9e9b-41c8aba434a6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"256480df-da06-439c-9e9b-41c8aba434a6\") " pod="openstack/nova-cell0-conductor-0" Feb 02 10:58:27 crc kubenswrapper[4901]: I0202 10:58:27.801683 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xq4lz\" (UniqueName: \"kubernetes.io/projected/256480df-da06-439c-9e9b-41c8aba434a6-kube-api-access-xq4lz\") pod \"nova-cell0-conductor-0\" (UID: \"256480df-da06-439c-9e9b-41c8aba434a6\") " pod="openstack/nova-cell0-conductor-0" Feb 02 10:58:27 crc kubenswrapper[4901]: I0202 10:58:27.801956 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/256480df-da06-439c-9e9b-41c8aba434a6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"256480df-da06-439c-9e9b-41c8aba434a6\") " pod="openstack/nova-cell0-conductor-0" Feb 02 10:58:27 crc kubenswrapper[4901]: I0202 10:58:27.904636 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/256480df-da06-439c-9e9b-41c8aba434a6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"256480df-da06-439c-9e9b-41c8aba434a6\") " pod="openstack/nova-cell0-conductor-0" Feb 02 10:58:27 crc kubenswrapper[4901]: I0202 10:58:27.904783 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/256480df-da06-439c-9e9b-41c8aba434a6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"256480df-da06-439c-9e9b-41c8aba434a6\") " pod="openstack/nova-cell0-conductor-0" Feb 02 10:58:27 crc kubenswrapper[4901]: I0202 10:58:27.904974 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xq4lz\" (UniqueName: \"kubernetes.io/projected/256480df-da06-439c-9e9b-41c8aba434a6-kube-api-access-xq4lz\") pod \"nova-cell0-conductor-0\" (UID: \"256480df-da06-439c-9e9b-41c8aba434a6\") " pod="openstack/nova-cell0-conductor-0" Feb 02 10:58:27 crc kubenswrapper[4901]: I0202 10:58:27.911698 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/256480df-da06-439c-9e9b-41c8aba434a6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"256480df-da06-439c-9e9b-41c8aba434a6\") " pod="openstack/nova-cell0-conductor-0" Feb 02 10:58:27 crc kubenswrapper[4901]: I0202 10:58:27.914012 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/256480df-da06-439c-9e9b-41c8aba434a6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" 
(UID: \"256480df-da06-439c-9e9b-41c8aba434a6\") " pod="openstack/nova-cell0-conductor-0" Feb 02 10:58:27 crc kubenswrapper[4901]: I0202 10:58:27.928094 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xq4lz\" (UniqueName: \"kubernetes.io/projected/256480df-da06-439c-9e9b-41c8aba434a6-kube-api-access-xq4lz\") pod \"nova-cell0-conductor-0\" (UID: \"256480df-da06-439c-9e9b-41c8aba434a6\") " pod="openstack/nova-cell0-conductor-0" Feb 02 10:58:28 crc kubenswrapper[4901]: I0202 10:58:28.060249 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 02 10:58:28 crc kubenswrapper[4901]: I0202 10:58:28.558311 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 02 10:58:28 crc kubenswrapper[4901]: I0202 10:58:28.627629 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"256480df-da06-439c-9e9b-41c8aba434a6","Type":"ContainerStarted","Data":"d8a141cbe486b94483426202b60f802d3ce34bd657c6fc15cb0aed7bcbe623e8"} Feb 02 10:58:29 crc kubenswrapper[4901]: I0202 10:58:29.643218 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"256480df-da06-439c-9e9b-41c8aba434a6","Type":"ContainerStarted","Data":"608be3502f818d0a4ef04ea1e681b3015259d0d6dd6650fb3ca6e5c3262dc6ed"} Feb 02 10:58:29 crc kubenswrapper[4901]: I0202 10:58:29.643896 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 02 10:58:29 crc kubenswrapper[4901]: I0202 10:58:29.677611 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.677579922 podStartE2EDuration="2.677579922s" podCreationTimestamp="2026-02-02 10:58:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:58:29.666780357 +0000 UTC m=+1196.685120453" watchObservedRunningTime="2026-02-02 10:58:29.677579922 +0000 UTC m=+1196.695920018" Feb 02 10:58:29 crc kubenswrapper[4901]: I0202 10:58:29.688976 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5869a9a7-8b2b-443e-be13-d92f4db643f0" path="/var/lib/kubelet/pods/5869a9a7-8b2b-443e-be13-d92f4db643f0/volumes" Feb 02 10:58:33 crc kubenswrapper[4901]: I0202 10:58:33.088247 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 02 10:58:33 crc kubenswrapper[4901]: I0202 10:58:33.644793 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-px8gs"] Feb 02 10:58:33 crc kubenswrapper[4901]: I0202 10:58:33.647345 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-px8gs" Feb 02 10:58:33 crc kubenswrapper[4901]: I0202 10:58:33.650272 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 02 10:58:33 crc kubenswrapper[4901]: I0202 10:58:33.650713 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 02 10:58:33 crc kubenswrapper[4901]: I0202 10:58:33.660899 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-px8gs"] Feb 02 10:58:33 crc kubenswrapper[4901]: I0202 10:58:33.863556 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8358c85-6b94-4c56-911c-3396c7e780e5-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-px8gs\" (UID: \"d8358c85-6b94-4c56-911c-3396c7e780e5\") " pod="openstack/nova-cell0-cell-mapping-px8gs" Feb 02 10:58:33 crc kubenswrapper[4901]: I0202 10:58:33.863612 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mtgl\" (UniqueName: \"kubernetes.io/projected/d8358c85-6b94-4c56-911c-3396c7e780e5-kube-api-access-8mtgl\") pod \"nova-cell0-cell-mapping-px8gs\" (UID: \"d8358c85-6b94-4c56-911c-3396c7e780e5\") " pod="openstack/nova-cell0-cell-mapping-px8gs" Feb 02 10:58:33 crc kubenswrapper[4901]: I0202 10:58:33.863717 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8358c85-6b94-4c56-911c-3396c7e780e5-config-data\") pod \"nova-cell0-cell-mapping-px8gs\" (UID: \"d8358c85-6b94-4c56-911c-3396c7e780e5\") " pod="openstack/nova-cell0-cell-mapping-px8gs" Feb 02 10:58:33 crc kubenswrapper[4901]: I0202 10:58:33.863816 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8358c85-6b94-4c56-911c-3396c7e780e5-scripts\") pod \"nova-cell0-cell-mapping-px8gs\" (UID: \"d8358c85-6b94-4c56-911c-3396c7e780e5\") " pod="openstack/nova-cell0-cell-mapping-px8gs" Feb 02 10:58:33 crc kubenswrapper[4901]: I0202 10:58:33.864453 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:58:33 crc kubenswrapper[4901]: I0202 10:58:33.866430 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 10:58:33 crc kubenswrapper[4901]: I0202 10:58:33.874053 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 02 10:58:33 crc kubenswrapper[4901]: I0202 10:58:33.907130 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:58:33 crc kubenswrapper[4901]: I0202 10:58:33.960056 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:58:33 crc kubenswrapper[4901]: I0202 10:58:33.961725 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 10:58:33 crc kubenswrapper[4901]: I0202 10:58:33.973096 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 02 10:58:33 crc kubenswrapper[4901]: I0202 10:58:33.975303 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8358c85-6b94-4c56-911c-3396c7e780e5-scripts\") pod \"nova-cell0-cell-mapping-px8gs\" (UID: \"d8358c85-6b94-4c56-911c-3396c7e780e5\") " pod="openstack/nova-cell0-cell-mapping-px8gs" Feb 02 10:58:33 crc kubenswrapper[4901]: I0202 10:58:33.975469 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdfk5\" (UniqueName: \"kubernetes.io/projected/bab42463-80dc-4ab5-8af6-83ef6bc0bb43-kube-api-access-zdfk5\") pod \"nova-scheduler-0\" (UID: \"bab42463-80dc-4ab5-8af6-83ef6bc0bb43\") " pod="openstack/nova-scheduler-0" Feb 02 10:58:33 crc kubenswrapper[4901]: I0202 10:58:33.975605 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bab42463-80dc-4ab5-8af6-83ef6bc0bb43-config-data\") pod \"nova-scheduler-0\" (UID: \"bab42463-80dc-4ab5-8af6-83ef6bc0bb43\") " pod="openstack/nova-scheduler-0" Feb 02 10:58:33 crc kubenswrapper[4901]: I0202 10:58:33.975726 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8358c85-6b94-4c56-911c-3396c7e780e5-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-px8gs\" (UID: \"d8358c85-6b94-4c56-911c-3396c7e780e5\") " pod="openstack/nova-cell0-cell-mapping-px8gs" Feb 02 10:58:33 crc kubenswrapper[4901]: I0202 10:58:33.975811 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mtgl\" (UniqueName: \"kubernetes.io/projected/d8358c85-6b94-4c56-911c-3396c7e780e5-kube-api-access-8mtgl\") pod \"nova-cell0-cell-mapping-px8gs\" (UID: \"d8358c85-6b94-4c56-911c-3396c7e780e5\") " pod="openstack/nova-cell0-cell-mapping-px8gs" Feb 02 10:58:33 crc kubenswrapper[4901]: I0202 10:58:33.975918 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bab42463-80dc-4ab5-8af6-83ef6bc0bb43-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bab42463-80dc-4ab5-8af6-83ef6bc0bb43\") " pod="openstack/nova-scheduler-0" Feb 02 10:58:33 crc kubenswrapper[4901]: I0202 10:58:33.976042 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8358c85-6b94-4c56-911c-3396c7e780e5-config-data\") pod \"nova-cell0-cell-mapping-px8gs\" (UID: \"d8358c85-6b94-4c56-911c-3396c7e780e5\") " pod="openstack/nova-cell0-cell-mapping-px8gs" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.001295 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8358c85-6b94-4c56-911c-3396c7e780e5-config-data\") pod \"nova-cell0-cell-mapping-px8gs\" (UID: \"d8358c85-6b94-4c56-911c-3396c7e780e5\") " pod="openstack/nova-cell0-cell-mapping-px8gs" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.016503 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d8358c85-6b94-4c56-911c-3396c7e780e5-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-px8gs\" (UID: \"d8358c85-6b94-4c56-911c-3396c7e780e5\") " pod="openstack/nova-cell0-cell-mapping-px8gs" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.023213 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mtgl\" (UniqueName: \"kubernetes.io/projected/d8358c85-6b94-4c56-911c-3396c7e780e5-kube-api-access-8mtgl\") pod \"nova-cell0-cell-mapping-px8gs\" (UID: \"d8358c85-6b94-4c56-911c-3396c7e780e5\") " pod="openstack/nova-cell0-cell-mapping-px8gs" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.040811 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8358c85-6b94-4c56-911c-3396c7e780e5-scripts\") pod \"nova-cell0-cell-mapping-px8gs\" (UID: \"d8358c85-6b94-4c56-911c-3396c7e780e5\") " pod="openstack/nova-cell0-cell-mapping-px8gs" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.054703 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.080732 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5eae9ca-9953-4914-ae4a-538a87fd0ade-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e5eae9ca-9953-4914-ae4a-538a87fd0ade\") " pod="openstack/nova-metadata-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.080791 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdfk5\" (UniqueName: \"kubernetes.io/projected/bab42463-80dc-4ab5-8af6-83ef6bc0bb43-kube-api-access-zdfk5\") pod \"nova-scheduler-0\" (UID: \"bab42463-80dc-4ab5-8af6-83ef6bc0bb43\") " pod="openstack/nova-scheduler-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.080822 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5eae9ca-9953-4914-ae4a-538a87fd0ade-config-data\") pod \"nova-metadata-0\" (UID: \"e5eae9ca-9953-4914-ae4a-538a87fd0ade\") " pod="openstack/nova-metadata-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.080859 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8p5ff\" (UniqueName: \"kubernetes.io/projected/e5eae9ca-9953-4914-ae4a-538a87fd0ade-kube-api-access-8p5ff\") pod \"nova-metadata-0\" (UID: \"e5eae9ca-9953-4914-ae4a-538a87fd0ade\") " pod="openstack/nova-metadata-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.080884 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bab42463-80dc-4ab5-8af6-83ef6bc0bb43-config-data\") pod \"nova-scheduler-0\" (UID: \"bab42463-80dc-4ab5-8af6-83ef6bc0bb43\") " pod="openstack/nova-scheduler-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.080981 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5eae9ca-9953-4914-ae4a-538a87fd0ade-logs\") pod \"nova-metadata-0\" (UID: \"e5eae9ca-9953-4914-ae4a-538a87fd0ade\") " pod="openstack/nova-metadata-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.081000 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/bab42463-80dc-4ab5-8af6-83ef6bc0bb43-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bab42463-80dc-4ab5-8af6-83ef6bc0bb43\") " pod="openstack/nova-scheduler-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.091966 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.094017 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.100103 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bab42463-80dc-4ab5-8af6-83ef6bc0bb43-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bab42463-80dc-4ab5-8af6-83ef6bc0bb43\") " pod="openstack/nova-scheduler-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.100476 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.105283 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bab42463-80dc-4ab5-8af6-83ef6bc0bb43-config-data\") pod \"nova-scheduler-0\" (UID: \"bab42463-80dc-4ab5-8af6-83ef6bc0bb43\") " pod="openstack/nova-scheduler-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.140605 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.143861 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdfk5\" (UniqueName: \"kubernetes.io/projected/bab42463-80dc-4ab5-8af6-83ef6bc0bb43-kube-api-access-zdfk5\") pod \"nova-scheduler-0\" (UID: \"bab42463-80dc-4ab5-8af6-83ef6bc0bb43\") " pod="openstack/nova-scheduler-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.171704 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.173010 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.191275 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.191771 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5eae9ca-9953-4914-ae4a-538a87fd0ade-logs\") pod \"nova-metadata-0\" (UID: \"e5eae9ca-9953-4914-ae4a-538a87fd0ade\") " pod="openstack/nova-metadata-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.191838 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wsxz\" (UniqueName: \"kubernetes.io/projected/12bc7db9-7d12-48e2-b452-6d15334cd44d-kube-api-access-9wsxz\") pod \"nova-api-0\" (UID: \"12bc7db9-7d12-48e2-b452-6d15334cd44d\") " pod="openstack/nova-api-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.191883 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12bc7db9-7d12-48e2-b452-6d15334cd44d-logs\") pod \"nova-api-0\" (UID: \"12bc7db9-7d12-48e2-b452-6d15334cd44d\") " pod="openstack/nova-api-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.191922 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12bc7db9-7d12-48e2-b452-6d15334cd44d-config-data\") pod \"nova-api-0\" (UID: \"12bc7db9-7d12-48e2-b452-6d15334cd44d\") " pod="openstack/nova-api-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.191960 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5eae9ca-9953-4914-ae4a-538a87fd0ade-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e5eae9ca-9953-4914-ae4a-538a87fd0ade\") " pod="openstack/nova-metadata-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.191984 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12bc7db9-7d12-48e2-b452-6d15334cd44d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"12bc7db9-7d12-48e2-b452-6d15334cd44d\") " pod="openstack/nova-api-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.192003 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5eae9ca-9953-4914-ae4a-538a87fd0ade-config-data\") pod \"nova-metadata-0\" (UID: \"e5eae9ca-9953-4914-ae4a-538a87fd0ade\") " pod="openstack/nova-metadata-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.192035 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8p5ff\" (UniqueName: \"kubernetes.io/projected/e5eae9ca-9953-4914-ae4a-538a87fd0ade-kube-api-access-8p5ff\") pod \"nova-metadata-0\" (UID: \"e5eae9ca-9953-4914-ae4a-538a87fd0ade\") " pod="openstack/nova-metadata-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.192716 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5eae9ca-9953-4914-ae4a-538a87fd0ade-logs\") pod \"nova-metadata-0\" (UID: \"e5eae9ca-9953-4914-ae4a-538a87fd0ade\") " pod="openstack/nova-metadata-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 
10:58:34.197262 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.199236 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5eae9ca-9953-4914-ae4a-538a87fd0ade-config-data\") pod \"nova-metadata-0\" (UID: \"e5eae9ca-9953-4914-ae4a-538a87fd0ade\") " pod="openstack/nova-metadata-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.233373 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5eae9ca-9953-4914-ae4a-538a87fd0ade-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e5eae9ca-9953-4914-ae4a-538a87fd0ade\") " pod="openstack/nova-metadata-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.242811 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.247596 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8p5ff\" (UniqueName: \"kubernetes.io/projected/e5eae9ca-9953-4914-ae4a-538a87fd0ade-kube-api-access-8p5ff\") pod \"nova-metadata-0\" (UID: \"e5eae9ca-9953-4914-ae4a-538a87fd0ade\") " pod="openstack/nova-metadata-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.268630 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-7x2x6"] Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.270225 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-568d7fd7cf-7x2x6" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.275011 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-px8gs" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.279021 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.292622 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-7x2x6"] Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.295089 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wsxz\" (UniqueName: \"kubernetes.io/projected/12bc7db9-7d12-48e2-b452-6d15334cd44d-kube-api-access-9wsxz\") pod \"nova-api-0\" (UID: \"12bc7db9-7d12-48e2-b452-6d15334cd44d\") " pod="openstack/nova-api-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.295161 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12bc7db9-7d12-48e2-b452-6d15334cd44d-logs\") pod \"nova-api-0\" (UID: \"12bc7db9-7d12-48e2-b452-6d15334cd44d\") " pod="openstack/nova-api-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.295211 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2700af84-ba53-4dbe-970b-13c8f398461e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2700af84-ba53-4dbe-970b-13c8f398461e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.295242 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12bc7db9-7d12-48e2-b452-6d15334cd44d-config-data\") pod \"nova-api-0\" (UID: \"12bc7db9-7d12-48e2-b452-6d15334cd44d\") " pod="openstack/nova-api-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.295284 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4dgd\" (UniqueName: \"kubernetes.io/projected/2700af84-ba53-4dbe-970b-13c8f398461e-kube-api-access-d4dgd\") pod \"nova-cell1-novncproxy-0\" (UID: \"2700af84-ba53-4dbe-970b-13c8f398461e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.295306 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12bc7db9-7d12-48e2-b452-6d15334cd44d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"12bc7db9-7d12-48e2-b452-6d15334cd44d\") " pod="openstack/nova-api-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.295381 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2700af84-ba53-4dbe-970b-13c8f398461e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2700af84-ba53-4dbe-970b-13c8f398461e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.296183 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12bc7db9-7d12-48e2-b452-6d15334cd44d-logs\") pod \"nova-api-0\" (UID: \"12bc7db9-7d12-48e2-b452-6d15334cd44d\") " pod="openstack/nova-api-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.301662 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12bc7db9-7d12-48e2-b452-6d15334cd44d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"12bc7db9-7d12-48e2-b452-6d15334cd44d\") " pod="openstack/nova-api-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 
10:58:34.309438 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12bc7db9-7d12-48e2-b452-6d15334cd44d-config-data\") pod \"nova-api-0\" (UID: \"12bc7db9-7d12-48e2-b452-6d15334cd44d\") " pod="openstack/nova-api-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.349984 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wsxz\" (UniqueName: \"kubernetes.io/projected/12bc7db9-7d12-48e2-b452-6d15334cd44d-kube-api-access-9wsxz\") pod \"nova-api-0\" (UID: \"12bc7db9-7d12-48e2-b452-6d15334cd44d\") " pod="openstack/nova-api-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.399151 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a9df7ea-8e3d-4c6e-a86d-793168703b18-ovsdbserver-nb\") pod \"dnsmasq-dns-568d7fd7cf-7x2x6\" (UID: \"6a9df7ea-8e3d-4c6e-a86d-793168703b18\") " pod="openstack/dnsmasq-dns-568d7fd7cf-7x2x6" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.399229 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a9df7ea-8e3d-4c6e-a86d-793168703b18-ovsdbserver-sb\") pod \"dnsmasq-dns-568d7fd7cf-7x2x6\" (UID: \"6a9df7ea-8e3d-4c6e-a86d-793168703b18\") " pod="openstack/dnsmasq-dns-568d7fd7cf-7x2x6" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.399281 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2700af84-ba53-4dbe-970b-13c8f398461e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2700af84-ba53-4dbe-970b-13c8f398461e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.399324 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a9df7ea-8e3d-4c6e-a86d-793168703b18-dns-svc\") pod \"dnsmasq-dns-568d7fd7cf-7x2x6\" (UID: \"6a9df7ea-8e3d-4c6e-a86d-793168703b18\") " pod="openstack/dnsmasq-dns-568d7fd7cf-7x2x6" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.399359 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a9df7ea-8e3d-4c6e-a86d-793168703b18-config\") pod \"dnsmasq-dns-568d7fd7cf-7x2x6\" (UID: \"6a9df7ea-8e3d-4c6e-a86d-793168703b18\") " pod="openstack/dnsmasq-dns-568d7fd7cf-7x2x6" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.399400 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6a9df7ea-8e3d-4c6e-a86d-793168703b18-dns-swift-storage-0\") pod \"dnsmasq-dns-568d7fd7cf-7x2x6\" (UID: \"6a9df7ea-8e3d-4c6e-a86d-793168703b18\") " pod="openstack/dnsmasq-dns-568d7fd7cf-7x2x6" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.399420 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2700af84-ba53-4dbe-970b-13c8f398461e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2700af84-ba53-4dbe-970b-13c8f398461e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.399465 4901 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wszll\" (UniqueName: \"kubernetes.io/projected/6a9df7ea-8e3d-4c6e-a86d-793168703b18-kube-api-access-wszll\") pod \"dnsmasq-dns-568d7fd7cf-7x2x6\" (UID: \"6a9df7ea-8e3d-4c6e-a86d-793168703b18\") " pod="openstack/dnsmasq-dns-568d7fd7cf-7x2x6" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.399514 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4dgd\" (UniqueName: \"kubernetes.io/projected/2700af84-ba53-4dbe-970b-13c8f398461e-kube-api-access-d4dgd\") pod \"nova-cell1-novncproxy-0\" (UID: \"2700af84-ba53-4dbe-970b-13c8f398461e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.407553 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2700af84-ba53-4dbe-970b-13c8f398461e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2700af84-ba53-4dbe-970b-13c8f398461e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.419335 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2700af84-ba53-4dbe-970b-13c8f398461e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2700af84-ba53-4dbe-970b-13c8f398461e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.440617 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4dgd\" (UniqueName: \"kubernetes.io/projected/2700af84-ba53-4dbe-970b-13c8f398461e-kube-api-access-d4dgd\") pod \"nova-cell1-novncproxy-0\" (UID: \"2700af84-ba53-4dbe-970b-13c8f398461e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.501829 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6a9df7ea-8e3d-4c6e-a86d-793168703b18-dns-swift-storage-0\") pod \"dnsmasq-dns-568d7fd7cf-7x2x6\" (UID: \"6a9df7ea-8e3d-4c6e-a86d-793168703b18\") " pod="openstack/dnsmasq-dns-568d7fd7cf-7x2x6" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.501939 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wszll\" (UniqueName: \"kubernetes.io/projected/6a9df7ea-8e3d-4c6e-a86d-793168703b18-kube-api-access-wszll\") pod \"dnsmasq-dns-568d7fd7cf-7x2x6\" (UID: \"6a9df7ea-8e3d-4c6e-a86d-793168703b18\") " pod="openstack/dnsmasq-dns-568d7fd7cf-7x2x6" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.502003 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a9df7ea-8e3d-4c6e-a86d-793168703b18-ovsdbserver-nb\") pod \"dnsmasq-dns-568d7fd7cf-7x2x6\" (UID: \"6a9df7ea-8e3d-4c6e-a86d-793168703b18\") " pod="openstack/dnsmasq-dns-568d7fd7cf-7x2x6" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.502051 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a9df7ea-8e3d-4c6e-a86d-793168703b18-ovsdbserver-sb\") pod \"dnsmasq-dns-568d7fd7cf-7x2x6\" (UID: \"6a9df7ea-8e3d-4c6e-a86d-793168703b18\") " pod="openstack/dnsmasq-dns-568d7fd7cf-7x2x6" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.502154 4901 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a9df7ea-8e3d-4c6e-a86d-793168703b18-dns-svc\") pod \"dnsmasq-dns-568d7fd7cf-7x2x6\" (UID: \"6a9df7ea-8e3d-4c6e-a86d-793168703b18\") " pod="openstack/dnsmasq-dns-568d7fd7cf-7x2x6" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.502210 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a9df7ea-8e3d-4c6e-a86d-793168703b18-config\") pod \"dnsmasq-dns-568d7fd7cf-7x2x6\" (UID: \"6a9df7ea-8e3d-4c6e-a86d-793168703b18\") " pod="openstack/dnsmasq-dns-568d7fd7cf-7x2x6" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.503346 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6a9df7ea-8e3d-4c6e-a86d-793168703b18-dns-swift-storage-0\") pod \"dnsmasq-dns-568d7fd7cf-7x2x6\" (UID: \"6a9df7ea-8e3d-4c6e-a86d-793168703b18\") " pod="openstack/dnsmasq-dns-568d7fd7cf-7x2x6" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.504188 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a9df7ea-8e3d-4c6e-a86d-793168703b18-config\") pod \"dnsmasq-dns-568d7fd7cf-7x2x6\" (UID: \"6a9df7ea-8e3d-4c6e-a86d-793168703b18\") " pod="openstack/dnsmasq-dns-568d7fd7cf-7x2x6" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.504951 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a9df7ea-8e3d-4c6e-a86d-793168703b18-ovsdbserver-sb\") pod \"dnsmasq-dns-568d7fd7cf-7x2x6\" (UID: \"6a9df7ea-8e3d-4c6e-a86d-793168703b18\") " pod="openstack/dnsmasq-dns-568d7fd7cf-7x2x6" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.505556 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a9df7ea-8e3d-4c6e-a86d-793168703b18-ovsdbserver-nb\") pod \"dnsmasq-dns-568d7fd7cf-7x2x6\" (UID: \"6a9df7ea-8e3d-4c6e-a86d-793168703b18\") " pod="openstack/dnsmasq-dns-568d7fd7cf-7x2x6" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.509984 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a9df7ea-8e3d-4c6e-a86d-793168703b18-dns-svc\") pod \"dnsmasq-dns-568d7fd7cf-7x2x6\" (UID: \"6a9df7ea-8e3d-4c6e-a86d-793168703b18\") " pod="openstack/dnsmasq-dns-568d7fd7cf-7x2x6" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.534461 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wszll\" (UniqueName: \"kubernetes.io/projected/6a9df7ea-8e3d-4c6e-a86d-793168703b18-kube-api-access-wszll\") pod \"dnsmasq-dns-568d7fd7cf-7x2x6\" (UID: \"6a9df7ea-8e3d-4c6e-a86d-793168703b18\") " pod="openstack/dnsmasq-dns-568d7fd7cf-7x2x6" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.630887 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.663993 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:58:34 crc kubenswrapper[4901]: I0202 10:58:34.708622 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-568d7fd7cf-7x2x6" Feb 02 10:58:35 crc kubenswrapper[4901]: I0202 10:58:35.026693 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:58:35 crc kubenswrapper[4901]: I0202 10:58:35.078601 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 02 10:58:35 crc kubenswrapper[4901]: W0202 10:58:35.104505 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5eae9ca_9953_4914_ae4a_538a87fd0ade.slice/crio-053696ff5e0ea6160afb5db0f3d4f51de6c260e408686b3c4ae4a020db6123eb WatchSource:0}: Error finding container 053696ff5e0ea6160afb5db0f3d4f51de6c260e408686b3c4ae4a020db6123eb: Status 404 returned error can't find the container with id 053696ff5e0ea6160afb5db0f3d4f51de6c260e408686b3c4ae4a020db6123eb Feb 02 10:58:35 crc kubenswrapper[4901]: I0202 10:58:35.111123 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:58:35 crc kubenswrapper[4901]: I0202 10:58:35.134407 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-px8gs"] Feb 02 10:58:35 crc kubenswrapper[4901]: I0202 10:58:35.233551 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-b7wx4"] Feb 02 10:58:35 crc kubenswrapper[4901]: I0202 10:58:35.235359 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-b7wx4" Feb 02 10:58:35 crc kubenswrapper[4901]: I0202 10:58:35.242048 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 02 10:58:35 crc kubenswrapper[4901]: I0202 10:58:35.242953 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 02 10:58:35 crc kubenswrapper[4901]: I0202 10:58:35.243089 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-b7wx4"] Feb 02 10:58:35 crc kubenswrapper[4901]: I0202 10:58:35.319287 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:58:35 crc kubenswrapper[4901]: W0202 10:58:35.339490 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12bc7db9_7d12_48e2_b452_6d15334cd44d.slice/crio-cfad305202aba39c458bc5f173200871836afd2e0357ae8c4eb2b8534b5852a7 WatchSource:0}: Error finding container cfad305202aba39c458bc5f173200871836afd2e0357ae8c4eb2b8534b5852a7: Status 404 returned error can't find the container with id cfad305202aba39c458bc5f173200871836afd2e0357ae8c4eb2b8534b5852a7 Feb 02 10:58:35 crc kubenswrapper[4901]: I0202 10:58:35.368768 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee49f3db-ba4a-4c9c-893e-edde04607cf4-scripts\") pod \"nova-cell1-conductor-db-sync-b7wx4\" (UID: \"ee49f3db-ba4a-4c9c-893e-edde04607cf4\") " pod="openstack/nova-cell1-conductor-db-sync-b7wx4" Feb 02 10:58:35 crc kubenswrapper[4901]: I0202 10:58:35.368867 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee49f3db-ba4a-4c9c-893e-edde04607cf4-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-b7wx4\" (UID: 
\"ee49f3db-ba4a-4c9c-893e-edde04607cf4\") " pod="openstack/nova-cell1-conductor-db-sync-b7wx4" Feb 02 10:58:35 crc kubenswrapper[4901]: I0202 10:58:35.368893 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-942xs\" (UniqueName: \"kubernetes.io/projected/ee49f3db-ba4a-4c9c-893e-edde04607cf4-kube-api-access-942xs\") pod \"nova-cell1-conductor-db-sync-b7wx4\" (UID: \"ee49f3db-ba4a-4c9c-893e-edde04607cf4\") " pod="openstack/nova-cell1-conductor-db-sync-b7wx4" Feb 02 10:58:35 crc kubenswrapper[4901]: I0202 10:58:35.369897 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee49f3db-ba4a-4c9c-893e-edde04607cf4-config-data\") pod \"nova-cell1-conductor-db-sync-b7wx4\" (UID: \"ee49f3db-ba4a-4c9c-893e-edde04607cf4\") " pod="openstack/nova-cell1-conductor-db-sync-b7wx4" Feb 02 10:58:35 crc kubenswrapper[4901]: I0202 10:58:35.472376 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee49f3db-ba4a-4c9c-893e-edde04607cf4-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-b7wx4\" (UID: \"ee49f3db-ba4a-4c9c-893e-edde04607cf4\") " pod="openstack/nova-cell1-conductor-db-sync-b7wx4" Feb 02 10:58:35 crc kubenswrapper[4901]: I0202 10:58:35.472444 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-942xs\" (UniqueName: \"kubernetes.io/projected/ee49f3db-ba4a-4c9c-893e-edde04607cf4-kube-api-access-942xs\") pod \"nova-cell1-conductor-db-sync-b7wx4\" (UID: \"ee49f3db-ba4a-4c9c-893e-edde04607cf4\") " pod="openstack/nova-cell1-conductor-db-sync-b7wx4" Feb 02 10:58:35 crc kubenswrapper[4901]: I0202 10:58:35.472592 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee49f3db-ba4a-4c9c-893e-edde04607cf4-config-data\") pod \"nova-cell1-conductor-db-sync-b7wx4\" (UID: \"ee49f3db-ba4a-4c9c-893e-edde04607cf4\") " pod="openstack/nova-cell1-conductor-db-sync-b7wx4" Feb 02 10:58:35 crc kubenswrapper[4901]: I0202 10:58:35.472633 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee49f3db-ba4a-4c9c-893e-edde04607cf4-scripts\") pod \"nova-cell1-conductor-db-sync-b7wx4\" (UID: \"ee49f3db-ba4a-4c9c-893e-edde04607cf4\") " pod="openstack/nova-cell1-conductor-db-sync-b7wx4" Feb 02 10:58:35 crc kubenswrapper[4901]: I0202 10:58:35.486519 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee49f3db-ba4a-4c9c-893e-edde04607cf4-config-data\") pod \"nova-cell1-conductor-db-sync-b7wx4\" (UID: \"ee49f3db-ba4a-4c9c-893e-edde04607cf4\") " pod="openstack/nova-cell1-conductor-db-sync-b7wx4" Feb 02 10:58:35 crc kubenswrapper[4901]: I0202 10:58:35.487553 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee49f3db-ba4a-4c9c-893e-edde04607cf4-scripts\") pod \"nova-cell1-conductor-db-sync-b7wx4\" (UID: \"ee49f3db-ba4a-4c9c-893e-edde04607cf4\") " pod="openstack/nova-cell1-conductor-db-sync-b7wx4" Feb 02 10:58:35 crc kubenswrapper[4901]: I0202 10:58:35.495802 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee49f3db-ba4a-4c9c-893e-edde04607cf4-combined-ca-bundle\") pod 
\"nova-cell1-conductor-db-sync-b7wx4\" (UID: \"ee49f3db-ba4a-4c9c-893e-edde04607cf4\") " pod="openstack/nova-cell1-conductor-db-sync-b7wx4" Feb 02 10:58:35 crc kubenswrapper[4901]: I0202 10:58:35.504343 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-942xs\" (UniqueName: \"kubernetes.io/projected/ee49f3db-ba4a-4c9c-893e-edde04607cf4-kube-api-access-942xs\") pod \"nova-cell1-conductor-db-sync-b7wx4\" (UID: \"ee49f3db-ba4a-4c9c-893e-edde04607cf4\") " pod="openstack/nova-cell1-conductor-db-sync-b7wx4" Feb 02 10:58:35 crc kubenswrapper[4901]: I0202 10:58:35.512832 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 10:58:35 crc kubenswrapper[4901]: I0202 10:58:35.542272 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-7x2x6"] Feb 02 10:58:35 crc kubenswrapper[4901]: I0202 10:58:35.621291 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-b7wx4" Feb 02 10:58:35 crc kubenswrapper[4901]: I0202 10:58:35.852985 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2700af84-ba53-4dbe-970b-13c8f398461e","Type":"ContainerStarted","Data":"e13b828bf74021888929822d188ba1348f183d62b4110c90b6c59662ec6dd0f0"} Feb 02 10:58:35 crc kubenswrapper[4901]: I0202 10:58:35.856154 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bab42463-80dc-4ab5-8af6-83ef6bc0bb43","Type":"ContainerStarted","Data":"c29afb1cdff6569cc02831a2a9dccc6c8c9bba785c65cd95ff8b88f83c15014b"} Feb 02 10:58:35 crc kubenswrapper[4901]: I0202 10:58:35.858152 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e5eae9ca-9953-4914-ae4a-538a87fd0ade","Type":"ContainerStarted","Data":"053696ff5e0ea6160afb5db0f3d4f51de6c260e408686b3c4ae4a020db6123eb"} Feb 02 10:58:35 crc kubenswrapper[4901]: I0202 10:58:35.860615 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-7x2x6" event={"ID":"6a9df7ea-8e3d-4c6e-a86d-793168703b18","Type":"ContainerStarted","Data":"41cd1989f4b53139f24917d8f70419b8e0147613619f056bf0b83bf0bcbc6bdf"} Feb 02 10:58:35 crc kubenswrapper[4901]: I0202 10:58:35.863385 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"12bc7db9-7d12-48e2-b452-6d15334cd44d","Type":"ContainerStarted","Data":"cfad305202aba39c458bc5f173200871836afd2e0357ae8c4eb2b8534b5852a7"} Feb 02 10:58:35 crc kubenswrapper[4901]: I0202 10:58:35.870467 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-px8gs" event={"ID":"d8358c85-6b94-4c56-911c-3396c7e780e5","Type":"ContainerStarted","Data":"6d8e3dd61cc735b767b199f84010d082f33ddc91756d3fb40b7c35b5efa4d229"} Feb 02 10:58:35 crc kubenswrapper[4901]: I0202 10:58:35.870535 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-px8gs" event={"ID":"d8358c85-6b94-4c56-911c-3396c7e780e5","Type":"ContainerStarted","Data":"975d4df1ab1454754e5e708c0621d800b6beef82dc552d8c4fd3e5fbf88ad9e4"} Feb 02 10:58:35 crc kubenswrapper[4901]: I0202 10:58:35.915104 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-px8gs" podStartSLOduration=2.915086725 podStartE2EDuration="2.915086725s" podCreationTimestamp="2026-02-02 10:58:33 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:58:35.912372169 +0000 UTC m=+1202.930712265" watchObservedRunningTime="2026-02-02 10:58:35.915086725 +0000 UTC m=+1202.933426821" Feb 02 10:58:36 crc kubenswrapper[4901]: I0202 10:58:36.160307 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-b7wx4"] Feb 02 10:58:36 crc kubenswrapper[4901]: I0202 10:58:36.882510 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-b7wx4" event={"ID":"ee49f3db-ba4a-4c9c-893e-edde04607cf4","Type":"ContainerStarted","Data":"9a9695c64faa468b97a3f94bfbf166c1e23c20daf90103f098e6af798c72619d"} Feb 02 10:58:36 crc kubenswrapper[4901]: I0202 10:58:36.883099 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-b7wx4" event={"ID":"ee49f3db-ba4a-4c9c-893e-edde04607cf4","Type":"ContainerStarted","Data":"af376027b0a461d7b06a45f55b0cdd524555bbb9518f8fb70fce8cf9d3a02578"} Feb 02 10:58:36 crc kubenswrapper[4901]: I0202 10:58:36.889668 4901 generic.go:334] "Generic (PLEG): container finished" podID="6a9df7ea-8e3d-4c6e-a86d-793168703b18" containerID="0fdcab80a1f6ff6d38fe4b10e43d9bbf61ee711fef0404c3942a34e50df64d54" exitCode=0 Feb 02 10:58:36 crc kubenswrapper[4901]: I0202 10:58:36.889854 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-7x2x6" event={"ID":"6a9df7ea-8e3d-4c6e-a86d-793168703b18","Type":"ContainerDied","Data":"0fdcab80a1f6ff6d38fe4b10e43d9bbf61ee711fef0404c3942a34e50df64d54"} Feb 02 10:58:36 crc kubenswrapper[4901]: I0202 10:58:36.921470 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-b7wx4" podStartSLOduration=1.921445558 podStartE2EDuration="1.921445558s" podCreationTimestamp="2026-02-02 10:58:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:58:36.913185934 +0000 UTC m=+1203.931526030" watchObservedRunningTime="2026-02-02 10:58:36.921445558 +0000 UTC m=+1203.939785654" Feb 02 10:58:37 crc kubenswrapper[4901]: I0202 10:58:37.512904 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:58:37 crc kubenswrapper[4901]: I0202 10:58:37.524729 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 10:58:37 crc kubenswrapper[4901]: I0202 10:58:37.837208 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:58:37 crc kubenswrapper[4901]: I0202 10:58:37.837526 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:58:39 crc kubenswrapper[4901]: I0202 10:58:39.943590 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e5eae9ca-9953-4914-ae4a-538a87fd0ade","Type":"ContainerStarted","Data":"f39002c9b0900e14d3d8b7c41879803cfcbec4db92e54425f9c0bc6557fa3b89"} 
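The pod_startup_latency_tracker entries above can be cross-checked from their own fields: for nova-cell0-cell-mapping-px8gs, podStartE2EDuration="2.915086725s" is exactly watchObservedRunningTime minus podCreationTimestamp (the pulling timestamps are the Go zero value "0001-01-01 00:00:00", apparently because no image pull was needed), and in the nova-metadata-0 entry that follows, podStartSLOduration appears to be that E2E duration minus the pull window (lastFinishedPulling − firstStartedPulling). A minimal Go sketch that recomputes the first figure — the timestamp strings are copied verbatim from the log; the only assumption is the standard-library parse layout, and the " m=+..." monotonic-clock suffix has to be stripped before parsing:

```go
package main

import (
	"fmt"
	"strings"
	"time"
)

// Layout for the kubelet's logged wall-clock timestamps, e.g.
// "2026-02-02 10:58:35.915086725 +0000 UTC".
const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

// mustParse parses a logged timestamp, first dropping the " m=+..."
// monotonic-clock suffix, which time.Parse does not accept.
func mustParse(s string) time.Time {
	if i := strings.Index(s, " m=+"); i >= 0 {
		s = s[:i]
	}
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2026-02-02 10:58:33 +0000 UTC")
	observed := mustParse("2026-02-02 10:58:35.915086725 +0000 UTC m=+1202.933426821")
	// Prints 2.915086725s, matching the logged podStartE2EDuration
	// for openstack/nova-cell0-cell-mapping-px8gs.
	fmt.Println(observed.Sub(created))
}
```
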
Feb 02 10:58:39 crc kubenswrapper[4901]: I0202 10:58:39.944404 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e5eae9ca-9953-4914-ae4a-538a87fd0ade","Type":"ContainerStarted","Data":"cc8ef678f668808cab2f5a326e7ea7ec8c05cf893f363ac5d7552a4a329edf7e"} Feb 02 10:58:39 crc kubenswrapper[4901]: I0202 10:58:39.944616 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e5eae9ca-9953-4914-ae4a-538a87fd0ade" containerName="nova-metadata-log" containerID="cri-o://cc8ef678f668808cab2f5a326e7ea7ec8c05cf893f363ac5d7552a4a329edf7e" gracePeriod=30 Feb 02 10:58:39 crc kubenswrapper[4901]: I0202 10:58:39.944732 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e5eae9ca-9953-4914-ae4a-538a87fd0ade" containerName="nova-metadata-metadata" containerID="cri-o://f39002c9b0900e14d3d8b7c41879803cfcbec4db92e54425f9c0bc6557fa3b89" gracePeriod=30 Feb 02 10:58:39 crc kubenswrapper[4901]: I0202 10:58:39.972075 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-7x2x6" event={"ID":"6a9df7ea-8e3d-4c6e-a86d-793168703b18","Type":"ContainerStarted","Data":"6dc6444cbac5287705ac49e8acdb1327d0b1da8159860944a1d9947205f36de2"} Feb 02 10:58:39 crc kubenswrapper[4901]: I0202 10:58:39.975683 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-568d7fd7cf-7x2x6" Feb 02 10:58:39 crc kubenswrapper[4901]: I0202 10:58:39.978361 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"12bc7db9-7d12-48e2-b452-6d15334cd44d","Type":"ContainerStarted","Data":"c6b4f241bfe27a5294803bce75dca0ea160dd0f9e61c7319f6b26e3fa5bdf632"} Feb 02 10:58:39 crc kubenswrapper[4901]: I0202 10:58:39.978524 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"12bc7db9-7d12-48e2-b452-6d15334cd44d","Type":"ContainerStarted","Data":"6f88a25d90668bb0e82d62488d0e8b843bfcb93e5e2e53d14fe811d01a98fca4"} Feb 02 10:58:39 crc kubenswrapper[4901]: I0202 10:58:39.984919 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2700af84-ba53-4dbe-970b-13c8f398461e","Type":"ContainerStarted","Data":"05912d87d9bd02f1cc666a84703c2c5e0534df3ce47a1a7c4fdcccab3e79f089"} Feb 02 10:58:39 crc kubenswrapper[4901]: I0202 10:58:39.985208 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="2700af84-ba53-4dbe-970b-13c8f398461e" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://05912d87d9bd02f1cc666a84703c2c5e0534df3ce47a1a7c4fdcccab3e79f089" gracePeriod=30 Feb 02 10:58:40 crc kubenswrapper[4901]: I0202 10:58:40.002266 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bab42463-80dc-4ab5-8af6-83ef6bc0bb43","Type":"ContainerStarted","Data":"852f8d30c831a388279063e8dd2ae055c8ae05d97ccaed50898d0d2a68402965"} Feb 02 10:58:40 crc kubenswrapper[4901]: I0202 10:58:40.008582 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.311907962 podStartE2EDuration="7.008534406s" podCreationTimestamp="2026-02-02 10:58:33 +0000 UTC" firstStartedPulling="2026-02-02 10:58:35.111431577 +0000 UTC m=+1202.129771663" lastFinishedPulling="2026-02-02 10:58:38.808058011 +0000 UTC m=+1205.826398107" 
observedRunningTime="2026-02-02 10:58:40.005938832 +0000 UTC m=+1207.024278948" watchObservedRunningTime="2026-02-02 10:58:40.008534406 +0000 UTC m=+1207.026874512" Feb 02 10:58:40 crc kubenswrapper[4901]: I0202 10:58:40.036583 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.222415501 podStartE2EDuration="7.036534424s" podCreationTimestamp="2026-02-02 10:58:33 +0000 UTC" firstStartedPulling="2026-02-02 10:58:34.993946288 +0000 UTC m=+1202.012286384" lastFinishedPulling="2026-02-02 10:58:38.808065211 +0000 UTC m=+1205.826405307" observedRunningTime="2026-02-02 10:58:40.028797734 +0000 UTC m=+1207.047137850" watchObservedRunningTime="2026-02-02 10:58:40.036534424 +0000 UTC m=+1207.054874520" Feb 02 10:58:40 crc kubenswrapper[4901]: I0202 10:58:40.059246 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.780848331 podStartE2EDuration="6.059220722s" podCreationTimestamp="2026-02-02 10:58:34 +0000 UTC" firstStartedPulling="2026-02-02 10:58:35.531147856 +0000 UTC m=+1202.549487952" lastFinishedPulling="2026-02-02 10:58:38.809520237 +0000 UTC m=+1205.827860343" observedRunningTime="2026-02-02 10:58:40.048690313 +0000 UTC m=+1207.067030409" watchObservedRunningTime="2026-02-02 10:58:40.059220722 +0000 UTC m=+1207.077560808" Feb 02 10:58:40 crc kubenswrapper[4901]: I0202 10:58:40.090540 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-568d7fd7cf-7x2x6" podStartSLOduration=6.090518471 podStartE2EDuration="6.090518471s" podCreationTimestamp="2026-02-02 10:58:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:58:40.074410686 +0000 UTC m=+1207.092750782" watchObservedRunningTime="2026-02-02 10:58:40.090518471 +0000 UTC m=+1207.108858567" Feb 02 10:58:40 crc kubenswrapper[4901]: I0202 10:58:40.125878 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.657343316 podStartE2EDuration="7.125855841s" podCreationTimestamp="2026-02-02 10:58:33 +0000 UTC" firstStartedPulling="2026-02-02 10:58:35.34336432 +0000 UTC m=+1202.361704416" lastFinishedPulling="2026-02-02 10:58:38.811876835 +0000 UTC m=+1205.830216941" observedRunningTime="2026-02-02 10:58:40.117051233 +0000 UTC m=+1207.135391319" watchObservedRunningTime="2026-02-02 10:58:40.125855841 +0000 UTC m=+1207.144195937" Feb 02 10:58:40 crc kubenswrapper[4901]: I0202 10:58:40.449475 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 10:58:40 crc kubenswrapper[4901]: I0202 10:58:40.449809 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="e251c5c1-5608-4bf0-8bfa-c056084f0bc0" containerName="kube-state-metrics" containerID="cri-o://31f7c790d300e6854aab92a4f38bb256be6a94388f51d165d89e6e9de91654d9" gracePeriod=30 Feb 02 10:58:40 crc kubenswrapper[4901]: I0202 10:58:40.651779 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 10:58:40 crc kubenswrapper[4901]: I0202 10:58:40.804914 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5eae9ca-9953-4914-ae4a-538a87fd0ade-combined-ca-bundle\") pod \"e5eae9ca-9953-4914-ae4a-538a87fd0ade\" (UID: \"e5eae9ca-9953-4914-ae4a-538a87fd0ade\") " Feb 02 10:58:40 crc kubenswrapper[4901]: I0202 10:58:40.804974 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8p5ff\" (UniqueName: \"kubernetes.io/projected/e5eae9ca-9953-4914-ae4a-538a87fd0ade-kube-api-access-8p5ff\") pod \"e5eae9ca-9953-4914-ae4a-538a87fd0ade\" (UID: \"e5eae9ca-9953-4914-ae4a-538a87fd0ade\") " Feb 02 10:58:40 crc kubenswrapper[4901]: I0202 10:58:40.807599 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5eae9ca-9953-4914-ae4a-538a87fd0ade-logs\") pod \"e5eae9ca-9953-4914-ae4a-538a87fd0ade\" (UID: \"e5eae9ca-9953-4914-ae4a-538a87fd0ade\") " Feb 02 10:58:40 crc kubenswrapper[4901]: I0202 10:58:40.807665 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5eae9ca-9953-4914-ae4a-538a87fd0ade-config-data\") pod \"e5eae9ca-9953-4914-ae4a-538a87fd0ade\" (UID: \"e5eae9ca-9953-4914-ae4a-538a87fd0ade\") " Feb 02 10:58:40 crc kubenswrapper[4901]: I0202 10:58:40.809792 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5eae9ca-9953-4914-ae4a-538a87fd0ade-logs" (OuterVolumeSpecName: "logs") pod "e5eae9ca-9953-4914-ae4a-538a87fd0ade" (UID: "e5eae9ca-9953-4914-ae4a-538a87fd0ade"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:58:40 crc kubenswrapper[4901]: I0202 10:58:40.812784 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5eae9ca-9953-4914-ae4a-538a87fd0ade-kube-api-access-8p5ff" (OuterVolumeSpecName: "kube-api-access-8p5ff") pod "e5eae9ca-9953-4914-ae4a-538a87fd0ade" (UID: "e5eae9ca-9953-4914-ae4a-538a87fd0ade"). InnerVolumeSpecName "kube-api-access-8p5ff". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:58:40 crc kubenswrapper[4901]: I0202 10:58:40.852654 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5eae9ca-9953-4914-ae4a-538a87fd0ade-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5eae9ca-9953-4914-ae4a-538a87fd0ade" (UID: "e5eae9ca-9953-4914-ae4a-538a87fd0ade"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:58:40 crc kubenswrapper[4901]: I0202 10:58:40.855349 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5eae9ca-9953-4914-ae4a-538a87fd0ade-config-data" (OuterVolumeSpecName: "config-data") pod "e5eae9ca-9953-4914-ae4a-538a87fd0ade" (UID: "e5eae9ca-9953-4914-ae4a-538a87fd0ade"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:58:40 crc kubenswrapper[4901]: I0202 10:58:40.911221 4901 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5eae9ca-9953-4914-ae4a-538a87fd0ade-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:40 crc kubenswrapper[4901]: I0202 10:58:40.911254 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5eae9ca-9953-4914-ae4a-538a87fd0ade-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:40 crc kubenswrapper[4901]: I0202 10:58:40.911263 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5eae9ca-9953-4914-ae4a-538a87fd0ade-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:40 crc kubenswrapper[4901]: I0202 10:58:40.911275 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8p5ff\" (UniqueName: \"kubernetes.io/projected/e5eae9ca-9953-4914-ae4a-538a87fd0ade-kube-api-access-8p5ff\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:40 crc kubenswrapper[4901]: I0202 10:58:40.937852 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.015091 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b58hw\" (UniqueName: \"kubernetes.io/projected/e251c5c1-5608-4bf0-8bfa-c056084f0bc0-kube-api-access-b58hw\") pod \"e251c5c1-5608-4bf0-8bfa-c056084f0bc0\" (UID: \"e251c5c1-5608-4bf0-8bfa-c056084f0bc0\") " Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.023353 4901 generic.go:334] "Generic (PLEG): container finished" podID="e5eae9ca-9953-4914-ae4a-538a87fd0ade" containerID="f39002c9b0900e14d3d8b7c41879803cfcbec4db92e54425f9c0bc6557fa3b89" exitCode=0 Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.023408 4901 generic.go:334] "Generic (PLEG): container finished" podID="e5eae9ca-9953-4914-ae4a-538a87fd0ade" containerID="cc8ef678f668808cab2f5a326e7ea7ec8c05cf893f363ac5d7552a4a329edf7e" exitCode=143 Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.023503 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e5eae9ca-9953-4914-ae4a-538a87fd0ade","Type":"ContainerDied","Data":"f39002c9b0900e14d3d8b7c41879803cfcbec4db92e54425f9c0bc6557fa3b89"} Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.023548 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e5eae9ca-9953-4914-ae4a-538a87fd0ade","Type":"ContainerDied","Data":"cc8ef678f668808cab2f5a326e7ea7ec8c05cf893f363ac5d7552a4a329edf7e"} Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.023667 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e5eae9ca-9953-4914-ae4a-538a87fd0ade","Type":"ContainerDied","Data":"053696ff5e0ea6160afb5db0f3d4f51de6c260e408686b3c4ae4a020db6123eb"} Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.023703 4901 scope.go:117] "RemoveContainer" containerID="f39002c9b0900e14d3d8b7c41879803cfcbec4db92e54425f9c0bc6557fa3b89" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.023957 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.031480 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e251c5c1-5608-4bf0-8bfa-c056084f0bc0-kube-api-access-b58hw" (OuterVolumeSpecName: "kube-api-access-b58hw") pod "e251c5c1-5608-4bf0-8bfa-c056084f0bc0" (UID: "e251c5c1-5608-4bf0-8bfa-c056084f0bc0"). InnerVolumeSpecName "kube-api-access-b58hw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.033371 4901 generic.go:334] "Generic (PLEG): container finished" podID="e251c5c1-5608-4bf0-8bfa-c056084f0bc0" containerID="31f7c790d300e6854aab92a4f38bb256be6a94388f51d165d89e6e9de91654d9" exitCode=2 Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.033446 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e251c5c1-5608-4bf0-8bfa-c056084f0bc0","Type":"ContainerDied","Data":"31f7c790d300e6854aab92a4f38bb256be6a94388f51d165d89e6e9de91654d9"} Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.033476 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.033516 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e251c5c1-5608-4bf0-8bfa-c056084f0bc0","Type":"ContainerDied","Data":"9b85bfe6b1bb8263cd7c1ef1c158b9b5e0eaf1897bf376396954c00fb954d798"} Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.118001 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b58hw\" (UniqueName: \"kubernetes.io/projected/e251c5c1-5608-4bf0-8bfa-c056084f0bc0-kube-api-access-b58hw\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.128135 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.131088 4901 scope.go:117] "RemoveContainer" containerID="cc8ef678f668808cab2f5a326e7ea7ec8c05cf893f363ac5d7552a4a329edf7e" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.160909 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.187405 4901 scope.go:117] "RemoveContainer" containerID="f39002c9b0900e14d3d8b7c41879803cfcbec4db92e54425f9c0bc6557fa3b89" Feb 02 10:58:41 crc kubenswrapper[4901]: E0202 10:58:41.192805 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f39002c9b0900e14d3d8b7c41879803cfcbec4db92e54425f9c0bc6557fa3b89\": container with ID starting with f39002c9b0900e14d3d8b7c41879803cfcbec4db92e54425f9c0bc6557fa3b89 not found: ID does not exist" containerID="f39002c9b0900e14d3d8b7c41879803cfcbec4db92e54425f9c0bc6557fa3b89" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.192853 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f39002c9b0900e14d3d8b7c41879803cfcbec4db92e54425f9c0bc6557fa3b89"} err="failed to get container status \"f39002c9b0900e14d3d8b7c41879803cfcbec4db92e54425f9c0bc6557fa3b89\": rpc error: code = NotFound desc = could not find container \"f39002c9b0900e14d3d8b7c41879803cfcbec4db92e54425f9c0bc6557fa3b89\": container with ID starting with f39002c9b0900e14d3d8b7c41879803cfcbec4db92e54425f9c0bc6557fa3b89 not found: ID does 
not exist" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.192881 4901 scope.go:117] "RemoveContainer" containerID="cc8ef678f668808cab2f5a326e7ea7ec8c05cf893f363ac5d7552a4a329edf7e" Feb 02 10:58:41 crc kubenswrapper[4901]: E0202 10:58:41.193128 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc8ef678f668808cab2f5a326e7ea7ec8c05cf893f363ac5d7552a4a329edf7e\": container with ID starting with cc8ef678f668808cab2f5a326e7ea7ec8c05cf893f363ac5d7552a4a329edf7e not found: ID does not exist" containerID="cc8ef678f668808cab2f5a326e7ea7ec8c05cf893f363ac5d7552a4a329edf7e" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.193156 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc8ef678f668808cab2f5a326e7ea7ec8c05cf893f363ac5d7552a4a329edf7e"} err="failed to get container status \"cc8ef678f668808cab2f5a326e7ea7ec8c05cf893f363ac5d7552a4a329edf7e\": rpc error: code = NotFound desc = could not find container \"cc8ef678f668808cab2f5a326e7ea7ec8c05cf893f363ac5d7552a4a329edf7e\": container with ID starting with cc8ef678f668808cab2f5a326e7ea7ec8c05cf893f363ac5d7552a4a329edf7e not found: ID does not exist" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.193172 4901 scope.go:117] "RemoveContainer" containerID="f39002c9b0900e14d3d8b7c41879803cfcbec4db92e54425f9c0bc6557fa3b89" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.193366 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f39002c9b0900e14d3d8b7c41879803cfcbec4db92e54425f9c0bc6557fa3b89"} err="failed to get container status \"f39002c9b0900e14d3d8b7c41879803cfcbec4db92e54425f9c0bc6557fa3b89\": rpc error: code = NotFound desc = could not find container \"f39002c9b0900e14d3d8b7c41879803cfcbec4db92e54425f9c0bc6557fa3b89\": container with ID starting with f39002c9b0900e14d3d8b7c41879803cfcbec4db92e54425f9c0bc6557fa3b89 not found: ID does not exist" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.193388 4901 scope.go:117] "RemoveContainer" containerID="cc8ef678f668808cab2f5a326e7ea7ec8c05cf893f363ac5d7552a4a329edf7e" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.193858 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc8ef678f668808cab2f5a326e7ea7ec8c05cf893f363ac5d7552a4a329edf7e"} err="failed to get container status \"cc8ef678f668808cab2f5a326e7ea7ec8c05cf893f363ac5d7552a4a329edf7e\": rpc error: code = NotFound desc = could not find container \"cc8ef678f668808cab2f5a326e7ea7ec8c05cf893f363ac5d7552a4a329edf7e\": container with ID starting with cc8ef678f668808cab2f5a326e7ea7ec8c05cf893f363ac5d7552a4a329edf7e not found: ID does not exist" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.193916 4901 scope.go:117] "RemoveContainer" containerID="31f7c790d300e6854aab92a4f38bb256be6a94388f51d165d89e6e9de91654d9" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.204094 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.232895 4901 scope.go:117] "RemoveContainer" containerID="31f7c790d300e6854aab92a4f38bb256be6a94388f51d165d89e6e9de91654d9" Feb 02 10:58:41 crc kubenswrapper[4901]: E0202 10:58:41.233577 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"31f7c790d300e6854aab92a4f38bb256be6a94388f51d165d89e6e9de91654d9\": container with ID starting with 31f7c790d300e6854aab92a4f38bb256be6a94388f51d165d89e6e9de91654d9 not found: ID does not exist" containerID="31f7c790d300e6854aab92a4f38bb256be6a94388f51d165d89e6e9de91654d9" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.233626 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31f7c790d300e6854aab92a4f38bb256be6a94388f51d165d89e6e9de91654d9"} err="failed to get container status \"31f7c790d300e6854aab92a4f38bb256be6a94388f51d165d89e6e9de91654d9\": rpc error: code = NotFound desc = could not find container \"31f7c790d300e6854aab92a4f38bb256be6a94388f51d165d89e6e9de91654d9\": container with ID starting with 31f7c790d300e6854aab92a4f38bb256be6a94388f51d165d89e6e9de91654d9 not found: ID does not exist" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.237636 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.248654 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:58:41 crc kubenswrapper[4901]: E0202 10:58:41.249290 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e251c5c1-5608-4bf0-8bfa-c056084f0bc0" containerName="kube-state-metrics" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.249312 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="e251c5c1-5608-4bf0-8bfa-c056084f0bc0" containerName="kube-state-metrics" Feb 02 10:58:41 crc kubenswrapper[4901]: E0202 10:58:41.249332 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5eae9ca-9953-4914-ae4a-538a87fd0ade" containerName="nova-metadata-metadata" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.249339 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5eae9ca-9953-4914-ae4a-538a87fd0ade" containerName="nova-metadata-metadata" Feb 02 10:58:41 crc kubenswrapper[4901]: E0202 10:58:41.249367 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5eae9ca-9953-4914-ae4a-538a87fd0ade" containerName="nova-metadata-log" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.249374 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5eae9ca-9953-4914-ae4a-538a87fd0ade" containerName="nova-metadata-log" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.249607 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="e251c5c1-5608-4bf0-8bfa-c056084f0bc0" containerName="kube-state-metrics" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.249639 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5eae9ca-9953-4914-ae4a-538a87fd0ade" containerName="nova-metadata-metadata" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.249656 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5eae9ca-9953-4914-ae4a-538a87fd0ade" containerName="nova-metadata-log" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.251033 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.255088 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.255399 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.279153 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.298985 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.301132 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.303346 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.304179 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.315631 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.322500 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jsxb\" (UniqueName: \"kubernetes.io/projected/476da10a-4dfe-42fa-825d-e6a0db89271d-kube-api-access-2jsxb\") pod \"nova-metadata-0\" (UID: \"476da10a-4dfe-42fa-825d-e6a0db89271d\") " pod="openstack/nova-metadata-0" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.322616 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/476da10a-4dfe-42fa-825d-e6a0db89271d-config-data\") pod \"nova-metadata-0\" (UID: \"476da10a-4dfe-42fa-825d-e6a0db89271d\") " pod="openstack/nova-metadata-0" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.322699 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/476da10a-4dfe-42fa-825d-e6a0db89271d-logs\") pod \"nova-metadata-0\" (UID: \"476da10a-4dfe-42fa-825d-e6a0db89271d\") " pod="openstack/nova-metadata-0" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.322725 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/476da10a-4dfe-42fa-825d-e6a0db89271d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"476da10a-4dfe-42fa-825d-e6a0db89271d\") " pod="openstack/nova-metadata-0" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.322748 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/476da10a-4dfe-42fa-825d-e6a0db89271d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"476da10a-4dfe-42fa-825d-e6a0db89271d\") " pod="openstack/nova-metadata-0" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.424585 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb68r\" (UniqueName: 
\"kubernetes.io/projected/8faaac48-521a-47fc-b480-d941fd41be94-kube-api-access-bb68r\") pod \"kube-state-metrics-0\" (UID: \"8faaac48-521a-47fc-b480-d941fd41be94\") " pod="openstack/kube-state-metrics-0" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.424666 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8faaac48-521a-47fc-b480-d941fd41be94-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8faaac48-521a-47fc-b480-d941fd41be94\") " pod="openstack/kube-state-metrics-0" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.424697 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/476da10a-4dfe-42fa-825d-e6a0db89271d-logs\") pod \"nova-metadata-0\" (UID: \"476da10a-4dfe-42fa-825d-e6a0db89271d\") " pod="openstack/nova-metadata-0" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.424719 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/476da10a-4dfe-42fa-825d-e6a0db89271d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"476da10a-4dfe-42fa-825d-e6a0db89271d\") " pod="openstack/nova-metadata-0" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.424740 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/476da10a-4dfe-42fa-825d-e6a0db89271d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"476da10a-4dfe-42fa-825d-e6a0db89271d\") " pod="openstack/nova-metadata-0" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.424811 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8faaac48-521a-47fc-b480-d941fd41be94-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8faaac48-521a-47fc-b480-d941fd41be94\") " pod="openstack/kube-state-metrics-0" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.424839 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8faaac48-521a-47fc-b480-d941fd41be94-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8faaac48-521a-47fc-b480-d941fd41be94\") " pod="openstack/kube-state-metrics-0" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.424870 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jsxb\" (UniqueName: \"kubernetes.io/projected/476da10a-4dfe-42fa-825d-e6a0db89271d-kube-api-access-2jsxb\") pod \"nova-metadata-0\" (UID: \"476da10a-4dfe-42fa-825d-e6a0db89271d\") " pod="openstack/nova-metadata-0" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.424910 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/476da10a-4dfe-42fa-825d-e6a0db89271d-config-data\") pod \"nova-metadata-0\" (UID: \"476da10a-4dfe-42fa-825d-e6a0db89271d\") " pod="openstack/nova-metadata-0" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.425336 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/476da10a-4dfe-42fa-825d-e6a0db89271d-logs\") pod \"nova-metadata-0\" (UID: 
\"476da10a-4dfe-42fa-825d-e6a0db89271d\") " pod="openstack/nova-metadata-0" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.430179 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/476da10a-4dfe-42fa-825d-e6a0db89271d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"476da10a-4dfe-42fa-825d-e6a0db89271d\") " pod="openstack/nova-metadata-0" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.430858 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/476da10a-4dfe-42fa-825d-e6a0db89271d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"476da10a-4dfe-42fa-825d-e6a0db89271d\") " pod="openstack/nova-metadata-0" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.439483 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/476da10a-4dfe-42fa-825d-e6a0db89271d-config-data\") pod \"nova-metadata-0\" (UID: \"476da10a-4dfe-42fa-825d-e6a0db89271d\") " pod="openstack/nova-metadata-0" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.443498 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jsxb\" (UniqueName: \"kubernetes.io/projected/476da10a-4dfe-42fa-825d-e6a0db89271d-kube-api-access-2jsxb\") pod \"nova-metadata-0\" (UID: \"476da10a-4dfe-42fa-825d-e6a0db89271d\") " pod="openstack/nova-metadata-0" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.527415 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8faaac48-521a-47fc-b480-d941fd41be94-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8faaac48-521a-47fc-b480-d941fd41be94\") " pod="openstack/kube-state-metrics-0" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.527600 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8faaac48-521a-47fc-b480-d941fd41be94-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8faaac48-521a-47fc-b480-d941fd41be94\") " pod="openstack/kube-state-metrics-0" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.527646 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8faaac48-521a-47fc-b480-d941fd41be94-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8faaac48-521a-47fc-b480-d941fd41be94\") " pod="openstack/kube-state-metrics-0" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.527769 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bb68r\" (UniqueName: \"kubernetes.io/projected/8faaac48-521a-47fc-b480-d941fd41be94-kube-api-access-bb68r\") pod \"kube-state-metrics-0\" (UID: \"8faaac48-521a-47fc-b480-d941fd41be94\") " pod="openstack/kube-state-metrics-0" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.535398 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8faaac48-521a-47fc-b480-d941fd41be94-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8faaac48-521a-47fc-b480-d941fd41be94\") " pod="openstack/kube-state-metrics-0" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.535575 4901 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8faaac48-521a-47fc-b480-d941fd41be94-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8faaac48-521a-47fc-b480-d941fd41be94\") " pod="openstack/kube-state-metrics-0" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.535909 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8faaac48-521a-47fc-b480-d941fd41be94-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8faaac48-521a-47fc-b480-d941fd41be94\") " pod="openstack/kube-state-metrics-0" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.546643 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb68r\" (UniqueName: \"kubernetes.io/projected/8faaac48-521a-47fc-b480-d941fd41be94-kube-api-access-bb68r\") pod \"kube-state-metrics-0\" (UID: \"8faaac48-521a-47fc-b480-d941fd41be94\") " pod="openstack/kube-state-metrics-0" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.590104 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.627581 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.719875 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e251c5c1-5608-4bf0-8bfa-c056084f0bc0" path="/var/lib/kubelet/pods/e251c5c1-5608-4bf0-8bfa-c056084f0bc0/volumes" Feb 02 10:58:41 crc kubenswrapper[4901]: I0202 10:58:41.720968 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5eae9ca-9953-4914-ae4a-538a87fd0ade" path="/var/lib/kubelet/pods/e5eae9ca-9953-4914-ae4a-538a87fd0ade/volumes" Feb 02 10:58:42 crc kubenswrapper[4901]: I0202 10:58:42.163481 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:58:42 crc kubenswrapper[4901]: W0202 10:58:42.261027 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8faaac48_521a_47fc_b480_d941fd41be94.slice/crio-23d998a746405c48730557651d36c401fcbbb4b7b98eb231e7ba01e8be09871b WatchSource:0}: Error finding container 23d998a746405c48730557651d36c401fcbbb4b7b98eb231e7ba01e8be09871b: Status 404 returned error can't find the container with id 23d998a746405c48730557651d36c401fcbbb4b7b98eb231e7ba01e8be09871b Feb 02 10:58:42 crc kubenswrapper[4901]: I0202 10:58:42.261351 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 10:58:42 crc kubenswrapper[4901]: I0202 10:58:42.812575 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:58:42 crc kubenswrapper[4901]: I0202 10:58:42.813829 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="52c8aa9b-cedf-433f-8e40-5156c9fe53a2" containerName="ceilometer-central-agent" containerID="cri-o://886da255e8b6a73f747ab2cd28e7758e6839557ec3753c9b84d80ad75f5d8a3b" gracePeriod=30 Feb 02 10:58:42 crc kubenswrapper[4901]: I0202 10:58:42.814034 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="52c8aa9b-cedf-433f-8e40-5156c9fe53a2" containerName="proxy-httpd" 
containerID="cri-o://20248b62952ce61096e356fe301f645e3907350faadaeb442d1b5a275bcc5778" gracePeriod=30 Feb 02 10:58:42 crc kubenswrapper[4901]: I0202 10:58:42.814079 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="52c8aa9b-cedf-433f-8e40-5156c9fe53a2" containerName="sg-core" containerID="cri-o://b8c9b9f60a3ed74256dc7356d2cd948b416caa009181ebedb62391c03cd127bd" gracePeriod=30 Feb 02 10:58:42 crc kubenswrapper[4901]: I0202 10:58:42.814118 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="52c8aa9b-cedf-433f-8e40-5156c9fe53a2" containerName="ceilometer-notification-agent" containerID="cri-o://9132fc15a1f5f25c06a7af5e0d584903a35a1a3103f4534782d25d59ebfd52ce" gracePeriod=30 Feb 02 10:58:43 crc kubenswrapper[4901]: I0202 10:58:43.072870 4901 generic.go:334] "Generic (PLEG): container finished" podID="52c8aa9b-cedf-433f-8e40-5156c9fe53a2" containerID="20248b62952ce61096e356fe301f645e3907350faadaeb442d1b5a275bcc5778" exitCode=0 Feb 02 10:58:43 crc kubenswrapper[4901]: I0202 10:58:43.072904 4901 generic.go:334] "Generic (PLEG): container finished" podID="52c8aa9b-cedf-433f-8e40-5156c9fe53a2" containerID="b8c9b9f60a3ed74256dc7356d2cd948b416caa009181ebedb62391c03cd127bd" exitCode=2 Feb 02 10:58:43 crc kubenswrapper[4901]: I0202 10:58:43.072947 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52c8aa9b-cedf-433f-8e40-5156c9fe53a2","Type":"ContainerDied","Data":"20248b62952ce61096e356fe301f645e3907350faadaeb442d1b5a275bcc5778"} Feb 02 10:58:43 crc kubenswrapper[4901]: I0202 10:58:43.072976 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52c8aa9b-cedf-433f-8e40-5156c9fe53a2","Type":"ContainerDied","Data":"b8c9b9f60a3ed74256dc7356d2cd948b416caa009181ebedb62391c03cd127bd"} Feb 02 10:58:43 crc kubenswrapper[4901]: I0202 10:58:43.094714 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"476da10a-4dfe-42fa-825d-e6a0db89271d","Type":"ContainerStarted","Data":"7cd17ba1680c0ef5e3311efde433ec60883fe60cf455daa6fc1d1558be94caf1"} Feb 02 10:58:43 crc kubenswrapper[4901]: I0202 10:58:43.094757 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"476da10a-4dfe-42fa-825d-e6a0db89271d","Type":"ContainerStarted","Data":"31bc3fd1464fec01a7092b7bb700830355eff0d7ad96e59649564f0abcac91fe"} Feb 02 10:58:43 crc kubenswrapper[4901]: I0202 10:58:43.094768 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"476da10a-4dfe-42fa-825d-e6a0db89271d","Type":"ContainerStarted","Data":"e127094b4e44619f7680de091216245add0044a4592821f78243cfcb84bad069"} Feb 02 10:58:43 crc kubenswrapper[4901]: I0202 10:58:43.105084 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8faaac48-521a-47fc-b480-d941fd41be94","Type":"ContainerStarted","Data":"637a809d111daaf3144d78c81d5b2fef39272c87af56b42eb7f17bb4f1fcc577"} Feb 02 10:58:43 crc kubenswrapper[4901]: I0202 10:58:43.105120 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8faaac48-521a-47fc-b480-d941fd41be94","Type":"ContainerStarted","Data":"23d998a746405c48730557651d36c401fcbbb4b7b98eb231e7ba01e8be09871b"} Feb 02 10:58:43 crc kubenswrapper[4901]: I0202 10:58:43.105702 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/kube-state-metrics-0" Feb 02 10:58:43 crc kubenswrapper[4901]: I0202 10:58:43.130650 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.130634335 podStartE2EDuration="2.130634335s" podCreationTimestamp="2026-02-02 10:58:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:58:43.12635105 +0000 UTC m=+1210.144691146" watchObservedRunningTime="2026-02-02 10:58:43.130634335 +0000 UTC m=+1210.148974431" Feb 02 10:58:43 crc kubenswrapper[4901]: I0202 10:58:43.154122 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.788197195 podStartE2EDuration="2.154105661s" podCreationTimestamp="2026-02-02 10:58:41 +0000 UTC" firstStartedPulling="2026-02-02 10:58:42.264857579 +0000 UTC m=+1209.283197675" lastFinishedPulling="2026-02-02 10:58:42.630766045 +0000 UTC m=+1209.649106141" observedRunningTime="2026-02-02 10:58:43.149868367 +0000 UTC m=+1210.168208493" watchObservedRunningTime="2026-02-02 10:58:43.154105661 +0000 UTC m=+1210.172445757" Feb 02 10:58:43 crc kubenswrapper[4901]: E0202 10:58:43.575424 4901 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8358c85_6b94_4c56_911c_3396c7e780e5.slice/crio-conmon-6d8e3dd61cc735b767b199f84010d082f33ddc91756d3fb40b7c35b5efa4d229.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8358c85_6b94_4c56_911c_3396c7e780e5.slice/crio-6d8e3dd61cc735b767b199f84010d082f33ddc91756d3fb40b7c35b5efa4d229.scope\": RecentStats: unable to find data in memory cache]" Feb 02 10:58:44 crc kubenswrapper[4901]: I0202 10:58:44.120130 4901 generic.go:334] "Generic (PLEG): container finished" podID="52c8aa9b-cedf-433f-8e40-5156c9fe53a2" containerID="886da255e8b6a73f747ab2cd28e7758e6839557ec3753c9b84d80ad75f5d8a3b" exitCode=0 Feb 02 10:58:44 crc kubenswrapper[4901]: I0202 10:58:44.120219 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52c8aa9b-cedf-433f-8e40-5156c9fe53a2","Type":"ContainerDied","Data":"886da255e8b6a73f747ab2cd28e7758e6839557ec3753c9b84d80ad75f5d8a3b"} Feb 02 10:58:44 crc kubenswrapper[4901]: I0202 10:58:44.122459 4901 generic.go:334] "Generic (PLEG): container finished" podID="d8358c85-6b94-4c56-911c-3396c7e780e5" containerID="6d8e3dd61cc735b767b199f84010d082f33ddc91756d3fb40b7c35b5efa4d229" exitCode=0 Feb 02 10:58:44 crc kubenswrapper[4901]: I0202 10:58:44.122543 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-px8gs" event={"ID":"d8358c85-6b94-4c56-911c-3396c7e780e5","Type":"ContainerDied","Data":"6d8e3dd61cc735b767b199f84010d082f33ddc91756d3fb40b7c35b5efa4d229"} Feb 02 10:58:44 crc kubenswrapper[4901]: I0202 10:58:44.200183 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 02 10:58:44 crc kubenswrapper[4901]: I0202 10:58:44.200225 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 02 10:58:44 crc kubenswrapper[4901]: I0202 10:58:44.253103 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 02 10:58:44 crc kubenswrapper[4901]: 
I0202 10:58:44.632259 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 02 10:58:44 crc kubenswrapper[4901]: I0202 10:58:44.632554 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 02 10:58:44 crc kubenswrapper[4901]: I0202 10:58:44.664814 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:58:44 crc kubenswrapper[4901]: I0202 10:58:44.711763 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-568d7fd7cf-7x2x6" Feb 02 10:58:44 crc kubenswrapper[4901]: I0202 10:58:44.777805 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-6kjl6"] Feb 02 10:58:44 crc kubenswrapper[4901]: I0202 10:58:44.778084 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-688b9f5b49-6kjl6" podUID="399febbb-a010-490d-bdf3-f80f74257dea" containerName="dnsmasq-dns" containerID="cri-o://fbe779a7fab448eab02904d83b4382a2af4150de16403dce77b56b73af504f09" gracePeriod=10 Feb 02 10:58:45 crc kubenswrapper[4901]: I0202 10:58:45.145369 4901 generic.go:334] "Generic (PLEG): container finished" podID="399febbb-a010-490d-bdf3-f80f74257dea" containerID="fbe779a7fab448eab02904d83b4382a2af4150de16403dce77b56b73af504f09" exitCode=0 Feb 02 10:58:45 crc kubenswrapper[4901]: I0202 10:58:45.145457 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-6kjl6" event={"ID":"399febbb-a010-490d-bdf3-f80f74257dea","Type":"ContainerDied","Data":"fbe779a7fab448eab02904d83b4382a2af4150de16403dce77b56b73af504f09"} Feb 02 10:58:45 crc kubenswrapper[4901]: I0202 10:58:45.149257 4901 generic.go:334] "Generic (PLEG): container finished" podID="ee49f3db-ba4a-4c9c-893e-edde04607cf4" containerID="9a9695c64faa468b97a3f94bfbf166c1e23c20daf90103f098e6af798c72619d" exitCode=0 Feb 02 10:58:45 crc kubenswrapper[4901]: I0202 10:58:45.150263 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-b7wx4" event={"ID":"ee49f3db-ba4a-4c9c-893e-edde04607cf4","Type":"ContainerDied","Data":"9a9695c64faa468b97a3f94bfbf166c1e23c20daf90103f098e6af798c72619d"} Feb 02 10:58:45 crc kubenswrapper[4901]: I0202 10:58:45.225478 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 02 10:58:45 crc kubenswrapper[4901]: I0202 10:58:45.339094 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688b9f5b49-6kjl6" Feb 02 10:58:45 crc kubenswrapper[4901]: I0202 10:58:45.415336 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhcxt\" (UniqueName: \"kubernetes.io/projected/399febbb-a010-490d-bdf3-f80f74257dea-kube-api-access-nhcxt\") pod \"399febbb-a010-490d-bdf3-f80f74257dea\" (UID: \"399febbb-a010-490d-bdf3-f80f74257dea\") " Feb 02 10:58:45 crc kubenswrapper[4901]: I0202 10:58:45.415434 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/399febbb-a010-490d-bdf3-f80f74257dea-config\") pod \"399febbb-a010-490d-bdf3-f80f74257dea\" (UID: \"399febbb-a010-490d-bdf3-f80f74257dea\") " Feb 02 10:58:45 crc kubenswrapper[4901]: I0202 10:58:45.415513 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/399febbb-a010-490d-bdf3-f80f74257dea-dns-svc\") pod \"399febbb-a010-490d-bdf3-f80f74257dea\" (UID: \"399febbb-a010-490d-bdf3-f80f74257dea\") " Feb 02 10:58:45 crc kubenswrapper[4901]: I0202 10:58:45.415583 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/399febbb-a010-490d-bdf3-f80f74257dea-dns-swift-storage-0\") pod \"399febbb-a010-490d-bdf3-f80f74257dea\" (UID: \"399febbb-a010-490d-bdf3-f80f74257dea\") " Feb 02 10:58:45 crc kubenswrapper[4901]: I0202 10:58:45.415633 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/399febbb-a010-490d-bdf3-f80f74257dea-ovsdbserver-nb\") pod \"399febbb-a010-490d-bdf3-f80f74257dea\" (UID: \"399febbb-a010-490d-bdf3-f80f74257dea\") " Feb 02 10:58:45 crc kubenswrapper[4901]: I0202 10:58:45.415668 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/399febbb-a010-490d-bdf3-f80f74257dea-ovsdbserver-sb\") pod \"399febbb-a010-490d-bdf3-f80f74257dea\" (UID: \"399febbb-a010-490d-bdf3-f80f74257dea\") " Feb 02 10:58:45 crc kubenswrapper[4901]: I0202 10:58:45.449492 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/399febbb-a010-490d-bdf3-f80f74257dea-kube-api-access-nhcxt" (OuterVolumeSpecName: "kube-api-access-nhcxt") pod "399febbb-a010-490d-bdf3-f80f74257dea" (UID: "399febbb-a010-490d-bdf3-f80f74257dea"). InnerVolumeSpecName "kube-api-access-nhcxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:58:45 crc kubenswrapper[4901]: I0202 10:58:45.506114 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/399febbb-a010-490d-bdf3-f80f74257dea-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "399febbb-a010-490d-bdf3-f80f74257dea" (UID: "399febbb-a010-490d-bdf3-f80f74257dea"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:58:45 crc kubenswrapper[4901]: I0202 10:58:45.509728 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/399febbb-a010-490d-bdf3-f80f74257dea-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "399febbb-a010-490d-bdf3-f80f74257dea" (UID: "399febbb-a010-490d-bdf3-f80f74257dea"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:58:45 crc kubenswrapper[4901]: I0202 10:58:45.515351 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/399febbb-a010-490d-bdf3-f80f74257dea-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "399febbb-a010-490d-bdf3-f80f74257dea" (UID: "399febbb-a010-490d-bdf3-f80f74257dea"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:58:45 crc kubenswrapper[4901]: I0202 10:58:45.518966 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhcxt\" (UniqueName: \"kubernetes.io/projected/399febbb-a010-490d-bdf3-f80f74257dea-kube-api-access-nhcxt\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:45 crc kubenswrapper[4901]: I0202 10:58:45.518991 4901 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/399febbb-a010-490d-bdf3-f80f74257dea-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:45 crc kubenswrapper[4901]: I0202 10:58:45.519002 4901 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/399febbb-a010-490d-bdf3-f80f74257dea-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:45 crc kubenswrapper[4901]: I0202 10:58:45.519011 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/399febbb-a010-490d-bdf3-f80f74257dea-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:45 crc kubenswrapper[4901]: I0202 10:58:45.530493 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/399febbb-a010-490d-bdf3-f80f74257dea-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "399febbb-a010-490d-bdf3-f80f74257dea" (UID: "399febbb-a010-490d-bdf3-f80f74257dea"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:58:45 crc kubenswrapper[4901]: I0202 10:58:45.556744 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/399febbb-a010-490d-bdf3-f80f74257dea-config" (OuterVolumeSpecName: "config") pod "399febbb-a010-490d-bdf3-f80f74257dea" (UID: "399febbb-a010-490d-bdf3-f80f74257dea"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:58:45 crc kubenswrapper[4901]: I0202 10:58:45.557789 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-px8gs" Feb 02 10:58:45 crc kubenswrapper[4901]: I0202 10:58:45.620393 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mtgl\" (UniqueName: \"kubernetes.io/projected/d8358c85-6b94-4c56-911c-3396c7e780e5-kube-api-access-8mtgl\") pod \"d8358c85-6b94-4c56-911c-3396c7e780e5\" (UID: \"d8358c85-6b94-4c56-911c-3396c7e780e5\") " Feb 02 10:58:45 crc kubenswrapper[4901]: I0202 10:58:45.620807 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8358c85-6b94-4c56-911c-3396c7e780e5-config-data\") pod \"d8358c85-6b94-4c56-911c-3396c7e780e5\" (UID: \"d8358c85-6b94-4c56-911c-3396c7e780e5\") " Feb 02 10:58:45 crc kubenswrapper[4901]: I0202 10:58:45.620953 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8358c85-6b94-4c56-911c-3396c7e780e5-scripts\") pod \"d8358c85-6b94-4c56-911c-3396c7e780e5\" (UID: \"d8358c85-6b94-4c56-911c-3396c7e780e5\") " Feb 02 10:58:45 crc kubenswrapper[4901]: I0202 10:58:45.621059 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8358c85-6b94-4c56-911c-3396c7e780e5-combined-ca-bundle\") pod \"d8358c85-6b94-4c56-911c-3396c7e780e5\" (UID: \"d8358c85-6b94-4c56-911c-3396c7e780e5\") " Feb 02 10:58:45 crc kubenswrapper[4901]: I0202 10:58:45.621700 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/399febbb-a010-490d-bdf3-f80f74257dea-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:45 crc kubenswrapper[4901]: I0202 10:58:45.621783 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/399febbb-a010-490d-bdf3-f80f74257dea-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:45 crc kubenswrapper[4901]: I0202 10:58:45.624243 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8358c85-6b94-4c56-911c-3396c7e780e5-kube-api-access-8mtgl" (OuterVolumeSpecName: "kube-api-access-8mtgl") pod "d8358c85-6b94-4c56-911c-3396c7e780e5" (UID: "d8358c85-6b94-4c56-911c-3396c7e780e5"). InnerVolumeSpecName "kube-api-access-8mtgl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:58:45 crc kubenswrapper[4901]: I0202 10:58:45.624505 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8358c85-6b94-4c56-911c-3396c7e780e5-scripts" (OuterVolumeSpecName: "scripts") pod "d8358c85-6b94-4c56-911c-3396c7e780e5" (UID: "d8358c85-6b94-4c56-911c-3396c7e780e5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:58:45 crc kubenswrapper[4901]: I0202 10:58:45.646946 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8358c85-6b94-4c56-911c-3396c7e780e5-config-data" (OuterVolumeSpecName: "config-data") pod "d8358c85-6b94-4c56-911c-3396c7e780e5" (UID: "d8358c85-6b94-4c56-911c-3396c7e780e5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:58:45 crc kubenswrapper[4901]: I0202 10:58:45.648079 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8358c85-6b94-4c56-911c-3396c7e780e5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8358c85-6b94-4c56-911c-3396c7e780e5" (UID: "d8358c85-6b94-4c56-911c-3396c7e780e5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:58:45 crc kubenswrapper[4901]: I0202 10:58:45.672775 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="12bc7db9-7d12-48e2-b452-6d15334cd44d" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.200:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 02 10:58:45 crc kubenswrapper[4901]: I0202 10:58:45.713748 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="12bc7db9-7d12-48e2-b452-6d15334cd44d" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.200:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 02 10:58:45 crc kubenswrapper[4901]: I0202 10:58:45.723717 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8358c85-6b94-4c56-911c-3396c7e780e5-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:45 crc kubenswrapper[4901]: I0202 10:58:45.723756 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8358c85-6b94-4c56-911c-3396c7e780e5-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:45 crc kubenswrapper[4901]: I0202 10:58:45.723770 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8358c85-6b94-4c56-911c-3396c7e780e5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:45 crc kubenswrapper[4901]: I0202 10:58:45.723783 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mtgl\" (UniqueName: \"kubernetes.io/projected/d8358c85-6b94-4c56-911c-3396c7e780e5-kube-api-access-8mtgl\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:46 crc kubenswrapper[4901]: I0202 10:58:46.158084 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-px8gs" Feb 02 10:58:46 crc kubenswrapper[4901]: I0202 10:58:46.158080 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-px8gs" event={"ID":"d8358c85-6b94-4c56-911c-3396c7e780e5","Type":"ContainerDied","Data":"975d4df1ab1454754e5e708c0621d800b6beef82dc552d8c4fd3e5fbf88ad9e4"} Feb 02 10:58:46 crc kubenswrapper[4901]: I0202 10:58:46.158429 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="975d4df1ab1454754e5e708c0621d800b6beef82dc552d8c4fd3e5fbf88ad9e4" Feb 02 10:58:46 crc kubenswrapper[4901]: I0202 10:58:46.160432 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688b9f5b49-6kjl6" event={"ID":"399febbb-a010-490d-bdf3-f80f74257dea","Type":"ContainerDied","Data":"f93b17c8e476a80d37d55d09349d4f9c5461171b30a8661e4a574d07dd8107e4"} Feb 02 10:58:46 crc kubenswrapper[4901]: I0202 10:58:46.160467 4901 scope.go:117] "RemoveContainer" containerID="fbe779a7fab448eab02904d83b4382a2af4150de16403dce77b56b73af504f09" Feb 02 10:58:46 crc kubenswrapper[4901]: I0202 10:58:46.160514 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688b9f5b49-6kjl6" Feb 02 10:58:46 crc kubenswrapper[4901]: I0202 10:58:46.208317 4901 scope.go:117] "RemoveContainer" containerID="fe2e0524953aedfb2e4fda3b3a33d4737c87c6ed3d490487cc364b1efddaa3e2" Feb 02 10:58:46 crc kubenswrapper[4901]: I0202 10:58:46.213896 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-6kjl6"] Feb 02 10:58:46 crc kubenswrapper[4901]: I0202 10:58:46.224785 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-688b9f5b49-6kjl6"] Feb 02 10:58:46 crc kubenswrapper[4901]: I0202 10:58:46.350863 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:58:46 crc kubenswrapper[4901]: I0202 10:58:46.351123 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="12bc7db9-7d12-48e2-b452-6d15334cd44d" containerName="nova-api-log" containerID="cri-o://6f88a25d90668bb0e82d62488d0e8b843bfcb93e5e2e53d14fe811d01a98fca4" gracePeriod=30 Feb 02 10:58:46 crc kubenswrapper[4901]: I0202 10:58:46.351598 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="12bc7db9-7d12-48e2-b452-6d15334cd44d" containerName="nova-api-api" containerID="cri-o://c6b4f241bfe27a5294803bce75dca0ea160dd0f9e61c7319f6b26e3fa5bdf632" gracePeriod=30 Feb 02 10:58:46 crc kubenswrapper[4901]: I0202 10:58:46.373606 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:58:46 crc kubenswrapper[4901]: I0202 10:58:46.377299 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:58:46 crc kubenswrapper[4901]: I0202 10:58:46.379210 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="476da10a-4dfe-42fa-825d-e6a0db89271d" containerName="nova-metadata-metadata" containerID="cri-o://7cd17ba1680c0ef5e3311efde433ec60883fe60cf455daa6fc1d1558be94caf1" gracePeriod=30 Feb 02 10:58:46 crc kubenswrapper[4901]: I0202 10:58:46.377528 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="476da10a-4dfe-42fa-825d-e6a0db89271d" containerName="nova-metadata-log" 
containerID="cri-o://31bc3fd1464fec01a7092b7bb700830355eff0d7ad96e59649564f0abcac91fe" gracePeriod=30 Feb 02 10:58:46 crc kubenswrapper[4901]: I0202 10:58:46.590954 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 02 10:58:46 crc kubenswrapper[4901]: I0202 10:58:46.590995 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 02 10:58:46 crc kubenswrapper[4901]: I0202 10:58:46.671741 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-b7wx4" Feb 02 10:58:46 crc kubenswrapper[4901]: I0202 10:58:46.749207 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-942xs\" (UniqueName: \"kubernetes.io/projected/ee49f3db-ba4a-4c9c-893e-edde04607cf4-kube-api-access-942xs\") pod \"ee49f3db-ba4a-4c9c-893e-edde04607cf4\" (UID: \"ee49f3db-ba4a-4c9c-893e-edde04607cf4\") " Feb 02 10:58:46 crc kubenswrapper[4901]: I0202 10:58:46.749361 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee49f3db-ba4a-4c9c-893e-edde04607cf4-combined-ca-bundle\") pod \"ee49f3db-ba4a-4c9c-893e-edde04607cf4\" (UID: \"ee49f3db-ba4a-4c9c-893e-edde04607cf4\") " Feb 02 10:58:46 crc kubenswrapper[4901]: I0202 10:58:46.749402 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee49f3db-ba4a-4c9c-893e-edde04607cf4-scripts\") pod \"ee49f3db-ba4a-4c9c-893e-edde04607cf4\" (UID: \"ee49f3db-ba4a-4c9c-893e-edde04607cf4\") " Feb 02 10:58:46 crc kubenswrapper[4901]: I0202 10:58:46.749423 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee49f3db-ba4a-4c9c-893e-edde04607cf4-config-data\") pod \"ee49f3db-ba4a-4c9c-893e-edde04607cf4\" (UID: \"ee49f3db-ba4a-4c9c-893e-edde04607cf4\") " Feb 02 10:58:46 crc kubenswrapper[4901]: I0202 10:58:46.757549 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee49f3db-ba4a-4c9c-893e-edde04607cf4-kube-api-access-942xs" (OuterVolumeSpecName: "kube-api-access-942xs") pod "ee49f3db-ba4a-4c9c-893e-edde04607cf4" (UID: "ee49f3db-ba4a-4c9c-893e-edde04607cf4"). InnerVolumeSpecName "kube-api-access-942xs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:58:46 crc kubenswrapper[4901]: I0202 10:58:46.762812 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee49f3db-ba4a-4c9c-893e-edde04607cf4-scripts" (OuterVolumeSpecName: "scripts") pod "ee49f3db-ba4a-4c9c-893e-edde04607cf4" (UID: "ee49f3db-ba4a-4c9c-893e-edde04607cf4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:58:46 crc kubenswrapper[4901]: I0202 10:58:46.786356 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee49f3db-ba4a-4c9c-893e-edde04607cf4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee49f3db-ba4a-4c9c-893e-edde04607cf4" (UID: "ee49f3db-ba4a-4c9c-893e-edde04607cf4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:58:46 crc kubenswrapper[4901]: I0202 10:58:46.818390 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee49f3db-ba4a-4c9c-893e-edde04607cf4-config-data" (OuterVolumeSpecName: "config-data") pod "ee49f3db-ba4a-4c9c-893e-edde04607cf4" (UID: "ee49f3db-ba4a-4c9c-893e-edde04607cf4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:58:46 crc kubenswrapper[4901]: I0202 10:58:46.852896 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-942xs\" (UniqueName: \"kubernetes.io/projected/ee49f3db-ba4a-4c9c-893e-edde04607cf4-kube-api-access-942xs\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:46 crc kubenswrapper[4901]: I0202 10:58:46.852940 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee49f3db-ba4a-4c9c-893e-edde04607cf4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:46 crc kubenswrapper[4901]: I0202 10:58:46.852951 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee49f3db-ba4a-4c9c-893e-edde04607cf4-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:46 crc kubenswrapper[4901]: I0202 10:58:46.852960 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee49f3db-ba4a-4c9c-893e-edde04607cf4-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:46 crc kubenswrapper[4901]: I0202 10:58:46.891263 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 10:58:46 crc kubenswrapper[4901]: I0202 10:58:46.955063 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jsxb\" (UniqueName: \"kubernetes.io/projected/476da10a-4dfe-42fa-825d-e6a0db89271d-kube-api-access-2jsxb\") pod \"476da10a-4dfe-42fa-825d-e6a0db89271d\" (UID: \"476da10a-4dfe-42fa-825d-e6a0db89271d\") " Feb 02 10:58:46 crc kubenswrapper[4901]: I0202 10:58:46.955215 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/476da10a-4dfe-42fa-825d-e6a0db89271d-nova-metadata-tls-certs\") pod \"476da10a-4dfe-42fa-825d-e6a0db89271d\" (UID: \"476da10a-4dfe-42fa-825d-e6a0db89271d\") " Feb 02 10:58:46 crc kubenswrapper[4901]: I0202 10:58:46.955260 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/476da10a-4dfe-42fa-825d-e6a0db89271d-logs\") pod \"476da10a-4dfe-42fa-825d-e6a0db89271d\" (UID: \"476da10a-4dfe-42fa-825d-e6a0db89271d\") " Feb 02 10:58:46 crc kubenswrapper[4901]: I0202 10:58:46.955473 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/476da10a-4dfe-42fa-825d-e6a0db89271d-config-data\") pod \"476da10a-4dfe-42fa-825d-e6a0db89271d\" (UID: \"476da10a-4dfe-42fa-825d-e6a0db89271d\") " Feb 02 10:58:46 crc kubenswrapper[4901]: I0202 10:58:46.955581 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/476da10a-4dfe-42fa-825d-e6a0db89271d-combined-ca-bundle\") pod \"476da10a-4dfe-42fa-825d-e6a0db89271d\" (UID: \"476da10a-4dfe-42fa-825d-e6a0db89271d\") " Feb 02 10:58:46 crc kubenswrapper[4901]: I0202 
10:58:46.961057 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/476da10a-4dfe-42fa-825d-e6a0db89271d-logs" (OuterVolumeSpecName: "logs") pod "476da10a-4dfe-42fa-825d-e6a0db89271d" (UID: "476da10a-4dfe-42fa-825d-e6a0db89271d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:58:46 crc kubenswrapper[4901]: I0202 10:58:46.965222 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/476da10a-4dfe-42fa-825d-e6a0db89271d-kube-api-access-2jsxb" (OuterVolumeSpecName: "kube-api-access-2jsxb") pod "476da10a-4dfe-42fa-825d-e6a0db89271d" (UID: "476da10a-4dfe-42fa-825d-e6a0db89271d"). InnerVolumeSpecName "kube-api-access-2jsxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.011332 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/476da10a-4dfe-42fa-825d-e6a0db89271d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "476da10a-4dfe-42fa-825d-e6a0db89271d" (UID: "476da10a-4dfe-42fa-825d-e6a0db89271d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.013013 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/476da10a-4dfe-42fa-825d-e6a0db89271d-config-data" (OuterVolumeSpecName: "config-data") pod "476da10a-4dfe-42fa-825d-e6a0db89271d" (UID: "476da10a-4dfe-42fa-825d-e6a0db89271d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.031311 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/476da10a-4dfe-42fa-825d-e6a0db89271d-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "476da10a-4dfe-42fa-825d-e6a0db89271d" (UID: "476da10a-4dfe-42fa-825d-e6a0db89271d"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.058095 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jsxb\" (UniqueName: \"kubernetes.io/projected/476da10a-4dfe-42fa-825d-e6a0db89271d-kube-api-access-2jsxb\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.058131 4901 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/476da10a-4dfe-42fa-825d-e6a0db89271d-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.058143 4901 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/476da10a-4dfe-42fa-825d-e6a0db89271d-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.058152 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/476da10a-4dfe-42fa-825d-e6a0db89271d-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.058160 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/476da10a-4dfe-42fa-825d-e6a0db89271d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.175583 4901 generic.go:334] "Generic (PLEG): container finished" podID="476da10a-4dfe-42fa-825d-e6a0db89271d" containerID="7cd17ba1680c0ef5e3311efde433ec60883fe60cf455daa6fc1d1558be94caf1" exitCode=0 Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.175626 4901 generic.go:334] "Generic (PLEG): container finished" podID="476da10a-4dfe-42fa-825d-e6a0db89271d" containerID="31bc3fd1464fec01a7092b7bb700830355eff0d7ad96e59649564f0abcac91fe" exitCode=143 Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.175636 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.175677 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"476da10a-4dfe-42fa-825d-e6a0db89271d","Type":"ContainerDied","Data":"7cd17ba1680c0ef5e3311efde433ec60883fe60cf455daa6fc1d1558be94caf1"} Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.175730 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"476da10a-4dfe-42fa-825d-e6a0db89271d","Type":"ContainerDied","Data":"31bc3fd1464fec01a7092b7bb700830355eff0d7ad96e59649564f0abcac91fe"} Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.175743 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"476da10a-4dfe-42fa-825d-e6a0db89271d","Type":"ContainerDied","Data":"e127094b4e44619f7680de091216245add0044a4592821f78243cfcb84bad069"} Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.175765 4901 scope.go:117] "RemoveContainer" containerID="7cd17ba1680c0ef5e3311efde433ec60883fe60cf455daa6fc1d1558be94caf1" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.178706 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-b7wx4" event={"ID":"ee49f3db-ba4a-4c9c-893e-edde04607cf4","Type":"ContainerDied","Data":"af376027b0a461d7b06a45f55b0cdd524555bbb9518f8fb70fce8cf9d3a02578"} Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.178737 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af376027b0a461d7b06a45f55b0cdd524555bbb9518f8fb70fce8cf9d3a02578" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.178770 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-b7wx4" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.184643 4901 generic.go:334] "Generic (PLEG): container finished" podID="12bc7db9-7d12-48e2-b452-6d15334cd44d" containerID="6f88a25d90668bb0e82d62488d0e8b843bfcb93e5e2e53d14fe811d01a98fca4" exitCode=143 Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.184725 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"12bc7db9-7d12-48e2-b452-6d15334cd44d","Type":"ContainerDied","Data":"6f88a25d90668bb0e82d62488d0e8b843bfcb93e5e2e53d14fe811d01a98fca4"} Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.184949 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="bab42463-80dc-4ab5-8af6-83ef6bc0bb43" containerName="nova-scheduler-scheduler" containerID="cri-o://852f8d30c831a388279063e8dd2ae055c8ae05d97ccaed50898d0d2a68402965" gracePeriod=30 Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.210302 4901 scope.go:117] "RemoveContainer" containerID="31bc3fd1464fec01a7092b7bb700830355eff0d7ad96e59649564f0abcac91fe" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.232632 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.252725 4901 scope.go:117] "RemoveContainer" containerID="7cd17ba1680c0ef5e3311efde433ec60883fe60cf455daa6fc1d1558be94caf1" Feb 02 10:58:47 crc kubenswrapper[4901]: E0202 10:58:47.253166 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cd17ba1680c0ef5e3311efde433ec60883fe60cf455daa6fc1d1558be94caf1\": container with ID starting with 7cd17ba1680c0ef5e3311efde433ec60883fe60cf455daa6fc1d1558be94caf1 not found: ID does not exist" containerID="7cd17ba1680c0ef5e3311efde433ec60883fe60cf455daa6fc1d1558be94caf1" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.253197 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cd17ba1680c0ef5e3311efde433ec60883fe60cf455daa6fc1d1558be94caf1"} err="failed to get container status \"7cd17ba1680c0ef5e3311efde433ec60883fe60cf455daa6fc1d1558be94caf1\": rpc error: code = NotFound desc = could not find container \"7cd17ba1680c0ef5e3311efde433ec60883fe60cf455daa6fc1d1558be94caf1\": container with ID starting with 7cd17ba1680c0ef5e3311efde433ec60883fe60cf455daa6fc1d1558be94caf1 not found: ID does not exist" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.253215 4901 scope.go:117] "RemoveContainer" containerID="31bc3fd1464fec01a7092b7bb700830355eff0d7ad96e59649564f0abcac91fe" Feb 02 10:58:47 crc kubenswrapper[4901]: E0202 10:58:47.253400 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31bc3fd1464fec01a7092b7bb700830355eff0d7ad96e59649564f0abcac91fe\": container with ID starting with 31bc3fd1464fec01a7092b7bb700830355eff0d7ad96e59649564f0abcac91fe not found: ID does not exist" containerID="31bc3fd1464fec01a7092b7bb700830355eff0d7ad96e59649564f0abcac91fe" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.253423 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31bc3fd1464fec01a7092b7bb700830355eff0d7ad96e59649564f0abcac91fe"} err="failed to get container status \"31bc3fd1464fec01a7092b7bb700830355eff0d7ad96e59649564f0abcac91fe\": rpc error: code = 
NotFound desc = could not find container \"31bc3fd1464fec01a7092b7bb700830355eff0d7ad96e59649564f0abcac91fe\": container with ID starting with 31bc3fd1464fec01a7092b7bb700830355eff0d7ad96e59649564f0abcac91fe not found: ID does not exist" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.253437 4901 scope.go:117] "RemoveContainer" containerID="7cd17ba1680c0ef5e3311efde433ec60883fe60cf455daa6fc1d1558be94caf1" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.253617 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cd17ba1680c0ef5e3311efde433ec60883fe60cf455daa6fc1d1558be94caf1"} err="failed to get container status \"7cd17ba1680c0ef5e3311efde433ec60883fe60cf455daa6fc1d1558be94caf1\": rpc error: code = NotFound desc = could not find container \"7cd17ba1680c0ef5e3311efde433ec60883fe60cf455daa6fc1d1558be94caf1\": container with ID starting with 7cd17ba1680c0ef5e3311efde433ec60883fe60cf455daa6fc1d1558be94caf1 not found: ID does not exist" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.253637 4901 scope.go:117] "RemoveContainer" containerID="31bc3fd1464fec01a7092b7bb700830355eff0d7ad96e59649564f0abcac91fe" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.253805 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31bc3fd1464fec01a7092b7bb700830355eff0d7ad96e59649564f0abcac91fe"} err="failed to get container status \"31bc3fd1464fec01a7092b7bb700830355eff0d7ad96e59649564f0abcac91fe\": rpc error: code = NotFound desc = could not find container \"31bc3fd1464fec01a7092b7bb700830355eff0d7ad96e59649564f0abcac91fe\": container with ID starting with 31bc3fd1464fec01a7092b7bb700830355eff0d7ad96e59649564f0abcac91fe not found: ID does not exist" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.259496 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.272644 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:58:47 crc kubenswrapper[4901]: E0202 10:58:47.273089 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="399febbb-a010-490d-bdf3-f80f74257dea" containerName="dnsmasq-dns" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.273107 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="399febbb-a010-490d-bdf3-f80f74257dea" containerName="dnsmasq-dns" Feb 02 10:58:47 crc kubenswrapper[4901]: E0202 10:58:47.273132 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="476da10a-4dfe-42fa-825d-e6a0db89271d" containerName="nova-metadata-metadata" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.273138 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="476da10a-4dfe-42fa-825d-e6a0db89271d" containerName="nova-metadata-metadata" Feb 02 10:58:47 crc kubenswrapper[4901]: E0202 10:58:47.273151 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8358c85-6b94-4c56-911c-3396c7e780e5" containerName="nova-manage" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.273157 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8358c85-6b94-4c56-911c-3396c7e780e5" containerName="nova-manage" Feb 02 10:58:47 crc kubenswrapper[4901]: E0202 10:58:47.273171 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="399febbb-a010-490d-bdf3-f80f74257dea" containerName="init" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.273177 4901 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="399febbb-a010-490d-bdf3-f80f74257dea" containerName="init" Feb 02 10:58:47 crc kubenswrapper[4901]: E0202 10:58:47.273187 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="476da10a-4dfe-42fa-825d-e6a0db89271d" containerName="nova-metadata-log" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.273193 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="476da10a-4dfe-42fa-825d-e6a0db89271d" containerName="nova-metadata-log" Feb 02 10:58:47 crc kubenswrapper[4901]: E0202 10:58:47.273204 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee49f3db-ba4a-4c9c-893e-edde04607cf4" containerName="nova-cell1-conductor-db-sync" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.273210 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee49f3db-ba4a-4c9c-893e-edde04607cf4" containerName="nova-cell1-conductor-db-sync" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.273386 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="399febbb-a010-490d-bdf3-f80f74257dea" containerName="dnsmasq-dns" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.273404 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="476da10a-4dfe-42fa-825d-e6a0db89271d" containerName="nova-metadata-metadata" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.273414 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8358c85-6b94-4c56-911c-3396c7e780e5" containerName="nova-manage" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.273422 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee49f3db-ba4a-4c9c-893e-edde04607cf4" containerName="nova-cell1-conductor-db-sync" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.273436 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="476da10a-4dfe-42fa-825d-e6a0db89271d" containerName="nova-metadata-log" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.274621 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.277232 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.278175 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.282973 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.291033 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.306015 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.306208 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.312318 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.367800 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f1bc22b-987a-4dd9-a0c9-42383e63fd38-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7f1bc22b-987a-4dd9-a0c9-42383e63fd38\") " pod="openstack/nova-cell1-conductor-0" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.367861 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4825fb51-1602-4350-89ed-9bc8cea66c2c-logs\") pod \"nova-metadata-0\" (UID: \"4825fb51-1602-4350-89ed-9bc8cea66c2c\") " pod="openstack/nova-metadata-0" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.367919 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4825fb51-1602-4350-89ed-9bc8cea66c2c-config-data\") pod \"nova-metadata-0\" (UID: \"4825fb51-1602-4350-89ed-9bc8cea66c2c\") " pod="openstack/nova-metadata-0" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.367959 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4825fb51-1602-4350-89ed-9bc8cea66c2c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4825fb51-1602-4350-89ed-9bc8cea66c2c\") " pod="openstack/nova-metadata-0" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.368369 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4825fb51-1602-4350-89ed-9bc8cea66c2c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4825fb51-1602-4350-89ed-9bc8cea66c2c\") " pod="openstack/nova-metadata-0" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.368506 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f1bc22b-987a-4dd9-a0c9-42383e63fd38-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7f1bc22b-987a-4dd9-a0c9-42383e63fd38\") " pod="openstack/nova-cell1-conductor-0" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.368612 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdrvp\" (UniqueName: \"kubernetes.io/projected/4825fb51-1602-4350-89ed-9bc8cea66c2c-kube-api-access-kdrvp\") pod \"nova-metadata-0\" (UID: \"4825fb51-1602-4350-89ed-9bc8cea66c2c\") " pod="openstack/nova-metadata-0" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.368737 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9tk5\" (UniqueName: \"kubernetes.io/projected/7f1bc22b-987a-4dd9-a0c9-42383e63fd38-kube-api-access-f9tk5\") pod \"nova-cell1-conductor-0\" (UID: \"7f1bc22b-987a-4dd9-a0c9-42383e63fd38\") " pod="openstack/nova-cell1-conductor-0" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.470239 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4825fb51-1602-4350-89ed-9bc8cea66c2c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4825fb51-1602-4350-89ed-9bc8cea66c2c\") " pod="openstack/nova-metadata-0" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.470295 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f1bc22b-987a-4dd9-a0c9-42383e63fd38-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7f1bc22b-987a-4dd9-a0c9-42383e63fd38\") " pod="openstack/nova-cell1-conductor-0" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.470324 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdrvp\" (UniqueName: \"kubernetes.io/projected/4825fb51-1602-4350-89ed-9bc8cea66c2c-kube-api-access-kdrvp\") pod \"nova-metadata-0\" (UID: \"4825fb51-1602-4350-89ed-9bc8cea66c2c\") " pod="openstack/nova-metadata-0" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.470362 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9tk5\" (UniqueName: \"kubernetes.io/projected/7f1bc22b-987a-4dd9-a0c9-42383e63fd38-kube-api-access-f9tk5\") pod \"nova-cell1-conductor-0\" (UID: \"7f1bc22b-987a-4dd9-a0c9-42383e63fd38\") " pod="openstack/nova-cell1-conductor-0" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.470392 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f1bc22b-987a-4dd9-a0c9-42383e63fd38-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7f1bc22b-987a-4dd9-a0c9-42383e63fd38\") " pod="openstack/nova-cell1-conductor-0" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.470420 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4825fb51-1602-4350-89ed-9bc8cea66c2c-logs\") pod \"nova-metadata-0\" (UID: \"4825fb51-1602-4350-89ed-9bc8cea66c2c\") " pod="openstack/nova-metadata-0" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.470464 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4825fb51-1602-4350-89ed-9bc8cea66c2c-config-data\") pod \"nova-metadata-0\" (UID: \"4825fb51-1602-4350-89ed-9bc8cea66c2c\") " pod="openstack/nova-metadata-0" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.470498 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4825fb51-1602-4350-89ed-9bc8cea66c2c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4825fb51-1602-4350-89ed-9bc8cea66c2c\") " pod="openstack/nova-metadata-0" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.471719 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4825fb51-1602-4350-89ed-9bc8cea66c2c-logs\") pod \"nova-metadata-0\" (UID: \"4825fb51-1602-4350-89ed-9bc8cea66c2c\") " pod="openstack/nova-metadata-0" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.487070 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f1bc22b-987a-4dd9-a0c9-42383e63fd38-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7f1bc22b-987a-4dd9-a0c9-42383e63fd38\") " pod="openstack/nova-cell1-conductor-0" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.488426 4901 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4825fb51-1602-4350-89ed-9bc8cea66c2c-config-data\") pod \"nova-metadata-0\" (UID: \"4825fb51-1602-4350-89ed-9bc8cea66c2c\") " pod="openstack/nova-metadata-0" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.488449 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4825fb51-1602-4350-89ed-9bc8cea66c2c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4825fb51-1602-4350-89ed-9bc8cea66c2c\") " pod="openstack/nova-metadata-0" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.490240 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4825fb51-1602-4350-89ed-9bc8cea66c2c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4825fb51-1602-4350-89ed-9bc8cea66c2c\") " pod="openstack/nova-metadata-0" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.491235 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f1bc22b-987a-4dd9-a0c9-42383e63fd38-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7f1bc22b-987a-4dd9-a0c9-42383e63fd38\") " pod="openstack/nova-cell1-conductor-0" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.500613 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9tk5\" (UniqueName: \"kubernetes.io/projected/7f1bc22b-987a-4dd9-a0c9-42383e63fd38-kube-api-access-f9tk5\") pod \"nova-cell1-conductor-0\" (UID: \"7f1bc22b-987a-4dd9-a0c9-42383e63fd38\") " pod="openstack/nova-cell1-conductor-0" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.501072 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdrvp\" (UniqueName: \"kubernetes.io/projected/4825fb51-1602-4350-89ed-9bc8cea66c2c-kube-api-access-kdrvp\") pod \"nova-metadata-0\" (UID: \"4825fb51-1602-4350-89ed-9bc8cea66c2c\") " pod="openstack/nova-metadata-0" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.598667 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.628135 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.695113 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="399febbb-a010-490d-bdf3-f80f74257dea" path="/var/lib/kubelet/pods/399febbb-a010-490d-bdf3-f80f74257dea/volumes" Feb 02 10:58:47 crc kubenswrapper[4901]: I0202 10:58:47.696207 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="476da10a-4dfe-42fa-825d-e6a0db89271d" path="/var/lib/kubelet/pods/476da10a-4dfe-42fa-825d-e6a0db89271d/volumes" Feb 02 10:58:48 crc kubenswrapper[4901]: W0202 10:58:48.156868 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f1bc22b_987a_4dd9_a0c9_42383e63fd38.slice/crio-04752c9918ca5a3125bb305707820235a35c92a7f8ac62c0281e0e82c034657d WatchSource:0}: Error finding container 04752c9918ca5a3125bb305707820235a35c92a7f8ac62c0281e0e82c034657d: Status 404 returned error can't find the container with id 04752c9918ca5a3125bb305707820235a35c92a7f8ac62c0281e0e82c034657d Feb 02 10:58:48 crc kubenswrapper[4901]: I0202 10:58:48.160801 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 02 10:58:48 crc kubenswrapper[4901]: W0202 10:58:48.177366 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4825fb51_1602_4350_89ed_9bc8cea66c2c.slice/crio-5942469404a218679d59b4804d1d605c66ec659fab9d0af4cbbd21fded32cae0 WatchSource:0}: Error finding container 5942469404a218679d59b4804d1d605c66ec659fab9d0af4cbbd21fded32cae0: Status 404 returned error can't find the container with id 5942469404a218679d59b4804d1d605c66ec659fab9d0af4cbbd21fded32cae0 Feb 02 10:58:48 crc kubenswrapper[4901]: I0202 10:58:48.180248 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:58:48 crc kubenswrapper[4901]: I0202 10:58:48.197732 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4825fb51-1602-4350-89ed-9bc8cea66c2c","Type":"ContainerStarted","Data":"5942469404a218679d59b4804d1d605c66ec659fab9d0af4cbbd21fded32cae0"} Feb 02 10:58:48 crc kubenswrapper[4901]: I0202 10:58:48.204673 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7f1bc22b-987a-4dd9-a0c9-42383e63fd38","Type":"ContainerStarted","Data":"04752c9918ca5a3125bb305707820235a35c92a7f8ac62c0281e0e82c034657d"} Feb 02 10:58:49 crc kubenswrapper[4901]: E0202 10:58:49.202171 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="852f8d30c831a388279063e8dd2ae055c8ae05d97ccaed50898d0d2a68402965" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 02 10:58:49 crc kubenswrapper[4901]: E0202 10:58:49.206531 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="852f8d30c831a388279063e8dd2ae055c8ae05d97ccaed50898d0d2a68402965" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 02 10:58:49 crc kubenswrapper[4901]: E0202 10:58:49.208407 4901 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: 
container is stopping, stdout: , stderr: , exit code -1" containerID="852f8d30c831a388279063e8dd2ae055c8ae05d97ccaed50898d0d2a68402965" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 02 10:58:49 crc kubenswrapper[4901]: E0202 10:58:49.208456 4901 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="bab42463-80dc-4ab5-8af6-83ef6bc0bb43" containerName="nova-scheduler-scheduler" Feb 02 10:58:49 crc kubenswrapper[4901]: I0202 10:58:49.221726 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7f1bc22b-987a-4dd9-a0c9-42383e63fd38","Type":"ContainerStarted","Data":"919a97fc6f6c9ae27321d4e17a6b7bcb5137ce3333cd2af66fa453bdabb31615"} Feb 02 10:58:49 crc kubenswrapper[4901]: I0202 10:58:49.223052 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 02 10:58:49 crc kubenswrapper[4901]: I0202 10:58:49.225784 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4825fb51-1602-4350-89ed-9bc8cea66c2c","Type":"ContainerStarted","Data":"13d03bbc6533e506ab4b10275c3b14f6a3dab9ddc41162e02fa1fac0e5cbb3fa"} Feb 02 10:58:49 crc kubenswrapper[4901]: I0202 10:58:49.225877 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4825fb51-1602-4350-89ed-9bc8cea66c2c","Type":"ContainerStarted","Data":"5072faf80b0390581b5506d7f66af631c77a96de5d23a7295c6b58ea5c6d31d0"} Feb 02 10:58:49 crc kubenswrapper[4901]: I0202 10:58:49.260772 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.2607507780000002 podStartE2EDuration="2.260750778s" podCreationTimestamp="2026-02-02 10:58:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:58:49.253717265 +0000 UTC m=+1216.272057431" watchObservedRunningTime="2026-02-02 10:58:49.260750778 +0000 UTC m=+1216.279090884" Feb 02 10:58:49 crc kubenswrapper[4901]: I0202 10:58:49.297785 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.297756707 podStartE2EDuration="2.297756707s" podCreationTimestamp="2026-02-02 10:58:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:58:49.284181374 +0000 UTC m=+1216.302521490" watchObservedRunningTime="2026-02-02 10:58:49.297756707 +0000 UTC m=+1216.316096793" Feb 02 10:58:49 crc kubenswrapper[4901]: I0202 10:58:49.908708 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.025541 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/52c8aa9b-cedf-433f-8e40-5156c9fe53a2-sg-core-conf-yaml\") pod \"52c8aa9b-cedf-433f-8e40-5156c9fe53a2\" (UID: \"52c8aa9b-cedf-433f-8e40-5156c9fe53a2\") " Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.025642 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52c8aa9b-cedf-433f-8e40-5156c9fe53a2-scripts\") pod \"52c8aa9b-cedf-433f-8e40-5156c9fe53a2\" (UID: \"52c8aa9b-cedf-433f-8e40-5156c9fe53a2\") " Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.025730 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52c8aa9b-cedf-433f-8e40-5156c9fe53a2-run-httpd\") pod \"52c8aa9b-cedf-433f-8e40-5156c9fe53a2\" (UID: \"52c8aa9b-cedf-433f-8e40-5156c9fe53a2\") " Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.025823 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lzcp\" (UniqueName: \"kubernetes.io/projected/52c8aa9b-cedf-433f-8e40-5156c9fe53a2-kube-api-access-2lzcp\") pod \"52c8aa9b-cedf-433f-8e40-5156c9fe53a2\" (UID: \"52c8aa9b-cedf-433f-8e40-5156c9fe53a2\") " Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.025842 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52c8aa9b-cedf-433f-8e40-5156c9fe53a2-config-data\") pod \"52c8aa9b-cedf-433f-8e40-5156c9fe53a2\" (UID: \"52c8aa9b-cedf-433f-8e40-5156c9fe53a2\") " Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.025953 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52c8aa9b-cedf-433f-8e40-5156c9fe53a2-combined-ca-bundle\") pod \"52c8aa9b-cedf-433f-8e40-5156c9fe53a2\" (UID: \"52c8aa9b-cedf-433f-8e40-5156c9fe53a2\") " Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.025995 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52c8aa9b-cedf-433f-8e40-5156c9fe53a2-log-httpd\") pod \"52c8aa9b-cedf-433f-8e40-5156c9fe53a2\" (UID: \"52c8aa9b-cedf-433f-8e40-5156c9fe53a2\") " Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.027002 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52c8aa9b-cedf-433f-8e40-5156c9fe53a2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "52c8aa9b-cedf-433f-8e40-5156c9fe53a2" (UID: "52c8aa9b-cedf-433f-8e40-5156c9fe53a2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.027032 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52c8aa9b-cedf-433f-8e40-5156c9fe53a2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "52c8aa9b-cedf-433f-8e40-5156c9fe53a2" (UID: "52c8aa9b-cedf-433f-8e40-5156c9fe53a2"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.033257 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52c8aa9b-cedf-433f-8e40-5156c9fe53a2-scripts" (OuterVolumeSpecName: "scripts") pod "52c8aa9b-cedf-433f-8e40-5156c9fe53a2" (UID: "52c8aa9b-cedf-433f-8e40-5156c9fe53a2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.035045 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52c8aa9b-cedf-433f-8e40-5156c9fe53a2-kube-api-access-2lzcp" (OuterVolumeSpecName: "kube-api-access-2lzcp") pod "52c8aa9b-cedf-433f-8e40-5156c9fe53a2" (UID: "52c8aa9b-cedf-433f-8e40-5156c9fe53a2"). InnerVolumeSpecName "kube-api-access-2lzcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.065509 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52c8aa9b-cedf-433f-8e40-5156c9fe53a2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "52c8aa9b-cedf-433f-8e40-5156c9fe53a2" (UID: "52c8aa9b-cedf-433f-8e40-5156c9fe53a2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.120587 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52c8aa9b-cedf-433f-8e40-5156c9fe53a2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52c8aa9b-cedf-433f-8e40-5156c9fe53a2" (UID: "52c8aa9b-cedf-433f-8e40-5156c9fe53a2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.128586 4901 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/52c8aa9b-cedf-433f-8e40-5156c9fe53a2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.128617 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52c8aa9b-cedf-433f-8e40-5156c9fe53a2-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.128629 4901 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52c8aa9b-cedf-433f-8e40-5156c9fe53a2-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.128644 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lzcp\" (UniqueName: \"kubernetes.io/projected/52c8aa9b-cedf-433f-8e40-5156c9fe53a2-kube-api-access-2lzcp\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.128662 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52c8aa9b-cedf-433f-8e40-5156c9fe53a2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.128673 4901 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52c8aa9b-cedf-433f-8e40-5156c9fe53a2-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.167144 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/52c8aa9b-cedf-433f-8e40-5156c9fe53a2-config-data" (OuterVolumeSpecName: "config-data") pod "52c8aa9b-cedf-433f-8e40-5156c9fe53a2" (UID: "52c8aa9b-cedf-433f-8e40-5156c9fe53a2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.231472 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52c8aa9b-cedf-433f-8e40-5156c9fe53a2-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.246442 4901 generic.go:334] "Generic (PLEG): container finished" podID="52c8aa9b-cedf-433f-8e40-5156c9fe53a2" containerID="9132fc15a1f5f25c06a7af5e0d584903a35a1a3103f4534782d25d59ebfd52ce" exitCode=0 Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.247798 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.247785 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52c8aa9b-cedf-433f-8e40-5156c9fe53a2","Type":"ContainerDied","Data":"9132fc15a1f5f25c06a7af5e0d584903a35a1a3103f4534782d25d59ebfd52ce"} Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.248005 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52c8aa9b-cedf-433f-8e40-5156c9fe53a2","Type":"ContainerDied","Data":"38d4656acfb71a8c5b6328ca70a709ae56d900969f32816ce85ebc28540846ea"} Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.248047 4901 scope.go:117] "RemoveContainer" containerID="20248b62952ce61096e356fe301f645e3907350faadaeb442d1b5a275bcc5778" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.277183 4901 scope.go:117] "RemoveContainer" containerID="b8c9b9f60a3ed74256dc7356d2cd948b416caa009181ebedb62391c03cd127bd" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.286654 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.307217 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.316298 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:58:50 crc kubenswrapper[4901]: E0202 10:58:50.316895 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52c8aa9b-cedf-433f-8e40-5156c9fe53a2" containerName="ceilometer-central-agent" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.316914 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="52c8aa9b-cedf-433f-8e40-5156c9fe53a2" containerName="ceilometer-central-agent" Feb 02 10:58:50 crc kubenswrapper[4901]: E0202 10:58:50.316931 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52c8aa9b-cedf-433f-8e40-5156c9fe53a2" containerName="ceilometer-notification-agent" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.316939 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="52c8aa9b-cedf-433f-8e40-5156c9fe53a2" containerName="ceilometer-notification-agent" Feb 02 10:58:50 crc kubenswrapper[4901]: E0202 10:58:50.316973 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52c8aa9b-cedf-433f-8e40-5156c9fe53a2" containerName="sg-core" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.316981 4901 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="52c8aa9b-cedf-433f-8e40-5156c9fe53a2" containerName="sg-core" Feb 02 10:58:50 crc kubenswrapper[4901]: E0202 10:58:50.317002 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52c8aa9b-cedf-433f-8e40-5156c9fe53a2" containerName="proxy-httpd" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.317009 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="52c8aa9b-cedf-433f-8e40-5156c9fe53a2" containerName="proxy-httpd" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.317276 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="52c8aa9b-cedf-433f-8e40-5156c9fe53a2" containerName="proxy-httpd" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.317306 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="52c8aa9b-cedf-433f-8e40-5156c9fe53a2" containerName="ceilometer-central-agent" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.317318 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="52c8aa9b-cedf-433f-8e40-5156c9fe53a2" containerName="sg-core" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.317341 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="52c8aa9b-cedf-433f-8e40-5156c9fe53a2" containerName="ceilometer-notification-agent" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.319657 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.325741 4901 scope.go:117] "RemoveContainer" containerID="9132fc15a1f5f25c06a7af5e0d584903a35a1a3103f4534782d25d59ebfd52ce" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.325940 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.326290 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.326598 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.337337 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.359834 4901 scope.go:117] "RemoveContainer" containerID="886da255e8b6a73f747ab2cd28e7758e6839557ec3753c9b84d80ad75f5d8a3b" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.382323 4901 scope.go:117] "RemoveContainer" containerID="20248b62952ce61096e356fe301f645e3907350faadaeb442d1b5a275bcc5778" Feb 02 10:58:50 crc kubenswrapper[4901]: E0202 10:58:50.382890 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20248b62952ce61096e356fe301f645e3907350faadaeb442d1b5a275bcc5778\": container with ID starting with 20248b62952ce61096e356fe301f645e3907350faadaeb442d1b5a275bcc5778 not found: ID does not exist" containerID="20248b62952ce61096e356fe301f645e3907350faadaeb442d1b5a275bcc5778" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.382946 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20248b62952ce61096e356fe301f645e3907350faadaeb442d1b5a275bcc5778"} err="failed to get container status \"20248b62952ce61096e356fe301f645e3907350faadaeb442d1b5a275bcc5778\": rpc error: code = NotFound desc = could not find container 
\"20248b62952ce61096e356fe301f645e3907350faadaeb442d1b5a275bcc5778\": container with ID starting with 20248b62952ce61096e356fe301f645e3907350faadaeb442d1b5a275bcc5778 not found: ID does not exist" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.382975 4901 scope.go:117] "RemoveContainer" containerID="b8c9b9f60a3ed74256dc7356d2cd948b416caa009181ebedb62391c03cd127bd" Feb 02 10:58:50 crc kubenswrapper[4901]: E0202 10:58:50.383330 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8c9b9f60a3ed74256dc7356d2cd948b416caa009181ebedb62391c03cd127bd\": container with ID starting with b8c9b9f60a3ed74256dc7356d2cd948b416caa009181ebedb62391c03cd127bd not found: ID does not exist" containerID="b8c9b9f60a3ed74256dc7356d2cd948b416caa009181ebedb62391c03cd127bd" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.383355 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8c9b9f60a3ed74256dc7356d2cd948b416caa009181ebedb62391c03cd127bd"} err="failed to get container status \"b8c9b9f60a3ed74256dc7356d2cd948b416caa009181ebedb62391c03cd127bd\": rpc error: code = NotFound desc = could not find container \"b8c9b9f60a3ed74256dc7356d2cd948b416caa009181ebedb62391c03cd127bd\": container with ID starting with b8c9b9f60a3ed74256dc7356d2cd948b416caa009181ebedb62391c03cd127bd not found: ID does not exist" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.383368 4901 scope.go:117] "RemoveContainer" containerID="9132fc15a1f5f25c06a7af5e0d584903a35a1a3103f4534782d25d59ebfd52ce" Feb 02 10:58:50 crc kubenswrapper[4901]: E0202 10:58:50.383576 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9132fc15a1f5f25c06a7af5e0d584903a35a1a3103f4534782d25d59ebfd52ce\": container with ID starting with 9132fc15a1f5f25c06a7af5e0d584903a35a1a3103f4534782d25d59ebfd52ce not found: ID does not exist" containerID="9132fc15a1f5f25c06a7af5e0d584903a35a1a3103f4534782d25d59ebfd52ce" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.383600 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9132fc15a1f5f25c06a7af5e0d584903a35a1a3103f4534782d25d59ebfd52ce"} err="failed to get container status \"9132fc15a1f5f25c06a7af5e0d584903a35a1a3103f4534782d25d59ebfd52ce\": rpc error: code = NotFound desc = could not find container \"9132fc15a1f5f25c06a7af5e0d584903a35a1a3103f4534782d25d59ebfd52ce\": container with ID starting with 9132fc15a1f5f25c06a7af5e0d584903a35a1a3103f4534782d25d59ebfd52ce not found: ID does not exist" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.383614 4901 scope.go:117] "RemoveContainer" containerID="886da255e8b6a73f747ab2cd28e7758e6839557ec3753c9b84d80ad75f5d8a3b" Feb 02 10:58:50 crc kubenswrapper[4901]: E0202 10:58:50.383797 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"886da255e8b6a73f747ab2cd28e7758e6839557ec3753c9b84d80ad75f5d8a3b\": container with ID starting with 886da255e8b6a73f747ab2cd28e7758e6839557ec3753c9b84d80ad75f5d8a3b not found: ID does not exist" containerID="886da255e8b6a73f747ab2cd28e7758e6839557ec3753c9b84d80ad75f5d8a3b" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.383823 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"886da255e8b6a73f747ab2cd28e7758e6839557ec3753c9b84d80ad75f5d8a3b"} 
err="failed to get container status \"886da255e8b6a73f747ab2cd28e7758e6839557ec3753c9b84d80ad75f5d8a3b\": rpc error: code = NotFound desc = could not find container \"886da255e8b6a73f747ab2cd28e7758e6839557ec3753c9b84d80ad75f5d8a3b\": container with ID starting with 886da255e8b6a73f747ab2cd28e7758e6839557ec3753c9b84d80ad75f5d8a3b not found: ID does not exist" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.435346 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b62a85e7-9039-4129-9206-bd29fa676593-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b62a85e7-9039-4129-9206-bd29fa676593\") " pod="openstack/ceilometer-0" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.435653 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b62a85e7-9039-4129-9206-bd29fa676593-scripts\") pod \"ceilometer-0\" (UID: \"b62a85e7-9039-4129-9206-bd29fa676593\") " pod="openstack/ceilometer-0" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.435940 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r72cn\" (UniqueName: \"kubernetes.io/projected/b62a85e7-9039-4129-9206-bd29fa676593-kube-api-access-r72cn\") pod \"ceilometer-0\" (UID: \"b62a85e7-9039-4129-9206-bd29fa676593\") " pod="openstack/ceilometer-0" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.436076 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b62a85e7-9039-4129-9206-bd29fa676593-log-httpd\") pod \"ceilometer-0\" (UID: \"b62a85e7-9039-4129-9206-bd29fa676593\") " pod="openstack/ceilometer-0" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.436218 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b62a85e7-9039-4129-9206-bd29fa676593-config-data\") pod \"ceilometer-0\" (UID: \"b62a85e7-9039-4129-9206-bd29fa676593\") " pod="openstack/ceilometer-0" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.437037 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b62a85e7-9039-4129-9206-bd29fa676593-run-httpd\") pod \"ceilometer-0\" (UID: \"b62a85e7-9039-4129-9206-bd29fa676593\") " pod="openstack/ceilometer-0" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.437137 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b62a85e7-9039-4129-9206-bd29fa676593-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b62a85e7-9039-4129-9206-bd29fa676593\") " pod="openstack/ceilometer-0" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.437505 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b62a85e7-9039-4129-9206-bd29fa676593-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b62a85e7-9039-4129-9206-bd29fa676593\") " pod="openstack/ceilometer-0" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.540117 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/b62a85e7-9039-4129-9206-bd29fa676593-log-httpd\") pod \"ceilometer-0\" (UID: \"b62a85e7-9039-4129-9206-bd29fa676593\") " pod="openstack/ceilometer-0" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.540197 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b62a85e7-9039-4129-9206-bd29fa676593-config-data\") pod \"ceilometer-0\" (UID: \"b62a85e7-9039-4129-9206-bd29fa676593\") " pod="openstack/ceilometer-0" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.540257 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b62a85e7-9039-4129-9206-bd29fa676593-run-httpd\") pod \"ceilometer-0\" (UID: \"b62a85e7-9039-4129-9206-bd29fa676593\") " pod="openstack/ceilometer-0" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.540295 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b62a85e7-9039-4129-9206-bd29fa676593-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b62a85e7-9039-4129-9206-bd29fa676593\") " pod="openstack/ceilometer-0" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.540504 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b62a85e7-9039-4129-9206-bd29fa676593-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b62a85e7-9039-4129-9206-bd29fa676593\") " pod="openstack/ceilometer-0" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.540554 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b62a85e7-9039-4129-9206-bd29fa676593-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b62a85e7-9039-4129-9206-bd29fa676593\") " pod="openstack/ceilometer-0" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.541293 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b62a85e7-9039-4129-9206-bd29fa676593-scripts\") pod \"ceilometer-0\" (UID: \"b62a85e7-9039-4129-9206-bd29fa676593\") " pod="openstack/ceilometer-0" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.541378 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r72cn\" (UniqueName: \"kubernetes.io/projected/b62a85e7-9039-4129-9206-bd29fa676593-kube-api-access-r72cn\") pod \"ceilometer-0\" (UID: \"b62a85e7-9039-4129-9206-bd29fa676593\") " pod="openstack/ceilometer-0" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.542050 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b62a85e7-9039-4129-9206-bd29fa676593-log-httpd\") pod \"ceilometer-0\" (UID: \"b62a85e7-9039-4129-9206-bd29fa676593\") " pod="openstack/ceilometer-0" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.542070 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b62a85e7-9039-4129-9206-bd29fa676593-run-httpd\") pod \"ceilometer-0\" (UID: \"b62a85e7-9039-4129-9206-bd29fa676593\") " pod="openstack/ceilometer-0" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.545076 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b62a85e7-9039-4129-9206-bd29fa676593-scripts\") pod \"ceilometer-0\" (UID: \"b62a85e7-9039-4129-9206-bd29fa676593\") " pod="openstack/ceilometer-0" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.545302 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b62a85e7-9039-4129-9206-bd29fa676593-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b62a85e7-9039-4129-9206-bd29fa676593\") " pod="openstack/ceilometer-0" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.545713 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b62a85e7-9039-4129-9206-bd29fa676593-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b62a85e7-9039-4129-9206-bd29fa676593\") " pod="openstack/ceilometer-0" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.546957 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b62a85e7-9039-4129-9206-bd29fa676593-config-data\") pod \"ceilometer-0\" (UID: \"b62a85e7-9039-4129-9206-bd29fa676593\") " pod="openstack/ceilometer-0" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.549299 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b62a85e7-9039-4129-9206-bd29fa676593-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b62a85e7-9039-4129-9206-bd29fa676593\") " pod="openstack/ceilometer-0" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.562677 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r72cn\" (UniqueName: \"kubernetes.io/projected/b62a85e7-9039-4129-9206-bd29fa676593-kube-api-access-r72cn\") pod \"ceilometer-0\" (UID: \"b62a85e7-9039-4129-9206-bd29fa676593\") " pod="openstack/ceilometer-0" Feb 02 10:58:50 crc kubenswrapper[4901]: I0202 10:58:50.642595 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:58:51 crc kubenswrapper[4901]: I0202 10:58:51.137388 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:58:51 crc kubenswrapper[4901]: I0202 10:58:51.261369 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b62a85e7-9039-4129-9206-bd29fa676593","Type":"ContainerStarted","Data":"b69bbba24d50549490df2c0ea24cf800ace0278f2d00b96ea99ede0fcd8307d6"} Feb 02 10:58:51 crc kubenswrapper[4901]: I0202 10:58:51.654827 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 02 10:58:51 crc kubenswrapper[4901]: I0202 10:58:51.721881 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52c8aa9b-cedf-433f-8e40-5156c9fe53a2" path="/var/lib/kubelet/pods/52c8aa9b-cedf-433f-8e40-5156c9fe53a2/volumes" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.004505 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.088033 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bab42463-80dc-4ab5-8af6-83ef6bc0bb43-config-data\") pod \"bab42463-80dc-4ab5-8af6-83ef6bc0bb43\" (UID: \"bab42463-80dc-4ab5-8af6-83ef6bc0bb43\") " Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.088111 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdfk5\" (UniqueName: \"kubernetes.io/projected/bab42463-80dc-4ab5-8af6-83ef6bc0bb43-kube-api-access-zdfk5\") pod \"bab42463-80dc-4ab5-8af6-83ef6bc0bb43\" (UID: \"bab42463-80dc-4ab5-8af6-83ef6bc0bb43\") " Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.088176 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bab42463-80dc-4ab5-8af6-83ef6bc0bb43-combined-ca-bundle\") pod \"bab42463-80dc-4ab5-8af6-83ef6bc0bb43\" (UID: \"bab42463-80dc-4ab5-8af6-83ef6bc0bb43\") " Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.104475 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bab42463-80dc-4ab5-8af6-83ef6bc0bb43-kube-api-access-zdfk5" (OuterVolumeSpecName: "kube-api-access-zdfk5") pod "bab42463-80dc-4ab5-8af6-83ef6bc0bb43" (UID: "bab42463-80dc-4ab5-8af6-83ef6bc0bb43"). InnerVolumeSpecName "kube-api-access-zdfk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.120623 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bab42463-80dc-4ab5-8af6-83ef6bc0bb43-config-data" (OuterVolumeSpecName: "config-data") pod "bab42463-80dc-4ab5-8af6-83ef6bc0bb43" (UID: "bab42463-80dc-4ab5-8af6-83ef6bc0bb43"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.159672 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bab42463-80dc-4ab5-8af6-83ef6bc0bb43-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bab42463-80dc-4ab5-8af6-83ef6bc0bb43" (UID: "bab42463-80dc-4ab5-8af6-83ef6bc0bb43"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.190800 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bab42463-80dc-4ab5-8af6-83ef6bc0bb43-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.190836 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdfk5\" (UniqueName: \"kubernetes.io/projected/bab42463-80dc-4ab5-8af6-83ef6bc0bb43-kube-api-access-zdfk5\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.190848 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bab42463-80dc-4ab5-8af6-83ef6bc0bb43-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.255878 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.279428 4901 generic.go:334] "Generic (PLEG): container finished" podID="12bc7db9-7d12-48e2-b452-6d15334cd44d" containerID="c6b4f241bfe27a5294803bce75dca0ea160dd0f9e61c7319f6b26e3fa5bdf632" exitCode=0 Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.279504 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"12bc7db9-7d12-48e2-b452-6d15334cd44d","Type":"ContainerDied","Data":"c6b4f241bfe27a5294803bce75dca0ea160dd0f9e61c7319f6b26e3fa5bdf632"} Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.279534 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"12bc7db9-7d12-48e2-b452-6d15334cd44d","Type":"ContainerDied","Data":"cfad305202aba39c458bc5f173200871836afd2e0357ae8c4eb2b8534b5852a7"} Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.279555 4901 scope.go:117] "RemoveContainer" containerID="c6b4f241bfe27a5294803bce75dca0ea160dd0f9e61c7319f6b26e3fa5bdf632" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.279715 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.284234 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b62a85e7-9039-4129-9206-bd29fa676593","Type":"ContainerStarted","Data":"4998162e8592344ae0da57ef238e10726fc3d65e8f3f9407a44e3b8181c3ddf5"} Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.285710 4901 generic.go:334] "Generic (PLEG): container finished" podID="bab42463-80dc-4ab5-8af6-83ef6bc0bb43" containerID="852f8d30c831a388279063e8dd2ae055c8ae05d97ccaed50898d0d2a68402965" exitCode=0 Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.285739 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bab42463-80dc-4ab5-8af6-83ef6bc0bb43","Type":"ContainerDied","Data":"852f8d30c831a388279063e8dd2ae055c8ae05d97ccaed50898d0d2a68402965"} Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.285757 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bab42463-80dc-4ab5-8af6-83ef6bc0bb43","Type":"ContainerDied","Data":"c29afb1cdff6569cc02831a2a9dccc6c8c9bba785c65cd95ff8b88f83c15014b"} Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.285757 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.321632 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.326676 4901 scope.go:117] "RemoveContainer" containerID="6f88a25d90668bb0e82d62488d0e8b843bfcb93e5e2e53d14fe811d01a98fca4" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.342668 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.342736 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:58:52 crc kubenswrapper[4901]: E0202 10:58:52.343218 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12bc7db9-7d12-48e2-b452-6d15334cd44d" containerName="nova-api-log" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.343235 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="12bc7db9-7d12-48e2-b452-6d15334cd44d" containerName="nova-api-log" Feb 02 10:58:52 crc kubenswrapper[4901]: E0202 10:58:52.343253 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12bc7db9-7d12-48e2-b452-6d15334cd44d" containerName="nova-api-api" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.343264 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="12bc7db9-7d12-48e2-b452-6d15334cd44d" containerName="nova-api-api" Feb 02 10:58:52 crc kubenswrapper[4901]: E0202 10:58:52.343277 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bab42463-80dc-4ab5-8af6-83ef6bc0bb43" containerName="nova-scheduler-scheduler" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.343298 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="bab42463-80dc-4ab5-8af6-83ef6bc0bb43" containerName="nova-scheduler-scheduler" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.343532 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="bab42463-80dc-4ab5-8af6-83ef6bc0bb43" containerName="nova-scheduler-scheduler" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.343579 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="12bc7db9-7d12-48e2-b452-6d15334cd44d" containerName="nova-api-api" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.343594 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="12bc7db9-7d12-48e2-b452-6d15334cd44d" containerName="nova-api-log" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.344264 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.353511 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.362734 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.369507 4901 scope.go:117] "RemoveContainer" containerID="c6b4f241bfe27a5294803bce75dca0ea160dd0f9e61c7319f6b26e3fa5bdf632" Feb 02 10:58:52 crc kubenswrapper[4901]: E0202 10:58:52.375304 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6b4f241bfe27a5294803bce75dca0ea160dd0f9e61c7319f6b26e3fa5bdf632\": container with ID starting with c6b4f241bfe27a5294803bce75dca0ea160dd0f9e61c7319f6b26e3fa5bdf632 not found: ID does not exist" containerID="c6b4f241bfe27a5294803bce75dca0ea160dd0f9e61c7319f6b26e3fa5bdf632" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.375373 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6b4f241bfe27a5294803bce75dca0ea160dd0f9e61c7319f6b26e3fa5bdf632"} err="failed to get container status \"c6b4f241bfe27a5294803bce75dca0ea160dd0f9e61c7319f6b26e3fa5bdf632\": rpc error: code = NotFound desc = could not find container \"c6b4f241bfe27a5294803bce75dca0ea160dd0f9e61c7319f6b26e3fa5bdf632\": container with ID starting with c6b4f241bfe27a5294803bce75dca0ea160dd0f9e61c7319f6b26e3fa5bdf632 not found: ID does not exist" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.375405 4901 scope.go:117] "RemoveContainer" containerID="6f88a25d90668bb0e82d62488d0e8b843bfcb93e5e2e53d14fe811d01a98fca4" Feb 02 10:58:52 crc kubenswrapper[4901]: E0202 10:58:52.376006 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f88a25d90668bb0e82d62488d0e8b843bfcb93e5e2e53d14fe811d01a98fca4\": container with ID starting with 6f88a25d90668bb0e82d62488d0e8b843bfcb93e5e2e53d14fe811d01a98fca4 not found: ID does not exist" containerID="6f88a25d90668bb0e82d62488d0e8b843bfcb93e5e2e53d14fe811d01a98fca4" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.376080 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f88a25d90668bb0e82d62488d0e8b843bfcb93e5e2e53d14fe811d01a98fca4"} err="failed to get container status \"6f88a25d90668bb0e82d62488d0e8b843bfcb93e5e2e53d14fe811d01a98fca4\": rpc error: code = NotFound desc = could not find container \"6f88a25d90668bb0e82d62488d0e8b843bfcb93e5e2e53d14fe811d01a98fca4\": container with ID starting with 6f88a25d90668bb0e82d62488d0e8b843bfcb93e5e2e53d14fe811d01a98fca4 not found: ID does not exist" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.376118 4901 scope.go:117] "RemoveContainer" containerID="852f8d30c831a388279063e8dd2ae055c8ae05d97ccaed50898d0d2a68402965" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.394671 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12bc7db9-7d12-48e2-b452-6d15334cd44d-config-data\") pod \"12bc7db9-7d12-48e2-b452-6d15334cd44d\" (UID: \"12bc7db9-7d12-48e2-b452-6d15334cd44d\") " Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.394714 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wsxz\" 
(UniqueName: \"kubernetes.io/projected/12bc7db9-7d12-48e2-b452-6d15334cd44d-kube-api-access-9wsxz\") pod \"12bc7db9-7d12-48e2-b452-6d15334cd44d\" (UID: \"12bc7db9-7d12-48e2-b452-6d15334cd44d\") " Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.394742 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12bc7db9-7d12-48e2-b452-6d15334cd44d-logs\") pod \"12bc7db9-7d12-48e2-b452-6d15334cd44d\" (UID: \"12bc7db9-7d12-48e2-b452-6d15334cd44d\") " Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.394979 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12bc7db9-7d12-48e2-b452-6d15334cd44d-combined-ca-bundle\") pod \"12bc7db9-7d12-48e2-b452-6d15334cd44d\" (UID: \"12bc7db9-7d12-48e2-b452-6d15334cd44d\") " Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.395355 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d15234c6-a14d-46ac-8cc6-d41b50d18589-config-data\") pod \"nova-scheduler-0\" (UID: \"d15234c6-a14d-46ac-8cc6-d41b50d18589\") " pod="openstack/nova-scheduler-0" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.395424 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d15234c6-a14d-46ac-8cc6-d41b50d18589-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d15234c6-a14d-46ac-8cc6-d41b50d18589\") " pod="openstack/nova-scheduler-0" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.395757 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw28p\" (UniqueName: \"kubernetes.io/projected/d15234c6-a14d-46ac-8cc6-d41b50d18589-kube-api-access-gw28p\") pod \"nova-scheduler-0\" (UID: \"d15234c6-a14d-46ac-8cc6-d41b50d18589\") " pod="openstack/nova-scheduler-0" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.395954 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12bc7db9-7d12-48e2-b452-6d15334cd44d-logs" (OuterVolumeSpecName: "logs") pod "12bc7db9-7d12-48e2-b452-6d15334cd44d" (UID: "12bc7db9-7d12-48e2-b452-6d15334cd44d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.396758 4901 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12bc7db9-7d12-48e2-b452-6d15334cd44d-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.400634 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12bc7db9-7d12-48e2-b452-6d15334cd44d-kube-api-access-9wsxz" (OuterVolumeSpecName: "kube-api-access-9wsxz") pod "12bc7db9-7d12-48e2-b452-6d15334cd44d" (UID: "12bc7db9-7d12-48e2-b452-6d15334cd44d"). InnerVolumeSpecName "kube-api-access-9wsxz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.411533 4901 scope.go:117] "RemoveContainer" containerID="852f8d30c831a388279063e8dd2ae055c8ae05d97ccaed50898d0d2a68402965" Feb 02 10:58:52 crc kubenswrapper[4901]: E0202 10:58:52.412224 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"852f8d30c831a388279063e8dd2ae055c8ae05d97ccaed50898d0d2a68402965\": container with ID starting with 852f8d30c831a388279063e8dd2ae055c8ae05d97ccaed50898d0d2a68402965 not found: ID does not exist" containerID="852f8d30c831a388279063e8dd2ae055c8ae05d97ccaed50898d0d2a68402965" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.412302 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"852f8d30c831a388279063e8dd2ae055c8ae05d97ccaed50898d0d2a68402965"} err="failed to get container status \"852f8d30c831a388279063e8dd2ae055c8ae05d97ccaed50898d0d2a68402965\": rpc error: code = NotFound desc = could not find container \"852f8d30c831a388279063e8dd2ae055c8ae05d97ccaed50898d0d2a68402965\": container with ID starting with 852f8d30c831a388279063e8dd2ae055c8ae05d97ccaed50898d0d2a68402965 not found: ID does not exist" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.421442 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12bc7db9-7d12-48e2-b452-6d15334cd44d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12bc7db9-7d12-48e2-b452-6d15334cd44d" (UID: "12bc7db9-7d12-48e2-b452-6d15334cd44d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.421956 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12bc7db9-7d12-48e2-b452-6d15334cd44d-config-data" (OuterVolumeSpecName: "config-data") pod "12bc7db9-7d12-48e2-b452-6d15334cd44d" (UID: "12bc7db9-7d12-48e2-b452-6d15334cd44d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.502443 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d15234c6-a14d-46ac-8cc6-d41b50d18589-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d15234c6-a14d-46ac-8cc6-d41b50d18589\") " pod="openstack/nova-scheduler-0" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.502552 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gw28p\" (UniqueName: \"kubernetes.io/projected/d15234c6-a14d-46ac-8cc6-d41b50d18589-kube-api-access-gw28p\") pod \"nova-scheduler-0\" (UID: \"d15234c6-a14d-46ac-8cc6-d41b50d18589\") " pod="openstack/nova-scheduler-0" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.502682 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d15234c6-a14d-46ac-8cc6-d41b50d18589-config-data\") pod \"nova-scheduler-0\" (UID: \"d15234c6-a14d-46ac-8cc6-d41b50d18589\") " pod="openstack/nova-scheduler-0" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.502757 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12bc7db9-7d12-48e2-b452-6d15334cd44d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.502769 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12bc7db9-7d12-48e2-b452-6d15334cd44d-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.502781 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wsxz\" (UniqueName: \"kubernetes.io/projected/12bc7db9-7d12-48e2-b452-6d15334cd44d-kube-api-access-9wsxz\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.507971 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d15234c6-a14d-46ac-8cc6-d41b50d18589-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d15234c6-a14d-46ac-8cc6-d41b50d18589\") " pod="openstack/nova-scheduler-0" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.508269 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d15234c6-a14d-46ac-8cc6-d41b50d18589-config-data\") pod \"nova-scheduler-0\" (UID: \"d15234c6-a14d-46ac-8cc6-d41b50d18589\") " pod="openstack/nova-scheduler-0" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.519553 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw28p\" (UniqueName: \"kubernetes.io/projected/d15234c6-a14d-46ac-8cc6-d41b50d18589-kube-api-access-gw28p\") pod \"nova-scheduler-0\" (UID: \"d15234c6-a14d-46ac-8cc6-d41b50d18589\") " pod="openstack/nova-scheduler-0" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.600270 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.606657 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.623095 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:58:52 crc 
kubenswrapper[4901]: I0202 10:58:52.640094 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.654782 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.657951 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.667098 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.671435 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.691474 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.709032 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d26f81f8-fdb8-492f-b692-52404df314ec-config-data\") pod \"nova-api-0\" (UID: \"d26f81f8-fdb8-492f-b692-52404df314ec\") " pod="openstack/nova-api-0" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.709099 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d26f81f8-fdb8-492f-b692-52404df314ec-logs\") pod \"nova-api-0\" (UID: \"d26f81f8-fdb8-492f-b692-52404df314ec\") " pod="openstack/nova-api-0" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.709164 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbxxp\" (UniqueName: \"kubernetes.io/projected/d26f81f8-fdb8-492f-b692-52404df314ec-kube-api-access-cbxxp\") pod \"nova-api-0\" (UID: \"d26f81f8-fdb8-492f-b692-52404df314ec\") " pod="openstack/nova-api-0" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.709212 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d26f81f8-fdb8-492f-b692-52404df314ec-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d26f81f8-fdb8-492f-b692-52404df314ec\") " pod="openstack/nova-api-0" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.811723 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d26f81f8-fdb8-492f-b692-52404df314ec-config-data\") pod \"nova-api-0\" (UID: \"d26f81f8-fdb8-492f-b692-52404df314ec\") " pod="openstack/nova-api-0" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.812274 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d26f81f8-fdb8-492f-b692-52404df314ec-logs\") pod \"nova-api-0\" (UID: \"d26f81f8-fdb8-492f-b692-52404df314ec\") " pod="openstack/nova-api-0" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.812334 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbxxp\" (UniqueName: \"kubernetes.io/projected/d26f81f8-fdb8-492f-b692-52404df314ec-kube-api-access-cbxxp\") pod \"nova-api-0\" (UID: \"d26f81f8-fdb8-492f-b692-52404df314ec\") " pod="openstack/nova-api-0" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.812391 4901 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d26f81f8-fdb8-492f-b692-52404df314ec-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d26f81f8-fdb8-492f-b692-52404df314ec\") " pod="openstack/nova-api-0" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.813390 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d26f81f8-fdb8-492f-b692-52404df314ec-logs\") pod \"nova-api-0\" (UID: \"d26f81f8-fdb8-492f-b692-52404df314ec\") " pod="openstack/nova-api-0" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.820500 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d26f81f8-fdb8-492f-b692-52404df314ec-config-data\") pod \"nova-api-0\" (UID: \"d26f81f8-fdb8-492f-b692-52404df314ec\") " pod="openstack/nova-api-0" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.835047 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d26f81f8-fdb8-492f-b692-52404df314ec-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d26f81f8-fdb8-492f-b692-52404df314ec\") " pod="openstack/nova-api-0" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.838865 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbxxp\" (UniqueName: \"kubernetes.io/projected/d26f81f8-fdb8-492f-b692-52404df314ec-kube-api-access-cbxxp\") pod \"nova-api-0\" (UID: \"d26f81f8-fdb8-492f-b692-52404df314ec\") " pod="openstack/nova-api-0" Feb 02 10:58:52 crc kubenswrapper[4901]: I0202 10:58:52.984199 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 02 10:58:53 crc kubenswrapper[4901]: I0202 10:58:53.165201 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:58:53 crc kubenswrapper[4901]: I0202 10:58:53.298953 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d15234c6-a14d-46ac-8cc6-d41b50d18589","Type":"ContainerStarted","Data":"4b96f89c183fbf97cbda15fde818aa75132e00a38172a554b1cf73221bc49f72"} Feb 02 10:58:53 crc kubenswrapper[4901]: I0202 10:58:53.305894 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b62a85e7-9039-4129-9206-bd29fa676593","Type":"ContainerStarted","Data":"c4967bb6c487c173b8622da01d79622af7ea7040e44dd4fcad5652f4c52dd5d4"} Feb 02 10:58:53 crc kubenswrapper[4901]: I0202 10:58:53.526440 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:58:53 crc kubenswrapper[4901]: W0202 10:58:53.531641 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd26f81f8_fdb8_492f_b692_52404df314ec.slice/crio-a95cf366e5839804dabf598425fcaccee520ebfff51df12b322530057524c8d9 WatchSource:0}: Error finding container a95cf366e5839804dabf598425fcaccee520ebfff51df12b322530057524c8d9: Status 404 returned error can't find the container with id a95cf366e5839804dabf598425fcaccee520ebfff51df12b322530057524c8d9 Feb 02 10:58:53 crc kubenswrapper[4901]: I0202 10:58:53.689955 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12bc7db9-7d12-48e2-b452-6d15334cd44d" path="/var/lib/kubelet/pods/12bc7db9-7d12-48e2-b452-6d15334cd44d/volumes" Feb 02 10:58:53 crc kubenswrapper[4901]: I0202 10:58:53.690811 
4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bab42463-80dc-4ab5-8af6-83ef6bc0bb43" path="/var/lib/kubelet/pods/bab42463-80dc-4ab5-8af6-83ef6bc0bb43/volumes" Feb 02 10:58:54 crc kubenswrapper[4901]: I0202 10:58:54.322430 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d15234c6-a14d-46ac-8cc6-d41b50d18589","Type":"ContainerStarted","Data":"a992aa592b005f987cb023606d6ef03dc8a09c6d9931f48b50eef1b22ae07026"} Feb 02 10:58:54 crc kubenswrapper[4901]: I0202 10:58:54.325095 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d26f81f8-fdb8-492f-b692-52404df314ec","Type":"ContainerStarted","Data":"9637badc9b1ea0eae14da6646b53c6d4785258b8e2c62916c9eff0a0bb0521a9"} Feb 02 10:58:54 crc kubenswrapper[4901]: I0202 10:58:54.325190 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d26f81f8-fdb8-492f-b692-52404df314ec","Type":"ContainerStarted","Data":"457df3a655920aba5010d7d517be1313c09d9a66c66929f55012617b1d98eed5"} Feb 02 10:58:54 crc kubenswrapper[4901]: I0202 10:58:54.325217 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d26f81f8-fdb8-492f-b692-52404df314ec","Type":"ContainerStarted","Data":"a95cf366e5839804dabf598425fcaccee520ebfff51df12b322530057524c8d9"} Feb 02 10:58:54 crc kubenswrapper[4901]: I0202 10:58:54.328303 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b62a85e7-9039-4129-9206-bd29fa676593","Type":"ContainerStarted","Data":"cfd206826f515d93b3a1b8aa85f7d2361746d202a6de6fffa7a564807b09557e"} Feb 02 10:58:54 crc kubenswrapper[4901]: I0202 10:58:54.362497 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.362477588 podStartE2EDuration="2.362477588s" podCreationTimestamp="2026-02-02 10:58:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:58:54.345021278 +0000 UTC m=+1221.363361374" watchObservedRunningTime="2026-02-02 10:58:54.362477588 +0000 UTC m=+1221.380817684" Feb 02 10:58:54 crc kubenswrapper[4901]: I0202 10:58:54.379286 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.37926332 podStartE2EDuration="2.37926332s" podCreationTimestamp="2026-02-02 10:58:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:58:54.369070889 +0000 UTC m=+1221.387411075" watchObservedRunningTime="2026-02-02 10:58:54.37926332 +0000 UTC m=+1221.397603416" Feb 02 10:58:56 crc kubenswrapper[4901]: I0202 10:58:56.349651 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b62a85e7-9039-4129-9206-bd29fa676593","Type":"ContainerStarted","Data":"c74f66946090cebe8b9a5f0e78b176b3ff59666fa0d01cf7b1e3b88de4d7c7d1"} Feb 02 10:58:56 crc kubenswrapper[4901]: I0202 10:58:56.350211 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 02 10:58:56 crc kubenswrapper[4901]: I0202 10:58:56.384189 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.141285067 podStartE2EDuration="6.384169352s" podCreationTimestamp="2026-02-02 10:58:50 +0000 UTC" 
firstStartedPulling="2026-02-02 10:58:51.127726418 +0000 UTC m=+1218.146066524" lastFinishedPulling="2026-02-02 10:58:55.370610713 +0000 UTC m=+1222.388950809" observedRunningTime="2026-02-02 10:58:56.375289394 +0000 UTC m=+1223.393629490" watchObservedRunningTime="2026-02-02 10:58:56.384169352 +0000 UTC m=+1223.402509448" Feb 02 10:58:57 crc kubenswrapper[4901]: I0202 10:58:57.599320 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 02 10:58:57 crc kubenswrapper[4901]: I0202 10:58:57.599602 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 02 10:58:57 crc kubenswrapper[4901]: I0202 10:58:57.656010 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 02 10:58:57 crc kubenswrapper[4901]: I0202 10:58:57.671964 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 02 10:58:58 crc kubenswrapper[4901]: I0202 10:58:58.613753 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4825fb51-1602-4350-89ed-9bc8cea66c2c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 02 10:58:58 crc kubenswrapper[4901]: I0202 10:58:58.613765 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4825fb51-1602-4350-89ed-9bc8cea66c2c" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 02 10:59:02 crc kubenswrapper[4901]: I0202 10:59:02.672547 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 02 10:59:02 crc kubenswrapper[4901]: I0202 10:59:02.702241 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 02 10:59:02 crc kubenswrapper[4901]: I0202 10:59:02.985802 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 02 10:59:02 crc kubenswrapper[4901]: I0202 10:59:02.985867 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 02 10:59:03 crc kubenswrapper[4901]: I0202 10:59:03.450340 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 02 10:59:04 crc kubenswrapper[4901]: I0202 10:59:04.069765 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d26f81f8-fdb8-492f-b692-52404df314ec" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.210:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 02 10:59:04 crc kubenswrapper[4901]: I0202 10:59:04.069789 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d26f81f8-fdb8-492f-b692-52404df314ec" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.210:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 02 10:59:07 crc kubenswrapper[4901]: I0202 10:59:07.606720 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 02 10:59:07 crc kubenswrapper[4901]: I0202 
10:59:07.607152 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 02 10:59:07 crc kubenswrapper[4901]: I0202 10:59:07.612289 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 02 10:59:07 crc kubenswrapper[4901]: I0202 10:59:07.613132 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 02 10:59:07 crc kubenswrapper[4901]: I0202 10:59:07.837360 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:59:07 crc kubenswrapper[4901]: I0202 10:59:07.837420 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:59:10 crc kubenswrapper[4901]: I0202 10:59:10.507776 4901 generic.go:334] "Generic (PLEG): container finished" podID="2700af84-ba53-4dbe-970b-13c8f398461e" containerID="05912d87d9bd02f1cc666a84703c2c5e0534df3ce47a1a7c4fdcccab3e79f089" exitCode=137 Feb 02 10:59:10 crc kubenswrapper[4901]: I0202 10:59:10.507907 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2700af84-ba53-4dbe-970b-13c8f398461e","Type":"ContainerDied","Data":"05912d87d9bd02f1cc666a84703c2c5e0534df3ce47a1a7c4fdcccab3e79f089"} Feb 02 10:59:10 crc kubenswrapper[4901]: I0202 10:59:10.508334 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2700af84-ba53-4dbe-970b-13c8f398461e","Type":"ContainerDied","Data":"e13b828bf74021888929822d188ba1348f183d62b4110c90b6c59662ec6dd0f0"} Feb 02 10:59:10 crc kubenswrapper[4901]: I0202 10:59:10.508355 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e13b828bf74021888929822d188ba1348f183d62b4110c90b6c59662ec6dd0f0" Feb 02 10:59:10 crc kubenswrapper[4901]: I0202 10:59:10.520658 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:59:10 crc kubenswrapper[4901]: I0202 10:59:10.604741 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4dgd\" (UniqueName: \"kubernetes.io/projected/2700af84-ba53-4dbe-970b-13c8f398461e-kube-api-access-d4dgd\") pod \"2700af84-ba53-4dbe-970b-13c8f398461e\" (UID: \"2700af84-ba53-4dbe-970b-13c8f398461e\") " Feb 02 10:59:10 crc kubenswrapper[4901]: I0202 10:59:10.604871 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2700af84-ba53-4dbe-970b-13c8f398461e-combined-ca-bundle\") pod \"2700af84-ba53-4dbe-970b-13c8f398461e\" (UID: \"2700af84-ba53-4dbe-970b-13c8f398461e\") " Feb 02 10:59:10 crc kubenswrapper[4901]: I0202 10:59:10.604935 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2700af84-ba53-4dbe-970b-13c8f398461e-config-data\") pod \"2700af84-ba53-4dbe-970b-13c8f398461e\" (UID: \"2700af84-ba53-4dbe-970b-13c8f398461e\") " Feb 02 10:59:10 crc kubenswrapper[4901]: I0202 10:59:10.612714 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2700af84-ba53-4dbe-970b-13c8f398461e-kube-api-access-d4dgd" (OuterVolumeSpecName: "kube-api-access-d4dgd") pod "2700af84-ba53-4dbe-970b-13c8f398461e" (UID: "2700af84-ba53-4dbe-970b-13c8f398461e"). InnerVolumeSpecName "kube-api-access-d4dgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:59:10 crc kubenswrapper[4901]: I0202 10:59:10.639698 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2700af84-ba53-4dbe-970b-13c8f398461e-config-data" (OuterVolumeSpecName: "config-data") pod "2700af84-ba53-4dbe-970b-13c8f398461e" (UID: "2700af84-ba53-4dbe-970b-13c8f398461e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:59:10 crc kubenswrapper[4901]: I0202 10:59:10.642113 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2700af84-ba53-4dbe-970b-13c8f398461e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2700af84-ba53-4dbe-970b-13c8f398461e" (UID: "2700af84-ba53-4dbe-970b-13c8f398461e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:59:10 crc kubenswrapper[4901]: I0202 10:59:10.708364 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4dgd\" (UniqueName: \"kubernetes.io/projected/2700af84-ba53-4dbe-970b-13c8f398461e-kube-api-access-d4dgd\") on node \"crc\" DevicePath \"\"" Feb 02 10:59:10 crc kubenswrapper[4901]: I0202 10:59:10.708397 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2700af84-ba53-4dbe-970b-13c8f398461e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:59:10 crc kubenswrapper[4901]: I0202 10:59:10.708407 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2700af84-ba53-4dbe-970b-13c8f398461e-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:59:11 crc kubenswrapper[4901]: I0202 10:59:11.519755 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:59:11 crc kubenswrapper[4901]: I0202 10:59:11.588223 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 10:59:11 crc kubenswrapper[4901]: I0202 10:59:11.602016 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 10:59:11 crc kubenswrapper[4901]: I0202 10:59:11.617771 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 10:59:11 crc kubenswrapper[4901]: E0202 10:59:11.618422 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2700af84-ba53-4dbe-970b-13c8f398461e" containerName="nova-cell1-novncproxy-novncproxy" Feb 02 10:59:11 crc kubenswrapper[4901]: I0202 10:59:11.618446 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="2700af84-ba53-4dbe-970b-13c8f398461e" containerName="nova-cell1-novncproxy-novncproxy" Feb 02 10:59:11 crc kubenswrapper[4901]: I0202 10:59:11.618727 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="2700af84-ba53-4dbe-970b-13c8f398461e" containerName="nova-cell1-novncproxy-novncproxy" Feb 02 10:59:11 crc kubenswrapper[4901]: I0202 10:59:11.619664 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:59:11 crc kubenswrapper[4901]: I0202 10:59:11.626065 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Feb 02 10:59:11 crc kubenswrapper[4901]: I0202 10:59:11.626300 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Feb 02 10:59:11 crc kubenswrapper[4901]: I0202 10:59:11.626762 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 02 10:59:11 crc kubenswrapper[4901]: I0202 10:59:11.633328 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 10:59:11 crc kubenswrapper[4901]: I0202 10:59:11.694209 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2700af84-ba53-4dbe-970b-13c8f398461e" path="/var/lib/kubelet/pods/2700af84-ba53-4dbe-970b-13c8f398461e/volumes" Feb 02 10:59:11 crc kubenswrapper[4901]: I0202 10:59:11.730414 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/24d88cb1-d10e-4ade-91f2-48d3ed40f873-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"24d88cb1-d10e-4ade-91f2-48d3ed40f873\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:59:11 crc kubenswrapper[4901]: I0202 10:59:11.730756 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24d88cb1-d10e-4ade-91f2-48d3ed40f873-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"24d88cb1-d10e-4ade-91f2-48d3ed40f873\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:59:11 crc kubenswrapper[4901]: I0202 10:59:11.731923 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24d88cb1-d10e-4ade-91f2-48d3ed40f873-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"24d88cb1-d10e-4ade-91f2-48d3ed40f873\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:59:11 crc kubenswrapper[4901]: 
I0202 10:59:11.732129 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/24d88cb1-d10e-4ade-91f2-48d3ed40f873-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"24d88cb1-d10e-4ade-91f2-48d3ed40f873\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:59:11 crc kubenswrapper[4901]: I0202 10:59:11.732171 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5cww\" (UniqueName: \"kubernetes.io/projected/24d88cb1-d10e-4ade-91f2-48d3ed40f873-kube-api-access-d5cww\") pod \"nova-cell1-novncproxy-0\" (UID: \"24d88cb1-d10e-4ade-91f2-48d3ed40f873\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:59:11 crc kubenswrapper[4901]: I0202 10:59:11.834031 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/24d88cb1-d10e-4ade-91f2-48d3ed40f873-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"24d88cb1-d10e-4ade-91f2-48d3ed40f873\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:59:11 crc kubenswrapper[4901]: I0202 10:59:11.834148 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24d88cb1-d10e-4ade-91f2-48d3ed40f873-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"24d88cb1-d10e-4ade-91f2-48d3ed40f873\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:59:11 crc kubenswrapper[4901]: I0202 10:59:11.834239 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24d88cb1-d10e-4ade-91f2-48d3ed40f873-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"24d88cb1-d10e-4ade-91f2-48d3ed40f873\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:59:11 crc kubenswrapper[4901]: I0202 10:59:11.834285 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/24d88cb1-d10e-4ade-91f2-48d3ed40f873-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"24d88cb1-d10e-4ade-91f2-48d3ed40f873\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:59:11 crc kubenswrapper[4901]: I0202 10:59:11.834311 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5cww\" (UniqueName: \"kubernetes.io/projected/24d88cb1-d10e-4ade-91f2-48d3ed40f873-kube-api-access-d5cww\") pod \"nova-cell1-novncproxy-0\" (UID: \"24d88cb1-d10e-4ade-91f2-48d3ed40f873\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:59:11 crc kubenswrapper[4901]: I0202 10:59:11.841280 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24d88cb1-d10e-4ade-91f2-48d3ed40f873-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"24d88cb1-d10e-4ade-91f2-48d3ed40f873\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:59:11 crc kubenswrapper[4901]: I0202 10:59:11.842463 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/24d88cb1-d10e-4ade-91f2-48d3ed40f873-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"24d88cb1-d10e-4ade-91f2-48d3ed40f873\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:59:11 crc kubenswrapper[4901]: I0202 10:59:11.846162 4901 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/24d88cb1-d10e-4ade-91f2-48d3ed40f873-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"24d88cb1-d10e-4ade-91f2-48d3ed40f873\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:59:11 crc kubenswrapper[4901]: I0202 10:59:11.846282 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24d88cb1-d10e-4ade-91f2-48d3ed40f873-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"24d88cb1-d10e-4ade-91f2-48d3ed40f873\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:59:11 crc kubenswrapper[4901]: I0202 10:59:11.855795 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5cww\" (UniqueName: \"kubernetes.io/projected/24d88cb1-d10e-4ade-91f2-48d3ed40f873-kube-api-access-d5cww\") pod \"nova-cell1-novncproxy-0\" (UID: \"24d88cb1-d10e-4ade-91f2-48d3ed40f873\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:59:11 crc kubenswrapper[4901]: I0202 10:59:11.946536 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:59:12 crc kubenswrapper[4901]: I0202 10:59:12.436756 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 10:59:12 crc kubenswrapper[4901]: W0202 10:59:12.441450 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24d88cb1_d10e_4ade_91f2_48d3ed40f873.slice/crio-8552cbace3e4c12214fd3713040a1390f3736f18f1808a13a3fcf1c5a2c9bb0d WatchSource:0}: Error finding container 8552cbace3e4c12214fd3713040a1390f3736f18f1808a13a3fcf1c5a2c9bb0d: Status 404 returned error can't find the container with id 8552cbace3e4c12214fd3713040a1390f3736f18f1808a13a3fcf1c5a2c9bb0d Feb 02 10:59:12 crc kubenswrapper[4901]: I0202 10:59:12.538360 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"24d88cb1-d10e-4ade-91f2-48d3ed40f873","Type":"ContainerStarted","Data":"8552cbace3e4c12214fd3713040a1390f3736f18f1808a13a3fcf1c5a2c9bb0d"} Feb 02 10:59:12 crc kubenswrapper[4901]: I0202 10:59:12.989554 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 02 10:59:12 crc kubenswrapper[4901]: I0202 10:59:12.990010 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 02 10:59:12 crc kubenswrapper[4901]: I0202 10:59:12.990239 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 02 10:59:12 crc kubenswrapper[4901]: I0202 10:59:12.990444 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 02 10:59:12 crc kubenswrapper[4901]: I0202 10:59:12.992398 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 02 10:59:12 crc kubenswrapper[4901]: I0202 10:59:12.992955 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 02 10:59:13 crc kubenswrapper[4901]: I0202 10:59:13.247036 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-8p6jl"] Feb 02 10:59:13 crc kubenswrapper[4901]: I0202 10:59:13.249060 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f84f9ccf-8p6jl" Feb 02 10:59:13 crc kubenswrapper[4901]: I0202 10:59:13.264928 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-8p6jl"] Feb 02 10:59:13 crc kubenswrapper[4901]: I0202 10:59:13.372774 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66-config\") pod \"dnsmasq-dns-f84f9ccf-8p6jl\" (UID: \"3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66\") " pod="openstack/dnsmasq-dns-f84f9ccf-8p6jl" Feb 02 10:59:13 crc kubenswrapper[4901]: I0202 10:59:13.372939 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66-ovsdbserver-sb\") pod \"dnsmasq-dns-f84f9ccf-8p6jl\" (UID: \"3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66\") " pod="openstack/dnsmasq-dns-f84f9ccf-8p6jl" Feb 02 10:59:13 crc kubenswrapper[4901]: I0202 10:59:13.372973 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66-dns-swift-storage-0\") pod \"dnsmasq-dns-f84f9ccf-8p6jl\" (UID: \"3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66\") " pod="openstack/dnsmasq-dns-f84f9ccf-8p6jl" Feb 02 10:59:13 crc kubenswrapper[4901]: I0202 10:59:13.372995 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmvw7\" (UniqueName: \"kubernetes.io/projected/3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66-kube-api-access-bmvw7\") pod \"dnsmasq-dns-f84f9ccf-8p6jl\" (UID: \"3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66\") " pod="openstack/dnsmasq-dns-f84f9ccf-8p6jl" Feb 02 10:59:13 crc kubenswrapper[4901]: I0202 10:59:13.373279 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66-ovsdbserver-nb\") pod \"dnsmasq-dns-f84f9ccf-8p6jl\" (UID: \"3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66\") " pod="openstack/dnsmasq-dns-f84f9ccf-8p6jl" Feb 02 10:59:13 crc kubenswrapper[4901]: I0202 10:59:13.373637 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66-dns-svc\") pod \"dnsmasq-dns-f84f9ccf-8p6jl\" (UID: \"3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66\") " pod="openstack/dnsmasq-dns-f84f9ccf-8p6jl" Feb 02 10:59:13 crc kubenswrapper[4901]: I0202 10:59:13.475432 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66-ovsdbserver-nb\") pod \"dnsmasq-dns-f84f9ccf-8p6jl\" (UID: \"3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66\") " pod="openstack/dnsmasq-dns-f84f9ccf-8p6jl" Feb 02 10:59:13 crc kubenswrapper[4901]: I0202 10:59:13.475537 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66-dns-svc\") pod \"dnsmasq-dns-f84f9ccf-8p6jl\" (UID: \"3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66\") " pod="openstack/dnsmasq-dns-f84f9ccf-8p6jl" Feb 02 10:59:13 crc kubenswrapper[4901]: I0202 10:59:13.475617 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66-config\") pod \"dnsmasq-dns-f84f9ccf-8p6jl\" (UID: \"3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66\") " pod="openstack/dnsmasq-dns-f84f9ccf-8p6jl" Feb 02 10:59:13 crc kubenswrapper[4901]: I0202 10:59:13.475661 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66-ovsdbserver-sb\") pod \"dnsmasq-dns-f84f9ccf-8p6jl\" (UID: \"3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66\") " pod="openstack/dnsmasq-dns-f84f9ccf-8p6jl" Feb 02 10:59:13 crc kubenswrapper[4901]: I0202 10:59:13.475689 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66-dns-swift-storage-0\") pod \"dnsmasq-dns-f84f9ccf-8p6jl\" (UID: \"3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66\") " pod="openstack/dnsmasq-dns-f84f9ccf-8p6jl" Feb 02 10:59:13 crc kubenswrapper[4901]: I0202 10:59:13.475712 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmvw7\" (UniqueName: \"kubernetes.io/projected/3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66-kube-api-access-bmvw7\") pod \"dnsmasq-dns-f84f9ccf-8p6jl\" (UID: \"3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66\") " pod="openstack/dnsmasq-dns-f84f9ccf-8p6jl" Feb 02 10:59:13 crc kubenswrapper[4901]: I0202 10:59:13.476642 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66-ovsdbserver-nb\") pod \"dnsmasq-dns-f84f9ccf-8p6jl\" (UID: \"3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66\") " pod="openstack/dnsmasq-dns-f84f9ccf-8p6jl" Feb 02 10:59:13 crc kubenswrapper[4901]: I0202 10:59:13.476693 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66-dns-svc\") pod \"dnsmasq-dns-f84f9ccf-8p6jl\" (UID: \"3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66\") " pod="openstack/dnsmasq-dns-f84f9ccf-8p6jl" Feb 02 10:59:13 crc kubenswrapper[4901]: I0202 10:59:13.476745 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66-config\") pod \"dnsmasq-dns-f84f9ccf-8p6jl\" (UID: \"3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66\") " pod="openstack/dnsmasq-dns-f84f9ccf-8p6jl" Feb 02 10:59:13 crc kubenswrapper[4901]: I0202 10:59:13.476883 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66-ovsdbserver-sb\") pod \"dnsmasq-dns-f84f9ccf-8p6jl\" (UID: \"3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66\") " pod="openstack/dnsmasq-dns-f84f9ccf-8p6jl" Feb 02 10:59:13 crc kubenswrapper[4901]: I0202 10:59:13.477439 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66-dns-swift-storage-0\") pod \"dnsmasq-dns-f84f9ccf-8p6jl\" (UID: \"3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66\") " pod="openstack/dnsmasq-dns-f84f9ccf-8p6jl" Feb 02 10:59:13 crc kubenswrapper[4901]: I0202 10:59:13.497849 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmvw7\" (UniqueName: \"kubernetes.io/projected/3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66-kube-api-access-bmvw7\") pod 
\"dnsmasq-dns-f84f9ccf-8p6jl\" (UID: \"3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66\") " pod="openstack/dnsmasq-dns-f84f9ccf-8p6jl" Feb 02 10:59:13 crc kubenswrapper[4901]: I0202 10:59:13.578111 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f84f9ccf-8p6jl" Feb 02 10:59:13 crc kubenswrapper[4901]: I0202 10:59:13.587388 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"24d88cb1-d10e-4ade-91f2-48d3ed40f873","Type":"ContainerStarted","Data":"0c84390bfe588ac52bb22a4bf0baa87abb3fb93f25816b50b787651edfa5831c"} Feb 02 10:59:13 crc kubenswrapper[4901]: I0202 10:59:13.605177 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.605153422 podStartE2EDuration="2.605153422s" podCreationTimestamp="2026-02-02 10:59:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:59:13.604313211 +0000 UTC m=+1240.622653307" watchObservedRunningTime="2026-02-02 10:59:13.605153422 +0000 UTC m=+1240.623493518" Feb 02 10:59:14 crc kubenswrapper[4901]: I0202 10:59:14.096661 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-8p6jl"] Feb 02 10:59:14 crc kubenswrapper[4901]: W0202 10:59:14.096783 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b18aeb4_f6a2_4bd0_8fa0_1cba2f986c66.slice/crio-e3e510a2b500976500995a9655a09783a538321ef031da3117ba8051ea39ea46 WatchSource:0}: Error finding container e3e510a2b500976500995a9655a09783a538321ef031da3117ba8051ea39ea46: Status 404 returned error can't find the container with id e3e510a2b500976500995a9655a09783a538321ef031da3117ba8051ea39ea46 Feb 02 10:59:14 crc kubenswrapper[4901]: I0202 10:59:14.595640 4901 generic.go:334] "Generic (PLEG): container finished" podID="3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66" containerID="69fd824466b8f659e0de1a9dffceb3cceab63672a236a5fdbaede31271bc812a" exitCode=0 Feb 02 10:59:14 crc kubenswrapper[4901]: I0202 10:59:14.595836 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-8p6jl" event={"ID":"3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66","Type":"ContainerDied","Data":"69fd824466b8f659e0de1a9dffceb3cceab63672a236a5fdbaede31271bc812a"} Feb 02 10:59:14 crc kubenswrapper[4901]: I0202 10:59:14.596898 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-8p6jl" event={"ID":"3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66","Type":"ContainerStarted","Data":"e3e510a2b500976500995a9655a09783a538321ef031da3117ba8051ea39ea46"} Feb 02 10:59:15 crc kubenswrapper[4901]: I0202 10:59:15.330416 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:59:15 crc kubenswrapper[4901]: I0202 10:59:15.331227 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b62a85e7-9039-4129-9206-bd29fa676593" containerName="ceilometer-central-agent" containerID="cri-o://4998162e8592344ae0da57ef238e10726fc3d65e8f3f9407a44e3b8181c3ddf5" gracePeriod=30 Feb 02 10:59:15 crc kubenswrapper[4901]: I0202 10:59:15.331353 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b62a85e7-9039-4129-9206-bd29fa676593" containerName="proxy-httpd" 
containerID="cri-o://c74f66946090cebe8b9a5f0e78b176b3ff59666fa0d01cf7b1e3b88de4d7c7d1" gracePeriod=30 Feb 02 10:59:15 crc kubenswrapper[4901]: I0202 10:59:15.331401 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b62a85e7-9039-4129-9206-bd29fa676593" containerName="sg-core" containerID="cri-o://cfd206826f515d93b3a1b8aa85f7d2361746d202a6de6fffa7a564807b09557e" gracePeriod=30 Feb 02 10:59:15 crc kubenswrapper[4901]: I0202 10:59:15.331442 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b62a85e7-9039-4129-9206-bd29fa676593" containerName="ceilometer-notification-agent" containerID="cri-o://c4967bb6c487c173b8622da01d79622af7ea7040e44dd4fcad5652f4c52dd5d4" gracePeriod=30 Feb 02 10:59:15 crc kubenswrapper[4901]: I0202 10:59:15.343265 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="b62a85e7-9039-4129-9206-bd29fa676593" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.208:3000/\": EOF" Feb 02 10:59:15 crc kubenswrapper[4901]: I0202 10:59:15.610590 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-8p6jl" event={"ID":"3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66","Type":"ContainerStarted","Data":"a1d82a38a53c6876ac9b06cdddf10117c78cf4b093c7c0f3e06ab4e6613680a2"} Feb 02 10:59:15 crc kubenswrapper[4901]: I0202 10:59:15.611086 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f84f9ccf-8p6jl" Feb 02 10:59:15 crc kubenswrapper[4901]: I0202 10:59:15.613395 4901 generic.go:334] "Generic (PLEG): container finished" podID="b62a85e7-9039-4129-9206-bd29fa676593" containerID="c74f66946090cebe8b9a5f0e78b176b3ff59666fa0d01cf7b1e3b88de4d7c7d1" exitCode=0 Feb 02 10:59:15 crc kubenswrapper[4901]: I0202 10:59:15.613418 4901 generic.go:334] "Generic (PLEG): container finished" podID="b62a85e7-9039-4129-9206-bd29fa676593" containerID="cfd206826f515d93b3a1b8aa85f7d2361746d202a6de6fffa7a564807b09557e" exitCode=2 Feb 02 10:59:15 crc kubenswrapper[4901]: I0202 10:59:15.613435 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b62a85e7-9039-4129-9206-bd29fa676593","Type":"ContainerDied","Data":"c74f66946090cebe8b9a5f0e78b176b3ff59666fa0d01cf7b1e3b88de4d7c7d1"} Feb 02 10:59:15 crc kubenswrapper[4901]: I0202 10:59:15.613457 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b62a85e7-9039-4129-9206-bd29fa676593","Type":"ContainerDied","Data":"cfd206826f515d93b3a1b8aa85f7d2361746d202a6de6fffa7a564807b09557e"} Feb 02 10:59:15 crc kubenswrapper[4901]: I0202 10:59:15.646006 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f84f9ccf-8p6jl" podStartSLOduration=2.6459833980000003 podStartE2EDuration="2.645983398s" podCreationTimestamp="2026-02-02 10:59:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:59:15.643116636 +0000 UTC m=+1242.661456762" watchObservedRunningTime="2026-02-02 10:59:15.645983398 +0000 UTC m=+1242.664323494" Feb 02 10:59:15 crc kubenswrapper[4901]: I0202 10:59:15.821099 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:59:15 crc kubenswrapper[4901]: I0202 10:59:15.821778 4901 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-api-0" podUID="d26f81f8-fdb8-492f-b692-52404df314ec" containerName="nova-api-api" containerID="cri-o://9637badc9b1ea0eae14da6646b53c6d4785258b8e2c62916c9eff0a0bb0521a9" gracePeriod=30 Feb 02 10:59:15 crc kubenswrapper[4901]: I0202 10:59:15.821648 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d26f81f8-fdb8-492f-b692-52404df314ec" containerName="nova-api-log" containerID="cri-o://457df3a655920aba5010d7d517be1313c09d9a66c66929f55012617b1d98eed5" gracePeriod=30 Feb 02 10:59:16 crc kubenswrapper[4901]: I0202 10:59:16.624365 4901 generic.go:334] "Generic (PLEG): container finished" podID="d26f81f8-fdb8-492f-b692-52404df314ec" containerID="457df3a655920aba5010d7d517be1313c09d9a66c66929f55012617b1d98eed5" exitCode=143 Feb 02 10:59:16 crc kubenswrapper[4901]: I0202 10:59:16.624487 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d26f81f8-fdb8-492f-b692-52404df314ec","Type":"ContainerDied","Data":"457df3a655920aba5010d7d517be1313c09d9a66c66929f55012617b1d98eed5"} Feb 02 10:59:16 crc kubenswrapper[4901]: I0202 10:59:16.627721 4901 generic.go:334] "Generic (PLEG): container finished" podID="b62a85e7-9039-4129-9206-bd29fa676593" containerID="4998162e8592344ae0da57ef238e10726fc3d65e8f3f9407a44e3b8181c3ddf5" exitCode=0 Feb 02 10:59:16 crc kubenswrapper[4901]: I0202 10:59:16.627771 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b62a85e7-9039-4129-9206-bd29fa676593","Type":"ContainerDied","Data":"4998162e8592344ae0da57ef238e10726fc3d65e8f3f9407a44e3b8181c3ddf5"} Feb 02 10:59:16 crc kubenswrapper[4901]: I0202 10:59:16.947108 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.206296 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.361085 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b62a85e7-9039-4129-9206-bd29fa676593-combined-ca-bundle\") pod \"b62a85e7-9039-4129-9206-bd29fa676593\" (UID: \"b62a85e7-9039-4129-9206-bd29fa676593\") " Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.361190 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b62a85e7-9039-4129-9206-bd29fa676593-scripts\") pod \"b62a85e7-9039-4129-9206-bd29fa676593\" (UID: \"b62a85e7-9039-4129-9206-bd29fa676593\") " Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.361269 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r72cn\" (UniqueName: \"kubernetes.io/projected/b62a85e7-9039-4129-9206-bd29fa676593-kube-api-access-r72cn\") pod \"b62a85e7-9039-4129-9206-bd29fa676593\" (UID: \"b62a85e7-9039-4129-9206-bd29fa676593\") " Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.361317 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b62a85e7-9039-4129-9206-bd29fa676593-ceilometer-tls-certs\") pod \"b62a85e7-9039-4129-9206-bd29fa676593\" (UID: \"b62a85e7-9039-4129-9206-bd29fa676593\") " Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.361359 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b62a85e7-9039-4129-9206-bd29fa676593-sg-core-conf-yaml\") pod \"b62a85e7-9039-4129-9206-bd29fa676593\" (UID: \"b62a85e7-9039-4129-9206-bd29fa676593\") " Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.361446 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b62a85e7-9039-4129-9206-bd29fa676593-log-httpd\") pod \"b62a85e7-9039-4129-9206-bd29fa676593\" (UID: \"b62a85e7-9039-4129-9206-bd29fa676593\") " Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.361509 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b62a85e7-9039-4129-9206-bd29fa676593-run-httpd\") pod \"b62a85e7-9039-4129-9206-bd29fa676593\" (UID: \"b62a85e7-9039-4129-9206-bd29fa676593\") " Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.361540 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b62a85e7-9039-4129-9206-bd29fa676593-config-data\") pod \"b62a85e7-9039-4129-9206-bd29fa676593\" (UID: \"b62a85e7-9039-4129-9206-bd29fa676593\") " Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.362014 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b62a85e7-9039-4129-9206-bd29fa676593-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b62a85e7-9039-4129-9206-bd29fa676593" (UID: "b62a85e7-9039-4129-9206-bd29fa676593"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.362244 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b62a85e7-9039-4129-9206-bd29fa676593-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b62a85e7-9039-4129-9206-bd29fa676593" (UID: "b62a85e7-9039-4129-9206-bd29fa676593"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.368682 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b62a85e7-9039-4129-9206-bd29fa676593-kube-api-access-r72cn" (OuterVolumeSpecName: "kube-api-access-r72cn") pod "b62a85e7-9039-4129-9206-bd29fa676593" (UID: "b62a85e7-9039-4129-9206-bd29fa676593"). InnerVolumeSpecName "kube-api-access-r72cn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.381532 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b62a85e7-9039-4129-9206-bd29fa676593-scripts" (OuterVolumeSpecName: "scripts") pod "b62a85e7-9039-4129-9206-bd29fa676593" (UID: "b62a85e7-9039-4129-9206-bd29fa676593"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.402940 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b62a85e7-9039-4129-9206-bd29fa676593-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b62a85e7-9039-4129-9206-bd29fa676593" (UID: "b62a85e7-9039-4129-9206-bd29fa676593"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.434997 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b62a85e7-9039-4129-9206-bd29fa676593-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "b62a85e7-9039-4129-9206-bd29fa676593" (UID: "b62a85e7-9039-4129-9206-bd29fa676593"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.455241 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.463458 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b62a85e7-9039-4129-9206-bd29fa676593-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.463491 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r72cn\" (UniqueName: \"kubernetes.io/projected/b62a85e7-9039-4129-9206-bd29fa676593-kube-api-access-r72cn\") on node \"crc\" DevicePath \"\"" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.463505 4901 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b62a85e7-9039-4129-9206-bd29fa676593-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.463520 4901 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b62a85e7-9039-4129-9206-bd29fa676593-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.463531 4901 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b62a85e7-9039-4129-9206-bd29fa676593-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.463546 4901 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b62a85e7-9039-4129-9206-bd29fa676593-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.464311 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b62a85e7-9039-4129-9206-bd29fa676593-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b62a85e7-9039-4129-9206-bd29fa676593" (UID: "b62a85e7-9039-4129-9206-bd29fa676593"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.495938 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b62a85e7-9039-4129-9206-bd29fa676593-config-data" (OuterVolumeSpecName: "config-data") pod "b62a85e7-9039-4129-9206-bd29fa676593" (UID: "b62a85e7-9039-4129-9206-bd29fa676593"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.564459 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d26f81f8-fdb8-492f-b692-52404df314ec-config-data\") pod \"d26f81f8-fdb8-492f-b692-52404df314ec\" (UID: \"d26f81f8-fdb8-492f-b692-52404df314ec\") " Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.564650 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d26f81f8-fdb8-492f-b692-52404df314ec-logs\") pod \"d26f81f8-fdb8-492f-b692-52404df314ec\" (UID: \"d26f81f8-fdb8-492f-b692-52404df314ec\") " Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.564702 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbxxp\" (UniqueName: \"kubernetes.io/projected/d26f81f8-fdb8-492f-b692-52404df314ec-kube-api-access-cbxxp\") pod \"d26f81f8-fdb8-492f-b692-52404df314ec\" (UID: \"d26f81f8-fdb8-492f-b692-52404df314ec\") " Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.564731 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d26f81f8-fdb8-492f-b692-52404df314ec-combined-ca-bundle\") pod \"d26f81f8-fdb8-492f-b692-52404df314ec\" (UID: \"d26f81f8-fdb8-492f-b692-52404df314ec\") " Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.565100 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d26f81f8-fdb8-492f-b692-52404df314ec-logs" (OuterVolumeSpecName: "logs") pod "d26f81f8-fdb8-492f-b692-52404df314ec" (UID: "d26f81f8-fdb8-492f-b692-52404df314ec"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.565169 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b62a85e7-9039-4129-9206-bd29fa676593-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.565188 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b62a85e7-9039-4129-9206-bd29fa676593-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.568586 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d26f81f8-fdb8-492f-b692-52404df314ec-kube-api-access-cbxxp" (OuterVolumeSpecName: "kube-api-access-cbxxp") pod "d26f81f8-fdb8-492f-b692-52404df314ec" (UID: "d26f81f8-fdb8-492f-b692-52404df314ec"). InnerVolumeSpecName "kube-api-access-cbxxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.596914 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d26f81f8-fdb8-492f-b692-52404df314ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d26f81f8-fdb8-492f-b692-52404df314ec" (UID: "d26f81f8-fdb8-492f-b692-52404df314ec"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.598805 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d26f81f8-fdb8-492f-b692-52404df314ec-config-data" (OuterVolumeSpecName: "config-data") pod "d26f81f8-fdb8-492f-b692-52404df314ec" (UID: "d26f81f8-fdb8-492f-b692-52404df314ec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.657891 4901 generic.go:334] "Generic (PLEG): container finished" podID="b62a85e7-9039-4129-9206-bd29fa676593" containerID="c4967bb6c487c173b8622da01d79622af7ea7040e44dd4fcad5652f4c52dd5d4" exitCode=0 Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.657954 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b62a85e7-9039-4129-9206-bd29fa676593","Type":"ContainerDied","Data":"c4967bb6c487c173b8622da01d79622af7ea7040e44dd4fcad5652f4c52dd5d4"} Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.657995 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b62a85e7-9039-4129-9206-bd29fa676593","Type":"ContainerDied","Data":"b69bbba24d50549490df2c0ea24cf800ace0278f2d00b96ea99ede0fcd8307d6"} Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.657994 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.658014 4901 scope.go:117] "RemoveContainer" containerID="c74f66946090cebe8b9a5f0e78b176b3ff59666fa0d01cf7b1e3b88de4d7c7d1" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.660315 4901 generic.go:334] "Generic (PLEG): container finished" podID="d26f81f8-fdb8-492f-b692-52404df314ec" containerID="9637badc9b1ea0eae14da6646b53c6d4785258b8e2c62916c9eff0a0bb0521a9" exitCode=0 Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.660355 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d26f81f8-fdb8-492f-b692-52404df314ec","Type":"ContainerDied","Data":"9637badc9b1ea0eae14da6646b53c6d4785258b8e2c62916c9eff0a0bb0521a9"} Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.660388 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.660408 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d26f81f8-fdb8-492f-b692-52404df314ec","Type":"ContainerDied","Data":"a95cf366e5839804dabf598425fcaccee520ebfff51df12b322530057524c8d9"} Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.666414 4901 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d26f81f8-fdb8-492f-b692-52404df314ec-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.666438 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbxxp\" (UniqueName: \"kubernetes.io/projected/d26f81f8-fdb8-492f-b692-52404df314ec-kube-api-access-cbxxp\") on node \"crc\" DevicePath \"\"" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.666449 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d26f81f8-fdb8-492f-b692-52404df314ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.666458 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d26f81f8-fdb8-492f-b692-52404df314ec-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.680947 4901 scope.go:117] "RemoveContainer" containerID="cfd206826f515d93b3a1b8aa85f7d2361746d202a6de6fffa7a564807b09557e" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.702097 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.712360 4901 scope.go:117] "RemoveContainer" containerID="c4967bb6c487c173b8622da01d79622af7ea7040e44dd4fcad5652f4c52dd5d4" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.718254 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.739258 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.742834 4901 scope.go:117] "RemoveContainer" containerID="4998162e8592344ae0da57ef238e10726fc3d65e8f3f9407a44e3b8181c3ddf5" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.749804 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.759494 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:59:19 crc kubenswrapper[4901]: E0202 10:59:19.760194 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d26f81f8-fdb8-492f-b692-52404df314ec" containerName="nova-api-api" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.760217 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="d26f81f8-fdb8-492f-b692-52404df314ec" containerName="nova-api-api" Feb 02 10:59:19 crc kubenswrapper[4901]: E0202 10:59:19.760237 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b62a85e7-9039-4129-9206-bd29fa676593" containerName="ceilometer-central-agent" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.760244 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="b62a85e7-9039-4129-9206-bd29fa676593" containerName="ceilometer-central-agent" Feb 02 10:59:19 crc kubenswrapper[4901]: E0202 10:59:19.760264 4901 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b62a85e7-9039-4129-9206-bd29fa676593" containerName="sg-core" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.760284 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="b62a85e7-9039-4129-9206-bd29fa676593" containerName="sg-core" Feb 02 10:59:19 crc kubenswrapper[4901]: E0202 10:59:19.760293 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b62a85e7-9039-4129-9206-bd29fa676593" containerName="ceilometer-notification-agent" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.760301 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="b62a85e7-9039-4129-9206-bd29fa676593" containerName="ceilometer-notification-agent" Feb 02 10:59:19 crc kubenswrapper[4901]: E0202 10:59:19.760314 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d26f81f8-fdb8-492f-b692-52404df314ec" containerName="nova-api-log" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.760320 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="d26f81f8-fdb8-492f-b692-52404df314ec" containerName="nova-api-log" Feb 02 10:59:19 crc kubenswrapper[4901]: E0202 10:59:19.760333 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b62a85e7-9039-4129-9206-bd29fa676593" containerName="proxy-httpd" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.760339 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="b62a85e7-9039-4129-9206-bd29fa676593" containerName="proxy-httpd" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.760517 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="d26f81f8-fdb8-492f-b692-52404df314ec" containerName="nova-api-api" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.760528 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="d26f81f8-fdb8-492f-b692-52404df314ec" containerName="nova-api-log" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.760539 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="b62a85e7-9039-4129-9206-bd29fa676593" containerName="ceilometer-central-agent" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.760549 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="b62a85e7-9039-4129-9206-bd29fa676593" containerName="ceilometer-notification-agent" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.760586 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="b62a85e7-9039-4129-9206-bd29fa676593" containerName="proxy-httpd" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.760600 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="b62a85e7-9039-4129-9206-bd29fa676593" containerName="sg-core" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.762680 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.765036 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.765221 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.765416 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.772850 4901 scope.go:117] "RemoveContainer" containerID="c74f66946090cebe8b9a5f0e78b176b3ff59666fa0d01cf7b1e3b88de4d7c7d1" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.772967 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 02 10:59:19 crc kubenswrapper[4901]: E0202 10:59:19.775132 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c74f66946090cebe8b9a5f0e78b176b3ff59666fa0d01cf7b1e3b88de4d7c7d1\": container with ID starting with c74f66946090cebe8b9a5f0e78b176b3ff59666fa0d01cf7b1e3b88de4d7c7d1 not found: ID does not exist" containerID="c74f66946090cebe8b9a5f0e78b176b3ff59666fa0d01cf7b1e3b88de4d7c7d1" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.775166 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c74f66946090cebe8b9a5f0e78b176b3ff59666fa0d01cf7b1e3b88de4d7c7d1"} err="failed to get container status \"c74f66946090cebe8b9a5f0e78b176b3ff59666fa0d01cf7b1e3b88de4d7c7d1\": rpc error: code = NotFound desc = could not find container \"c74f66946090cebe8b9a5f0e78b176b3ff59666fa0d01cf7b1e3b88de4d7c7d1\": container with ID starting with c74f66946090cebe8b9a5f0e78b176b3ff59666fa0d01cf7b1e3b88de4d7c7d1 not found: ID does not exist" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.775191 4901 scope.go:117] "RemoveContainer" containerID="cfd206826f515d93b3a1b8aa85f7d2361746d202a6de6fffa7a564807b09557e" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.775826 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 02 10:59:19 crc kubenswrapper[4901]: E0202 10:59:19.776406 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfd206826f515d93b3a1b8aa85f7d2361746d202a6de6fffa7a564807b09557e\": container with ID starting with cfd206826f515d93b3a1b8aa85f7d2361746d202a6de6fffa7a564807b09557e not found: ID does not exist" containerID="cfd206826f515d93b3a1b8aa85f7d2361746d202a6de6fffa7a564807b09557e" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.776433 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfd206826f515d93b3a1b8aa85f7d2361746d202a6de6fffa7a564807b09557e"} err="failed to get container status \"cfd206826f515d93b3a1b8aa85f7d2361746d202a6de6fffa7a564807b09557e\": rpc error: code = NotFound desc = could not find container \"cfd206826f515d93b3a1b8aa85f7d2361746d202a6de6fffa7a564807b09557e\": container with ID starting with cfd206826f515d93b3a1b8aa85f7d2361746d202a6de6fffa7a564807b09557e not found: ID does not exist" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.776447 4901 scope.go:117] "RemoveContainer" containerID="c4967bb6c487c173b8622da01d79622af7ea7040e44dd4fcad5652f4c52dd5d4" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.778446 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.778600 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.778707 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 02 10:59:19 crc kubenswrapper[4901]: E0202 10:59:19.778821 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4967bb6c487c173b8622da01d79622af7ea7040e44dd4fcad5652f4c52dd5d4\": container with ID starting with c4967bb6c487c173b8622da01d79622af7ea7040e44dd4fcad5652f4c52dd5d4 not found: ID does not exist" containerID="c4967bb6c487c173b8622da01d79622af7ea7040e44dd4fcad5652f4c52dd5d4" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.778839 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4967bb6c487c173b8622da01d79622af7ea7040e44dd4fcad5652f4c52dd5d4"} err="failed to get container status \"c4967bb6c487c173b8622da01d79622af7ea7040e44dd4fcad5652f4c52dd5d4\": rpc error: code = NotFound desc = could not find container \"c4967bb6c487c173b8622da01d79622af7ea7040e44dd4fcad5652f4c52dd5d4\": container with ID starting with c4967bb6c487c173b8622da01d79622af7ea7040e44dd4fcad5652f4c52dd5d4 not found: ID does not exist" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.778852 4901 scope.go:117] "RemoveContainer" containerID="4998162e8592344ae0da57ef238e10726fc3d65e8f3f9407a44e3b8181c3ddf5" Feb 02 10:59:19 crc kubenswrapper[4901]: E0202 10:59:19.779078 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4998162e8592344ae0da57ef238e10726fc3d65e8f3f9407a44e3b8181c3ddf5\": container with ID starting with 4998162e8592344ae0da57ef238e10726fc3d65e8f3f9407a44e3b8181c3ddf5 not found: ID does not exist" containerID="4998162e8592344ae0da57ef238e10726fc3d65e8f3f9407a44e3b8181c3ddf5" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.779092 
4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4998162e8592344ae0da57ef238e10726fc3d65e8f3f9407a44e3b8181c3ddf5"} err="failed to get container status \"4998162e8592344ae0da57ef238e10726fc3d65e8f3f9407a44e3b8181c3ddf5\": rpc error: code = NotFound desc = could not find container \"4998162e8592344ae0da57ef238e10726fc3d65e8f3f9407a44e3b8181c3ddf5\": container with ID starting with 4998162e8592344ae0da57ef238e10726fc3d65e8f3f9407a44e3b8181c3ddf5 not found: ID does not exist" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.779103 4901 scope.go:117] "RemoveContainer" containerID="9637badc9b1ea0eae14da6646b53c6d4785258b8e2c62916c9eff0a0bb0521a9" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.785928 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.794263 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.846460 4901 scope.go:117] "RemoveContainer" containerID="457df3a655920aba5010d7d517be1313c09d9a66c66929f55012617b1d98eed5" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.871410 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsbqf\" (UniqueName: \"kubernetes.io/projected/2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b-kube-api-access-gsbqf\") pod \"ceilometer-0\" (UID: \"2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b\") " pod="openstack/ceilometer-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.871495 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8eb5214-26b8-472b-be31-2604d1f0a7a0-logs\") pod \"nova-api-0\" (UID: \"c8eb5214-26b8-472b-be31-2604d1f0a7a0\") " pod="openstack/nova-api-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.871539 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8eb5214-26b8-472b-be31-2604d1f0a7a0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c8eb5214-26b8-472b-be31-2604d1f0a7a0\") " pod="openstack/nova-api-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.871584 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b\") " pod="openstack/ceilometer-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.871656 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b\") " pod="openstack/ceilometer-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.871677 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8eb5214-26b8-472b-be31-2604d1f0a7a0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c8eb5214-26b8-472b-be31-2604d1f0a7a0\") " pod="openstack/nova-api-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.871730 4901 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b-scripts\") pod \"ceilometer-0\" (UID: \"2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b\") " pod="openstack/ceilometer-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.871764 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpxxv\" (UniqueName: \"kubernetes.io/projected/c8eb5214-26b8-472b-be31-2604d1f0a7a0-kube-api-access-bpxxv\") pod \"nova-api-0\" (UID: \"c8eb5214-26b8-472b-be31-2604d1f0a7a0\") " pod="openstack/nova-api-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.871790 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8eb5214-26b8-472b-be31-2604d1f0a7a0-config-data\") pod \"nova-api-0\" (UID: \"c8eb5214-26b8-472b-be31-2604d1f0a7a0\") " pod="openstack/nova-api-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.871815 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8eb5214-26b8-472b-be31-2604d1f0a7a0-public-tls-certs\") pod \"nova-api-0\" (UID: \"c8eb5214-26b8-472b-be31-2604d1f0a7a0\") " pod="openstack/nova-api-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.871861 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b-config-data\") pod \"ceilometer-0\" (UID: \"2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b\") " pod="openstack/ceilometer-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.871883 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b-run-httpd\") pod \"ceilometer-0\" (UID: \"2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b\") " pod="openstack/ceilometer-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.871924 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b\") " pod="openstack/ceilometer-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.871974 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b-log-httpd\") pod \"ceilometer-0\" (UID: \"2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b\") " pod="openstack/ceilometer-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.878276 4901 scope.go:117] "RemoveContainer" containerID="9637badc9b1ea0eae14da6646b53c6d4785258b8e2c62916c9eff0a0bb0521a9" Feb 02 10:59:19 crc kubenswrapper[4901]: E0202 10:59:19.879060 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9637badc9b1ea0eae14da6646b53c6d4785258b8e2c62916c9eff0a0bb0521a9\": container with ID starting with 9637badc9b1ea0eae14da6646b53c6d4785258b8e2c62916c9eff0a0bb0521a9 not found: ID does not exist" containerID="9637badc9b1ea0eae14da6646b53c6d4785258b8e2c62916c9eff0a0bb0521a9" Feb 02 10:59:19 crc kubenswrapper[4901]: 
I0202 10:59:19.879105 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9637badc9b1ea0eae14da6646b53c6d4785258b8e2c62916c9eff0a0bb0521a9"} err="failed to get container status \"9637badc9b1ea0eae14da6646b53c6d4785258b8e2c62916c9eff0a0bb0521a9\": rpc error: code = NotFound desc = could not find container \"9637badc9b1ea0eae14da6646b53c6d4785258b8e2c62916c9eff0a0bb0521a9\": container with ID starting with 9637badc9b1ea0eae14da6646b53c6d4785258b8e2c62916c9eff0a0bb0521a9 not found: ID does not exist" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.879133 4901 scope.go:117] "RemoveContainer" containerID="457df3a655920aba5010d7d517be1313c09d9a66c66929f55012617b1d98eed5" Feb 02 10:59:19 crc kubenswrapper[4901]: E0202 10:59:19.879708 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"457df3a655920aba5010d7d517be1313c09d9a66c66929f55012617b1d98eed5\": container with ID starting with 457df3a655920aba5010d7d517be1313c09d9a66c66929f55012617b1d98eed5 not found: ID does not exist" containerID="457df3a655920aba5010d7d517be1313c09d9a66c66929f55012617b1d98eed5" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.879741 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"457df3a655920aba5010d7d517be1313c09d9a66c66929f55012617b1d98eed5"} err="failed to get container status \"457df3a655920aba5010d7d517be1313c09d9a66c66929f55012617b1d98eed5\": rpc error: code = NotFound desc = could not find container \"457df3a655920aba5010d7d517be1313c09d9a66c66929f55012617b1d98eed5\": container with ID starting with 457df3a655920aba5010d7d517be1313c09d9a66c66929f55012617b1d98eed5 not found: ID does not exist" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.973689 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpxxv\" (UniqueName: \"kubernetes.io/projected/c8eb5214-26b8-472b-be31-2604d1f0a7a0-kube-api-access-bpxxv\") pod \"nova-api-0\" (UID: \"c8eb5214-26b8-472b-be31-2604d1f0a7a0\") " pod="openstack/nova-api-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.973746 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8eb5214-26b8-472b-be31-2604d1f0a7a0-config-data\") pod \"nova-api-0\" (UID: \"c8eb5214-26b8-472b-be31-2604d1f0a7a0\") " pod="openstack/nova-api-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.973767 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8eb5214-26b8-472b-be31-2604d1f0a7a0-public-tls-certs\") pod \"nova-api-0\" (UID: \"c8eb5214-26b8-472b-be31-2604d1f0a7a0\") " pod="openstack/nova-api-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.973808 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b-config-data\") pod \"ceilometer-0\" (UID: \"2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b\") " pod="openstack/ceilometer-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.973829 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b-run-httpd\") pod \"ceilometer-0\" (UID: \"2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b\") " pod="openstack/ceilometer-0" Feb 
02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.973864 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b\") " pod="openstack/ceilometer-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.973887 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b-log-httpd\") pod \"ceilometer-0\" (UID: \"2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b\") " pod="openstack/ceilometer-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.973930 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsbqf\" (UniqueName: \"kubernetes.io/projected/2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b-kube-api-access-gsbqf\") pod \"ceilometer-0\" (UID: \"2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b\") " pod="openstack/ceilometer-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.974536 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b-run-httpd\") pod \"ceilometer-0\" (UID: \"2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b\") " pod="openstack/ceilometer-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.974690 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b-log-httpd\") pod \"ceilometer-0\" (UID: \"2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b\") " pod="openstack/ceilometer-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.974873 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8eb5214-26b8-472b-be31-2604d1f0a7a0-logs\") pod \"nova-api-0\" (UID: \"c8eb5214-26b8-472b-be31-2604d1f0a7a0\") " pod="openstack/nova-api-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.974929 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8eb5214-26b8-472b-be31-2604d1f0a7a0-logs\") pod \"nova-api-0\" (UID: \"c8eb5214-26b8-472b-be31-2604d1f0a7a0\") " pod="openstack/nova-api-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.974972 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8eb5214-26b8-472b-be31-2604d1f0a7a0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c8eb5214-26b8-472b-be31-2604d1f0a7a0\") " pod="openstack/nova-api-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.974995 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b\") " pod="openstack/ceilometer-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.975346 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8eb5214-26b8-472b-be31-2604d1f0a7a0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c8eb5214-26b8-472b-be31-2604d1f0a7a0\") " pod="openstack/nova-api-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 
10:59:19.975372 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b\") " pod="openstack/ceilometer-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.975413 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b-scripts\") pod \"ceilometer-0\" (UID: \"2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b\") " pod="openstack/ceilometer-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.983200 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b-scripts\") pod \"ceilometer-0\" (UID: \"2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b\") " pod="openstack/ceilometer-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.985194 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b\") " pod="openstack/ceilometer-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.985462 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b-config-data\") pod \"ceilometer-0\" (UID: \"2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b\") " pod="openstack/ceilometer-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.990626 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8eb5214-26b8-472b-be31-2604d1f0a7a0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c8eb5214-26b8-472b-be31-2604d1f0a7a0\") " pod="openstack/nova-api-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.994629 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8eb5214-26b8-472b-be31-2604d1f0a7a0-public-tls-certs\") pod \"nova-api-0\" (UID: \"c8eb5214-26b8-472b-be31-2604d1f0a7a0\") " pod="openstack/nova-api-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.995256 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b\") " pod="openstack/ceilometer-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.995863 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b\") " pod="openstack/ceilometer-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.997840 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8eb5214-26b8-472b-be31-2604d1f0a7a0-config-data\") pod \"nova-api-0\" (UID: \"c8eb5214-26b8-472b-be31-2604d1f0a7a0\") " pod="openstack/nova-api-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.998398 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-gsbqf\" (UniqueName: \"kubernetes.io/projected/2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b-kube-api-access-gsbqf\") pod \"ceilometer-0\" (UID: \"2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b\") " pod="openstack/ceilometer-0" Feb 02 10:59:19 crc kubenswrapper[4901]: I0202 10:59:19.999952 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8eb5214-26b8-472b-be31-2604d1f0a7a0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c8eb5214-26b8-472b-be31-2604d1f0a7a0\") " pod="openstack/nova-api-0" Feb 02 10:59:20 crc kubenswrapper[4901]: I0202 10:59:20.002645 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpxxv\" (UniqueName: \"kubernetes.io/projected/c8eb5214-26b8-472b-be31-2604d1f0a7a0-kube-api-access-bpxxv\") pod \"nova-api-0\" (UID: \"c8eb5214-26b8-472b-be31-2604d1f0a7a0\") " pod="openstack/nova-api-0" Feb 02 10:59:20 crc kubenswrapper[4901]: I0202 10:59:20.143773 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:59:20 crc kubenswrapper[4901]: I0202 10:59:20.153747 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 02 10:59:20 crc kubenswrapper[4901]: I0202 10:59:20.636525 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:59:20 crc kubenswrapper[4901]: I0202 10:59:20.638196 4901 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 10:59:20 crc kubenswrapper[4901]: I0202 10:59:20.678106 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b","Type":"ContainerStarted","Data":"8974e6ba0844ce6a22395256cf77146c74a6548729821291a01f19f6a4bcbfad"} Feb 02 10:59:20 crc kubenswrapper[4901]: W0202 10:59:20.746236 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8eb5214_26b8_472b_be31_2604d1f0a7a0.slice/crio-5f74f7348c9557ddbbafe8f8460e4c009ce19e99dc861821124f910dbb04582c WatchSource:0}: Error finding container 5f74f7348c9557ddbbafe8f8460e4c009ce19e99dc861821124f910dbb04582c: Status 404 returned error can't find the container with id 5f74f7348c9557ddbbafe8f8460e4c009ce19e99dc861821124f910dbb04582c Feb 02 10:59:20 crc kubenswrapper[4901]: I0202 10:59:20.757503 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:59:21 crc kubenswrapper[4901]: I0202 10:59:21.694802 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b62a85e7-9039-4129-9206-bd29fa676593" path="/var/lib/kubelet/pods/b62a85e7-9039-4129-9206-bd29fa676593/volumes" Feb 02 10:59:21 crc kubenswrapper[4901]: I0202 10:59:21.696713 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d26f81f8-fdb8-492f-b692-52404df314ec" path="/var/lib/kubelet/pods/d26f81f8-fdb8-492f-b692-52404df314ec/volumes" Feb 02 10:59:21 crc kubenswrapper[4901]: I0202 10:59:21.697559 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b","Type":"ContainerStarted","Data":"5c689cd749412cd731081351601f8d5947a5766ec04e128d04a485f0301e6ea1"} Feb 02 10:59:21 crc kubenswrapper[4901]: I0202 10:59:21.697709 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"c8eb5214-26b8-472b-be31-2604d1f0a7a0","Type":"ContainerStarted","Data":"31d6ad075c89d67ba4472968a89231bf702b9b96399061ae9361d92e09272df9"} Feb 02 10:59:21 crc kubenswrapper[4901]: I0202 10:59:21.697767 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c8eb5214-26b8-472b-be31-2604d1f0a7a0","Type":"ContainerStarted","Data":"6a312e1ca26e3585548a8702c20996b52c437d7f455f3cd2e8431b7e2c21ebb9"} Feb 02 10:59:21 crc kubenswrapper[4901]: I0202 10:59:21.697782 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c8eb5214-26b8-472b-be31-2604d1f0a7a0","Type":"ContainerStarted","Data":"5f74f7348c9557ddbbafe8f8460e4c009ce19e99dc861821124f910dbb04582c"} Feb 02 10:59:21 crc kubenswrapper[4901]: I0202 10:59:21.948501 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:59:21 crc kubenswrapper[4901]: I0202 10:59:21.972299 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:59:22 crc kubenswrapper[4901]: I0202 10:59:22.006408 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.006374831 podStartE2EDuration="3.006374831s" podCreationTimestamp="2026-02-02 10:59:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:59:21.715943511 +0000 UTC m=+1248.734283617" watchObservedRunningTime="2026-02-02 10:59:22.006374831 +0000 UTC m=+1249.024714937" Feb 02 10:59:22 crc kubenswrapper[4901]: I0202 10:59:22.715246 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b","Type":"ContainerStarted","Data":"c366649931fa18535bad027e409a7bbfe4ca3e7d7c638099971e9a81dd674d67"} Feb 02 10:59:22 crc kubenswrapper[4901]: I0202 10:59:22.735477 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:59:22 crc kubenswrapper[4901]: I0202 10:59:22.939045 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-grf2k"] Feb 02 10:59:22 crc kubenswrapper[4901]: I0202 10:59:22.940874 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-grf2k" Feb 02 10:59:22 crc kubenswrapper[4901]: I0202 10:59:22.945843 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 02 10:59:22 crc kubenswrapper[4901]: I0202 10:59:22.946055 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 02 10:59:22 crc kubenswrapper[4901]: I0202 10:59:22.953303 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-grf2k"] Feb 02 10:59:23 crc kubenswrapper[4901]: I0202 10:59:23.061353 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a2b10de-90f9-49b1-aa3e-92508a93e8ee-scripts\") pod \"nova-cell1-cell-mapping-grf2k\" (UID: \"9a2b10de-90f9-49b1-aa3e-92508a93e8ee\") " pod="openstack/nova-cell1-cell-mapping-grf2k" Feb 02 10:59:23 crc kubenswrapper[4901]: I0202 10:59:23.061424 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a2b10de-90f9-49b1-aa3e-92508a93e8ee-config-data\") pod \"nova-cell1-cell-mapping-grf2k\" (UID: \"9a2b10de-90f9-49b1-aa3e-92508a93e8ee\") " pod="openstack/nova-cell1-cell-mapping-grf2k" Feb 02 10:59:23 crc kubenswrapper[4901]: I0202 10:59:23.061890 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcws4\" (UniqueName: \"kubernetes.io/projected/9a2b10de-90f9-49b1-aa3e-92508a93e8ee-kube-api-access-kcws4\") pod \"nova-cell1-cell-mapping-grf2k\" (UID: \"9a2b10de-90f9-49b1-aa3e-92508a93e8ee\") " pod="openstack/nova-cell1-cell-mapping-grf2k" Feb 02 10:59:23 crc kubenswrapper[4901]: I0202 10:59:23.062100 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a2b10de-90f9-49b1-aa3e-92508a93e8ee-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-grf2k\" (UID: \"9a2b10de-90f9-49b1-aa3e-92508a93e8ee\") " pod="openstack/nova-cell1-cell-mapping-grf2k" Feb 02 10:59:23 crc kubenswrapper[4901]: I0202 10:59:23.164946 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a2b10de-90f9-49b1-aa3e-92508a93e8ee-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-grf2k\" (UID: \"9a2b10de-90f9-49b1-aa3e-92508a93e8ee\") " pod="openstack/nova-cell1-cell-mapping-grf2k" Feb 02 10:59:23 crc kubenswrapper[4901]: I0202 10:59:23.165045 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a2b10de-90f9-49b1-aa3e-92508a93e8ee-scripts\") pod \"nova-cell1-cell-mapping-grf2k\" (UID: \"9a2b10de-90f9-49b1-aa3e-92508a93e8ee\") " pod="openstack/nova-cell1-cell-mapping-grf2k" Feb 02 10:59:23 crc kubenswrapper[4901]: I0202 10:59:23.165102 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a2b10de-90f9-49b1-aa3e-92508a93e8ee-config-data\") pod \"nova-cell1-cell-mapping-grf2k\" (UID: \"9a2b10de-90f9-49b1-aa3e-92508a93e8ee\") " pod="openstack/nova-cell1-cell-mapping-grf2k" Feb 02 10:59:23 crc kubenswrapper[4901]: I0202 10:59:23.165218 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcws4\" (UniqueName: 
\"kubernetes.io/projected/9a2b10de-90f9-49b1-aa3e-92508a93e8ee-kube-api-access-kcws4\") pod \"nova-cell1-cell-mapping-grf2k\" (UID: \"9a2b10de-90f9-49b1-aa3e-92508a93e8ee\") " pod="openstack/nova-cell1-cell-mapping-grf2k" Feb 02 10:59:23 crc kubenswrapper[4901]: I0202 10:59:23.174431 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a2b10de-90f9-49b1-aa3e-92508a93e8ee-scripts\") pod \"nova-cell1-cell-mapping-grf2k\" (UID: \"9a2b10de-90f9-49b1-aa3e-92508a93e8ee\") " pod="openstack/nova-cell1-cell-mapping-grf2k" Feb 02 10:59:23 crc kubenswrapper[4901]: I0202 10:59:23.174755 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a2b10de-90f9-49b1-aa3e-92508a93e8ee-config-data\") pod \"nova-cell1-cell-mapping-grf2k\" (UID: \"9a2b10de-90f9-49b1-aa3e-92508a93e8ee\") " pod="openstack/nova-cell1-cell-mapping-grf2k" Feb 02 10:59:23 crc kubenswrapper[4901]: I0202 10:59:23.183361 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a2b10de-90f9-49b1-aa3e-92508a93e8ee-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-grf2k\" (UID: \"9a2b10de-90f9-49b1-aa3e-92508a93e8ee\") " pod="openstack/nova-cell1-cell-mapping-grf2k" Feb 02 10:59:23 crc kubenswrapper[4901]: I0202 10:59:23.184849 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcws4\" (UniqueName: \"kubernetes.io/projected/9a2b10de-90f9-49b1-aa3e-92508a93e8ee-kube-api-access-kcws4\") pod \"nova-cell1-cell-mapping-grf2k\" (UID: \"9a2b10de-90f9-49b1-aa3e-92508a93e8ee\") " pod="openstack/nova-cell1-cell-mapping-grf2k" Feb 02 10:59:23 crc kubenswrapper[4901]: I0202 10:59:23.261869 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-grf2k" Feb 02 10:59:23 crc kubenswrapper[4901]: I0202 10:59:23.580777 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-f84f9ccf-8p6jl" Feb 02 10:59:23 crc kubenswrapper[4901]: I0202 10:59:23.661264 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-7x2x6"] Feb 02 10:59:23 crc kubenswrapper[4901]: I0202 10:59:23.661632 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-568d7fd7cf-7x2x6" podUID="6a9df7ea-8e3d-4c6e-a86d-793168703b18" containerName="dnsmasq-dns" containerID="cri-o://6dc6444cbac5287705ac49e8acdb1327d0b1da8159860944a1d9947205f36de2" gracePeriod=10 Feb 02 10:59:23 crc kubenswrapper[4901]: I0202 10:59:23.746862 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b","Type":"ContainerStarted","Data":"19ab3ccde9ea3fac6a1a3cdef5bf41a73cebc3cc88b27a9abf8145f01d5e51b0"} Feb 02 10:59:23 crc kubenswrapper[4901]: W0202 10:59:23.794481 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a2b10de_90f9_49b1_aa3e_92508a93e8ee.slice/crio-32301705cc482d23f9494224cd5741c254f15a6d8ee2cfb5653646afe34b7bea WatchSource:0}: Error finding container 32301705cc482d23f9494224cd5741c254f15a6d8ee2cfb5653646afe34b7bea: Status 404 returned error can't find the container with id 32301705cc482d23f9494224cd5741c254f15a6d8ee2cfb5653646afe34b7bea Feb 02 10:59:23 crc kubenswrapper[4901]: I0202 10:59:23.802781 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-grf2k"] Feb 02 10:59:24 crc kubenswrapper[4901]: I0202 10:59:24.182497 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-568d7fd7cf-7x2x6" Feb 02 10:59:24 crc kubenswrapper[4901]: I0202 10:59:24.310398 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a9df7ea-8e3d-4c6e-a86d-793168703b18-config\") pod \"6a9df7ea-8e3d-4c6e-a86d-793168703b18\" (UID: \"6a9df7ea-8e3d-4c6e-a86d-793168703b18\") " Feb 02 10:59:24 crc kubenswrapper[4901]: I0202 10:59:24.310632 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a9df7ea-8e3d-4c6e-a86d-793168703b18-ovsdbserver-sb\") pod \"6a9df7ea-8e3d-4c6e-a86d-793168703b18\" (UID: \"6a9df7ea-8e3d-4c6e-a86d-793168703b18\") " Feb 02 10:59:24 crc kubenswrapper[4901]: I0202 10:59:24.310735 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a9df7ea-8e3d-4c6e-a86d-793168703b18-dns-svc\") pod \"6a9df7ea-8e3d-4c6e-a86d-793168703b18\" (UID: \"6a9df7ea-8e3d-4c6e-a86d-793168703b18\") " Feb 02 10:59:24 crc kubenswrapper[4901]: I0202 10:59:24.310824 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6a9df7ea-8e3d-4c6e-a86d-793168703b18-dns-swift-storage-0\") pod \"6a9df7ea-8e3d-4c6e-a86d-793168703b18\" (UID: \"6a9df7ea-8e3d-4c6e-a86d-793168703b18\") " Feb 02 10:59:24 crc kubenswrapper[4901]: I0202 10:59:24.311056 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wszll\" (UniqueName: \"kubernetes.io/projected/6a9df7ea-8e3d-4c6e-a86d-793168703b18-kube-api-access-wszll\") pod \"6a9df7ea-8e3d-4c6e-a86d-793168703b18\" (UID: \"6a9df7ea-8e3d-4c6e-a86d-793168703b18\") " Feb 02 10:59:24 crc kubenswrapper[4901]: I0202 10:59:24.311186 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a9df7ea-8e3d-4c6e-a86d-793168703b18-ovsdbserver-nb\") pod \"6a9df7ea-8e3d-4c6e-a86d-793168703b18\" (UID: \"6a9df7ea-8e3d-4c6e-a86d-793168703b18\") " Feb 02 10:59:24 crc kubenswrapper[4901]: I0202 10:59:24.324256 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a9df7ea-8e3d-4c6e-a86d-793168703b18-kube-api-access-wszll" (OuterVolumeSpecName: "kube-api-access-wszll") pod "6a9df7ea-8e3d-4c6e-a86d-793168703b18" (UID: "6a9df7ea-8e3d-4c6e-a86d-793168703b18"). InnerVolumeSpecName "kube-api-access-wszll". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:59:24 crc kubenswrapper[4901]: I0202 10:59:24.374164 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a9df7ea-8e3d-4c6e-a86d-793168703b18-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6a9df7ea-8e3d-4c6e-a86d-793168703b18" (UID: "6a9df7ea-8e3d-4c6e-a86d-793168703b18"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:59:24 crc kubenswrapper[4901]: I0202 10:59:24.377766 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a9df7ea-8e3d-4c6e-a86d-793168703b18-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6a9df7ea-8e3d-4c6e-a86d-793168703b18" (UID: "6a9df7ea-8e3d-4c6e-a86d-793168703b18"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:59:24 crc kubenswrapper[4901]: I0202 10:59:24.385262 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a9df7ea-8e3d-4c6e-a86d-793168703b18-config" (OuterVolumeSpecName: "config") pod "6a9df7ea-8e3d-4c6e-a86d-793168703b18" (UID: "6a9df7ea-8e3d-4c6e-a86d-793168703b18"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:59:24 crc kubenswrapper[4901]: I0202 10:59:24.392216 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a9df7ea-8e3d-4c6e-a86d-793168703b18-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6a9df7ea-8e3d-4c6e-a86d-793168703b18" (UID: "6a9df7ea-8e3d-4c6e-a86d-793168703b18"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:59:24 crc kubenswrapper[4901]: I0202 10:59:24.393549 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a9df7ea-8e3d-4c6e-a86d-793168703b18-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6a9df7ea-8e3d-4c6e-a86d-793168703b18" (UID: "6a9df7ea-8e3d-4c6e-a86d-793168703b18"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:59:24 crc kubenswrapper[4901]: I0202 10:59:24.413497 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a9df7ea-8e3d-4c6e-a86d-793168703b18-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 10:59:24 crc kubenswrapper[4901]: I0202 10:59:24.413538 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a9df7ea-8e3d-4c6e-a86d-793168703b18-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:59:24 crc kubenswrapper[4901]: I0202 10:59:24.413550 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a9df7ea-8e3d-4c6e-a86d-793168703b18-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 10:59:24 crc kubenswrapper[4901]: I0202 10:59:24.413580 4901 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a9df7ea-8e3d-4c6e-a86d-793168703b18-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 10:59:24 crc kubenswrapper[4901]: I0202 10:59:24.413593 4901 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6a9df7ea-8e3d-4c6e-a86d-793168703b18-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 02 10:59:24 crc kubenswrapper[4901]: I0202 10:59:24.413605 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wszll\" (UniqueName: \"kubernetes.io/projected/6a9df7ea-8e3d-4c6e-a86d-793168703b18-kube-api-access-wszll\") on node \"crc\" DevicePath \"\"" Feb 02 10:59:24 crc kubenswrapper[4901]: I0202 10:59:24.759548 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-grf2k" event={"ID":"9a2b10de-90f9-49b1-aa3e-92508a93e8ee","Type":"ContainerStarted","Data":"189cac09a9b7419b7ab679faec956a27c371dbc1f64303a8bf54089b023f82ad"} Feb 02 10:59:24 crc kubenswrapper[4901]: I0202 10:59:24.760003 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-grf2k" 
event={"ID":"9a2b10de-90f9-49b1-aa3e-92508a93e8ee","Type":"ContainerStarted","Data":"32301705cc482d23f9494224cd5741c254f15a6d8ee2cfb5653646afe34b7bea"} Feb 02 10:59:24 crc kubenswrapper[4901]: I0202 10:59:24.764660 4901 generic.go:334] "Generic (PLEG): container finished" podID="6a9df7ea-8e3d-4c6e-a86d-793168703b18" containerID="6dc6444cbac5287705ac49e8acdb1327d0b1da8159860944a1d9947205f36de2" exitCode=0 Feb 02 10:59:24 crc kubenswrapper[4901]: I0202 10:59:24.764718 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-7x2x6" event={"ID":"6a9df7ea-8e3d-4c6e-a86d-793168703b18","Type":"ContainerDied","Data":"6dc6444cbac5287705ac49e8acdb1327d0b1da8159860944a1d9947205f36de2"} Feb 02 10:59:24 crc kubenswrapper[4901]: I0202 10:59:24.764767 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-568d7fd7cf-7x2x6" Feb 02 10:59:24 crc kubenswrapper[4901]: I0202 10:59:24.764785 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568d7fd7cf-7x2x6" event={"ID":"6a9df7ea-8e3d-4c6e-a86d-793168703b18","Type":"ContainerDied","Data":"41cd1989f4b53139f24917d8f70419b8e0147613619f056bf0b83bf0bcbc6bdf"} Feb 02 10:59:24 crc kubenswrapper[4901]: I0202 10:59:24.764803 4901 scope.go:117] "RemoveContainer" containerID="6dc6444cbac5287705ac49e8acdb1327d0b1da8159860944a1d9947205f36de2" Feb 02 10:59:24 crc kubenswrapper[4901]: I0202 10:59:24.785144 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-grf2k" podStartSLOduration=2.785122799 podStartE2EDuration="2.785122799s" podCreationTimestamp="2026-02-02 10:59:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:59:24.778106717 +0000 UTC m=+1251.796446823" watchObservedRunningTime="2026-02-02 10:59:24.785122799 +0000 UTC m=+1251.803462895" Feb 02 10:59:24 crc kubenswrapper[4901]: I0202 10:59:24.805361 4901 scope.go:117] "RemoveContainer" containerID="0fdcab80a1f6ff6d38fe4b10e43d9bbf61ee711fef0404c3942a34e50df64d54" Feb 02 10:59:24 crc kubenswrapper[4901]: I0202 10:59:24.816311 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-7x2x6"] Feb 02 10:59:24 crc kubenswrapper[4901]: I0202 10:59:24.827811 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-568d7fd7cf-7x2x6"] Feb 02 10:59:24 crc kubenswrapper[4901]: I0202 10:59:24.830272 4901 scope.go:117] "RemoveContainer" containerID="6dc6444cbac5287705ac49e8acdb1327d0b1da8159860944a1d9947205f36de2" Feb 02 10:59:24 crc kubenswrapper[4901]: E0202 10:59:24.830871 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dc6444cbac5287705ac49e8acdb1327d0b1da8159860944a1d9947205f36de2\": container with ID starting with 6dc6444cbac5287705ac49e8acdb1327d0b1da8159860944a1d9947205f36de2 not found: ID does not exist" containerID="6dc6444cbac5287705ac49e8acdb1327d0b1da8159860944a1d9947205f36de2" Feb 02 10:59:24 crc kubenswrapper[4901]: I0202 10:59:24.830916 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dc6444cbac5287705ac49e8acdb1327d0b1da8159860944a1d9947205f36de2"} err="failed to get container status \"6dc6444cbac5287705ac49e8acdb1327d0b1da8159860944a1d9947205f36de2\": rpc error: code = NotFound desc = could not find container 
\"6dc6444cbac5287705ac49e8acdb1327d0b1da8159860944a1d9947205f36de2\": container with ID starting with 6dc6444cbac5287705ac49e8acdb1327d0b1da8159860944a1d9947205f36de2 not found: ID does not exist" Feb 02 10:59:24 crc kubenswrapper[4901]: I0202 10:59:24.830940 4901 scope.go:117] "RemoveContainer" containerID="0fdcab80a1f6ff6d38fe4b10e43d9bbf61ee711fef0404c3942a34e50df64d54" Feb 02 10:59:24 crc kubenswrapper[4901]: E0202 10:59:24.831335 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fdcab80a1f6ff6d38fe4b10e43d9bbf61ee711fef0404c3942a34e50df64d54\": container with ID starting with 0fdcab80a1f6ff6d38fe4b10e43d9bbf61ee711fef0404c3942a34e50df64d54 not found: ID does not exist" containerID="0fdcab80a1f6ff6d38fe4b10e43d9bbf61ee711fef0404c3942a34e50df64d54" Feb 02 10:59:24 crc kubenswrapper[4901]: I0202 10:59:24.831366 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fdcab80a1f6ff6d38fe4b10e43d9bbf61ee711fef0404c3942a34e50df64d54"} err="failed to get container status \"0fdcab80a1f6ff6d38fe4b10e43d9bbf61ee711fef0404c3942a34e50df64d54\": rpc error: code = NotFound desc = could not find container \"0fdcab80a1f6ff6d38fe4b10e43d9bbf61ee711fef0404c3942a34e50df64d54\": container with ID starting with 0fdcab80a1f6ff6d38fe4b10e43d9bbf61ee711fef0404c3942a34e50df64d54 not found: ID does not exist" Feb 02 10:59:25 crc kubenswrapper[4901]: I0202 10:59:25.695140 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a9df7ea-8e3d-4c6e-a86d-793168703b18" path="/var/lib/kubelet/pods/6a9df7ea-8e3d-4c6e-a86d-793168703b18/volumes" Feb 02 10:59:25 crc kubenswrapper[4901]: I0202 10:59:25.783473 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b","Type":"ContainerStarted","Data":"f62bebad15899c815c185d0d1960e27272ea075e7d2992f531c2ae0e3472ac64"} Feb 02 10:59:25 crc kubenswrapper[4901]: I0202 10:59:25.786278 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 02 10:59:25 crc kubenswrapper[4901]: I0202 10:59:25.822549 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.28271113 podStartE2EDuration="6.822523985s" podCreationTimestamp="2026-02-02 10:59:19 +0000 UTC" firstStartedPulling="2026-02-02 10:59:20.637355453 +0000 UTC m=+1247.655695569" lastFinishedPulling="2026-02-02 10:59:25.177168328 +0000 UTC m=+1252.195508424" observedRunningTime="2026-02-02 10:59:25.820440823 +0000 UTC m=+1252.838780969" watchObservedRunningTime="2026-02-02 10:59:25.822523985 +0000 UTC m=+1252.840864091" Feb 02 10:59:28 crc kubenswrapper[4901]: I0202 10:59:28.827739 4901 generic.go:334] "Generic (PLEG): container finished" podID="9a2b10de-90f9-49b1-aa3e-92508a93e8ee" containerID="189cac09a9b7419b7ab679faec956a27c371dbc1f64303a8bf54089b023f82ad" exitCode=0 Feb 02 10:59:28 crc kubenswrapper[4901]: I0202 10:59:28.828824 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-grf2k" event={"ID":"9a2b10de-90f9-49b1-aa3e-92508a93e8ee","Type":"ContainerDied","Data":"189cac09a9b7419b7ab679faec956a27c371dbc1f64303a8bf54089b023f82ad"} Feb 02 10:59:30 crc kubenswrapper[4901]: I0202 10:59:30.155163 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 02 10:59:30 crc kubenswrapper[4901]: I0202 10:59:30.155794 
4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 02 10:59:30 crc kubenswrapper[4901]: I0202 10:59:30.315356 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-grf2k" Feb 02 10:59:30 crc kubenswrapper[4901]: I0202 10:59:30.505070 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a2b10de-90f9-49b1-aa3e-92508a93e8ee-config-data\") pod \"9a2b10de-90f9-49b1-aa3e-92508a93e8ee\" (UID: \"9a2b10de-90f9-49b1-aa3e-92508a93e8ee\") " Feb 02 10:59:30 crc kubenswrapper[4901]: I0202 10:59:30.505215 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a2b10de-90f9-49b1-aa3e-92508a93e8ee-combined-ca-bundle\") pod \"9a2b10de-90f9-49b1-aa3e-92508a93e8ee\" (UID: \"9a2b10de-90f9-49b1-aa3e-92508a93e8ee\") " Feb 02 10:59:30 crc kubenswrapper[4901]: I0202 10:59:30.505253 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a2b10de-90f9-49b1-aa3e-92508a93e8ee-scripts\") pod \"9a2b10de-90f9-49b1-aa3e-92508a93e8ee\" (UID: \"9a2b10de-90f9-49b1-aa3e-92508a93e8ee\") " Feb 02 10:59:30 crc kubenswrapper[4901]: I0202 10:59:30.505286 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcws4\" (UniqueName: \"kubernetes.io/projected/9a2b10de-90f9-49b1-aa3e-92508a93e8ee-kube-api-access-kcws4\") pod \"9a2b10de-90f9-49b1-aa3e-92508a93e8ee\" (UID: \"9a2b10de-90f9-49b1-aa3e-92508a93e8ee\") " Feb 02 10:59:30 crc kubenswrapper[4901]: I0202 10:59:30.513758 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a2b10de-90f9-49b1-aa3e-92508a93e8ee-kube-api-access-kcws4" (OuterVolumeSpecName: "kube-api-access-kcws4") pod "9a2b10de-90f9-49b1-aa3e-92508a93e8ee" (UID: "9a2b10de-90f9-49b1-aa3e-92508a93e8ee"). InnerVolumeSpecName "kube-api-access-kcws4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:59:30 crc kubenswrapper[4901]: I0202 10:59:30.515020 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a2b10de-90f9-49b1-aa3e-92508a93e8ee-scripts" (OuterVolumeSpecName: "scripts") pod "9a2b10de-90f9-49b1-aa3e-92508a93e8ee" (UID: "9a2b10de-90f9-49b1-aa3e-92508a93e8ee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:59:30 crc kubenswrapper[4901]: I0202 10:59:30.547466 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a2b10de-90f9-49b1-aa3e-92508a93e8ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9a2b10de-90f9-49b1-aa3e-92508a93e8ee" (UID: "9a2b10de-90f9-49b1-aa3e-92508a93e8ee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:59:30 crc kubenswrapper[4901]: I0202 10:59:30.566974 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a2b10de-90f9-49b1-aa3e-92508a93e8ee-config-data" (OuterVolumeSpecName: "config-data") pod "9a2b10de-90f9-49b1-aa3e-92508a93e8ee" (UID: "9a2b10de-90f9-49b1-aa3e-92508a93e8ee"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:59:30 crc kubenswrapper[4901]: I0202 10:59:30.607947 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a2b10de-90f9-49b1-aa3e-92508a93e8ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:59:30 crc kubenswrapper[4901]: I0202 10:59:30.607992 4901 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a2b10de-90f9-49b1-aa3e-92508a93e8ee-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:59:30 crc kubenswrapper[4901]: I0202 10:59:30.608005 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcws4\" (UniqueName: \"kubernetes.io/projected/9a2b10de-90f9-49b1-aa3e-92508a93e8ee-kube-api-access-kcws4\") on node \"crc\" DevicePath \"\"" Feb 02 10:59:30 crc kubenswrapper[4901]: I0202 10:59:30.608020 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a2b10de-90f9-49b1-aa3e-92508a93e8ee-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:59:30 crc kubenswrapper[4901]: I0202 10:59:30.858900 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-grf2k" event={"ID":"9a2b10de-90f9-49b1-aa3e-92508a93e8ee","Type":"ContainerDied","Data":"32301705cc482d23f9494224cd5741c254f15a6d8ee2cfb5653646afe34b7bea"} Feb 02 10:59:30 crc kubenswrapper[4901]: I0202 10:59:30.859143 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32301705cc482d23f9494224cd5741c254f15a6d8ee2cfb5653646afe34b7bea" Feb 02 10:59:30 crc kubenswrapper[4901]: I0202 10:59:30.859019 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-grf2k" Feb 02 10:59:31 crc kubenswrapper[4901]: I0202 10:59:31.132243 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:59:31 crc kubenswrapper[4901]: I0202 10:59:31.132455 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="d15234c6-a14d-46ac-8cc6-d41b50d18589" containerName="nova-scheduler-scheduler" containerID="cri-o://a992aa592b005f987cb023606d6ef03dc8a09c6d9931f48b50eef1b22ae07026" gracePeriod=30 Feb 02 10:59:31 crc kubenswrapper[4901]: I0202 10:59:31.165020 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:59:31 crc kubenswrapper[4901]: I0202 10:59:31.165468 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c8eb5214-26b8-472b-be31-2604d1f0a7a0" containerName="nova-api-log" containerID="cri-o://6a312e1ca26e3585548a8702c20996b52c437d7f455f3cd2e8431b7e2c21ebb9" gracePeriod=30 Feb 02 10:59:31 crc kubenswrapper[4901]: I0202 10:59:31.166180 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c8eb5214-26b8-472b-be31-2604d1f0a7a0" containerName="nova-api-api" containerID="cri-o://31d6ad075c89d67ba4472968a89231bf702b9b96399061ae9361d92e09272df9" gracePeriod=30 Feb 02 10:59:31 crc kubenswrapper[4901]: I0202 10:59:31.169566 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c8eb5214-26b8-472b-be31-2604d1f0a7a0" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.214:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 02 
10:59:31 crc kubenswrapper[4901]: I0202 10:59:31.169671 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c8eb5214-26b8-472b-be31-2604d1f0a7a0" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.214:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 02 10:59:31 crc kubenswrapper[4901]: I0202 10:59:31.187288 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:59:31 crc kubenswrapper[4901]: I0202 10:59:31.187984 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4825fb51-1602-4350-89ed-9bc8cea66c2c" containerName="nova-metadata-log" containerID="cri-o://5072faf80b0390581b5506d7f66af631c77a96de5d23a7295c6b58ea5c6d31d0" gracePeriod=30 Feb 02 10:59:31 crc kubenswrapper[4901]: I0202 10:59:31.188235 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4825fb51-1602-4350-89ed-9bc8cea66c2c" containerName="nova-metadata-metadata" containerID="cri-o://13d03bbc6533e506ab4b10275c3b14f6a3dab9ddc41162e02fa1fac0e5cbb3fa" gracePeriod=30 Feb 02 10:59:31 crc kubenswrapper[4901]: I0202 10:59:31.873889 4901 generic.go:334] "Generic (PLEG): container finished" podID="c8eb5214-26b8-472b-be31-2604d1f0a7a0" containerID="6a312e1ca26e3585548a8702c20996b52c437d7f455f3cd2e8431b7e2c21ebb9" exitCode=143 Feb 02 10:59:31 crc kubenswrapper[4901]: I0202 10:59:31.874098 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c8eb5214-26b8-472b-be31-2604d1f0a7a0","Type":"ContainerDied","Data":"6a312e1ca26e3585548a8702c20996b52c437d7f455f3cd2e8431b7e2c21ebb9"} Feb 02 10:59:31 crc kubenswrapper[4901]: I0202 10:59:31.876164 4901 generic.go:334] "Generic (PLEG): container finished" podID="d15234c6-a14d-46ac-8cc6-d41b50d18589" containerID="a992aa592b005f987cb023606d6ef03dc8a09c6d9931f48b50eef1b22ae07026" exitCode=0 Feb 02 10:59:31 crc kubenswrapper[4901]: I0202 10:59:31.876215 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d15234c6-a14d-46ac-8cc6-d41b50d18589","Type":"ContainerDied","Data":"a992aa592b005f987cb023606d6ef03dc8a09c6d9931f48b50eef1b22ae07026"} Feb 02 10:59:31 crc kubenswrapper[4901]: I0202 10:59:31.884424 4901 generic.go:334] "Generic (PLEG): container finished" podID="4825fb51-1602-4350-89ed-9bc8cea66c2c" containerID="5072faf80b0390581b5506d7f66af631c77a96de5d23a7295c6b58ea5c6d31d0" exitCode=143 Feb 02 10:59:31 crc kubenswrapper[4901]: I0202 10:59:31.884463 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4825fb51-1602-4350-89ed-9bc8cea66c2c","Type":"ContainerDied","Data":"5072faf80b0390581b5506d7f66af631c77a96de5d23a7295c6b58ea5c6d31d0"} Feb 02 10:59:32 crc kubenswrapper[4901]: I0202 10:59:32.106707 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 10:59:32 crc kubenswrapper[4901]: I0202 10:59:32.245650 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d15234c6-a14d-46ac-8cc6-d41b50d18589-config-data\") pod \"d15234c6-a14d-46ac-8cc6-d41b50d18589\" (UID: \"d15234c6-a14d-46ac-8cc6-d41b50d18589\") " Feb 02 10:59:32 crc kubenswrapper[4901]: I0202 10:59:32.245741 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d15234c6-a14d-46ac-8cc6-d41b50d18589-combined-ca-bundle\") pod \"d15234c6-a14d-46ac-8cc6-d41b50d18589\" (UID: \"d15234c6-a14d-46ac-8cc6-d41b50d18589\") " Feb 02 10:59:32 crc kubenswrapper[4901]: I0202 10:59:32.245816 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gw28p\" (UniqueName: \"kubernetes.io/projected/d15234c6-a14d-46ac-8cc6-d41b50d18589-kube-api-access-gw28p\") pod \"d15234c6-a14d-46ac-8cc6-d41b50d18589\" (UID: \"d15234c6-a14d-46ac-8cc6-d41b50d18589\") " Feb 02 10:59:32 crc kubenswrapper[4901]: I0202 10:59:32.253036 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d15234c6-a14d-46ac-8cc6-d41b50d18589-kube-api-access-gw28p" (OuterVolumeSpecName: "kube-api-access-gw28p") pod "d15234c6-a14d-46ac-8cc6-d41b50d18589" (UID: "d15234c6-a14d-46ac-8cc6-d41b50d18589"). InnerVolumeSpecName "kube-api-access-gw28p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:59:32 crc kubenswrapper[4901]: I0202 10:59:32.280146 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d15234c6-a14d-46ac-8cc6-d41b50d18589-config-data" (OuterVolumeSpecName: "config-data") pod "d15234c6-a14d-46ac-8cc6-d41b50d18589" (UID: "d15234c6-a14d-46ac-8cc6-d41b50d18589"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:59:32 crc kubenswrapper[4901]: I0202 10:59:32.282707 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d15234c6-a14d-46ac-8cc6-d41b50d18589-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d15234c6-a14d-46ac-8cc6-d41b50d18589" (UID: "d15234c6-a14d-46ac-8cc6-d41b50d18589"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:59:32 crc kubenswrapper[4901]: I0202 10:59:32.348597 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d15234c6-a14d-46ac-8cc6-d41b50d18589-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:59:32 crc kubenswrapper[4901]: I0202 10:59:32.348629 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d15234c6-a14d-46ac-8cc6-d41b50d18589-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:59:32 crc kubenswrapper[4901]: I0202 10:59:32.348643 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gw28p\" (UniqueName: \"kubernetes.io/projected/d15234c6-a14d-46ac-8cc6-d41b50d18589-kube-api-access-gw28p\") on node \"crc\" DevicePath \"\"" Feb 02 10:59:32 crc kubenswrapper[4901]: I0202 10:59:32.899970 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d15234c6-a14d-46ac-8cc6-d41b50d18589","Type":"ContainerDied","Data":"4b96f89c183fbf97cbda15fde818aa75132e00a38172a554b1cf73221bc49f72"} Feb 02 10:59:32 crc kubenswrapper[4901]: I0202 10:59:32.900101 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 10:59:32 crc kubenswrapper[4901]: I0202 10:59:32.900362 4901 scope.go:117] "RemoveContainer" containerID="a992aa592b005f987cb023606d6ef03dc8a09c6d9931f48b50eef1b22ae07026" Feb 02 10:59:32 crc kubenswrapper[4901]: I0202 10:59:32.971420 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:59:32 crc kubenswrapper[4901]: I0202 10:59:32.997212 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:59:33 crc kubenswrapper[4901]: I0202 10:59:33.005906 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:59:33 crc kubenswrapper[4901]: E0202 10:59:33.006361 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a9df7ea-8e3d-4c6e-a86d-793168703b18" containerName="dnsmasq-dns" Feb 02 10:59:33 crc kubenswrapper[4901]: I0202 10:59:33.006377 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a9df7ea-8e3d-4c6e-a86d-793168703b18" containerName="dnsmasq-dns" Feb 02 10:59:33 crc kubenswrapper[4901]: E0202 10:59:33.006407 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a2b10de-90f9-49b1-aa3e-92508a93e8ee" containerName="nova-manage" Feb 02 10:59:33 crc kubenswrapper[4901]: I0202 10:59:33.006417 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a2b10de-90f9-49b1-aa3e-92508a93e8ee" containerName="nova-manage" Feb 02 10:59:33 crc kubenswrapper[4901]: E0202 10:59:33.006433 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d15234c6-a14d-46ac-8cc6-d41b50d18589" containerName="nova-scheduler-scheduler" Feb 02 10:59:33 crc kubenswrapper[4901]: I0202 10:59:33.006442 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="d15234c6-a14d-46ac-8cc6-d41b50d18589" containerName="nova-scheduler-scheduler" Feb 02 10:59:33 crc kubenswrapper[4901]: E0202 10:59:33.006474 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a9df7ea-8e3d-4c6e-a86d-793168703b18" containerName="init" Feb 02 10:59:33 crc kubenswrapper[4901]: I0202 10:59:33.006482 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a9df7ea-8e3d-4c6e-a86d-793168703b18" containerName="init" Feb 02 
10:59:33 crc kubenswrapper[4901]: I0202 10:59:33.007222 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a9df7ea-8e3d-4c6e-a86d-793168703b18" containerName="dnsmasq-dns" Feb 02 10:59:33 crc kubenswrapper[4901]: I0202 10:59:33.007245 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="d15234c6-a14d-46ac-8cc6-d41b50d18589" containerName="nova-scheduler-scheduler" Feb 02 10:59:33 crc kubenswrapper[4901]: I0202 10:59:33.007262 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a2b10de-90f9-49b1-aa3e-92508a93e8ee" containerName="nova-manage" Feb 02 10:59:33 crc kubenswrapper[4901]: I0202 10:59:33.008381 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 10:59:33 crc kubenswrapper[4901]: I0202 10:59:33.010243 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 02 10:59:33 crc kubenswrapper[4901]: I0202 10:59:33.013934 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:59:33 crc kubenswrapper[4901]: I0202 10:59:33.165236 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c039a87a-6a86-4dc2-8760-363c1e2f62d8-config-data\") pod \"nova-scheduler-0\" (UID: \"c039a87a-6a86-4dc2-8760-363c1e2f62d8\") " pod="openstack/nova-scheduler-0" Feb 02 10:59:33 crc kubenswrapper[4901]: I0202 10:59:33.165295 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c039a87a-6a86-4dc2-8760-363c1e2f62d8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c039a87a-6a86-4dc2-8760-363c1e2f62d8\") " pod="openstack/nova-scheduler-0" Feb 02 10:59:33 crc kubenswrapper[4901]: I0202 10:59:33.165573 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c79h\" (UniqueName: \"kubernetes.io/projected/c039a87a-6a86-4dc2-8760-363c1e2f62d8-kube-api-access-7c79h\") pod \"nova-scheduler-0\" (UID: \"c039a87a-6a86-4dc2-8760-363c1e2f62d8\") " pod="openstack/nova-scheduler-0" Feb 02 10:59:33 crc kubenswrapper[4901]: I0202 10:59:33.267405 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c79h\" (UniqueName: \"kubernetes.io/projected/c039a87a-6a86-4dc2-8760-363c1e2f62d8-kube-api-access-7c79h\") pod \"nova-scheduler-0\" (UID: \"c039a87a-6a86-4dc2-8760-363c1e2f62d8\") " pod="openstack/nova-scheduler-0" Feb 02 10:59:33 crc kubenswrapper[4901]: I0202 10:59:33.267603 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c039a87a-6a86-4dc2-8760-363c1e2f62d8-config-data\") pod \"nova-scheduler-0\" (UID: \"c039a87a-6a86-4dc2-8760-363c1e2f62d8\") " pod="openstack/nova-scheduler-0" Feb 02 10:59:33 crc kubenswrapper[4901]: I0202 10:59:33.267646 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c039a87a-6a86-4dc2-8760-363c1e2f62d8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c039a87a-6a86-4dc2-8760-363c1e2f62d8\") " pod="openstack/nova-scheduler-0" Feb 02 10:59:33 crc kubenswrapper[4901]: I0202 10:59:33.271821 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c039a87a-6a86-4dc2-8760-363c1e2f62d8-config-data\") pod \"nova-scheduler-0\" (UID: \"c039a87a-6a86-4dc2-8760-363c1e2f62d8\") " pod="openstack/nova-scheduler-0" Feb 02 10:59:33 crc kubenswrapper[4901]: I0202 10:59:33.282271 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c039a87a-6a86-4dc2-8760-363c1e2f62d8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c039a87a-6a86-4dc2-8760-363c1e2f62d8\") " pod="openstack/nova-scheduler-0" Feb 02 10:59:33 crc kubenswrapper[4901]: I0202 10:59:33.291195 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c79h\" (UniqueName: \"kubernetes.io/projected/c039a87a-6a86-4dc2-8760-363c1e2f62d8-kube-api-access-7c79h\") pod \"nova-scheduler-0\" (UID: \"c039a87a-6a86-4dc2-8760-363c1e2f62d8\") " pod="openstack/nova-scheduler-0" Feb 02 10:59:33 crc kubenswrapper[4901]: I0202 10:59:33.327373 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 10:59:33 crc kubenswrapper[4901]: I0202 10:59:33.691427 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d15234c6-a14d-46ac-8cc6-d41b50d18589" path="/var/lib/kubelet/pods/d15234c6-a14d-46ac-8cc6-d41b50d18589/volumes" Feb 02 10:59:33 crc kubenswrapper[4901]: I0202 10:59:33.793784 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:59:33 crc kubenswrapper[4901]: I0202 10:59:33.918650 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c039a87a-6a86-4dc2-8760-363c1e2f62d8","Type":"ContainerStarted","Data":"b215ce5ed184fd54073c07f35db45a1c1dada5b5a7e0b03b44500305273c3bc8"} Feb 02 10:59:34 crc kubenswrapper[4901]: I0202 10:59:34.334766 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="4825fb51-1602-4350-89ed-9bc8cea66c2c" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": read tcp 10.217.0.2:35568->10.217.0.206:8775: read: connection reset by peer" Feb 02 10:59:34 crc kubenswrapper[4901]: I0202 10:59:34.334913 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="4825fb51-1602-4350-89ed-9bc8cea66c2c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": read tcp 10.217.0.2:35578->10.217.0.206:8775: read: connection reset by peer" Feb 02 10:59:34 crc kubenswrapper[4901]: I0202 10:59:34.832429 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 10:59:34 crc kubenswrapper[4901]: I0202 10:59:34.935709 4901 generic.go:334] "Generic (PLEG): container finished" podID="4825fb51-1602-4350-89ed-9bc8cea66c2c" containerID="13d03bbc6533e506ab4b10275c3b14f6a3dab9ddc41162e02fa1fac0e5cbb3fa" exitCode=0 Feb 02 10:59:34 crc kubenswrapper[4901]: I0202 10:59:34.935785 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 10:59:34 crc kubenswrapper[4901]: I0202 10:59:34.935806 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4825fb51-1602-4350-89ed-9bc8cea66c2c","Type":"ContainerDied","Data":"13d03bbc6533e506ab4b10275c3b14f6a3dab9ddc41162e02fa1fac0e5cbb3fa"} Feb 02 10:59:34 crc kubenswrapper[4901]: I0202 10:59:34.935866 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4825fb51-1602-4350-89ed-9bc8cea66c2c","Type":"ContainerDied","Data":"5942469404a218679d59b4804d1d605c66ec659fab9d0af4cbbd21fded32cae0"} Feb 02 10:59:34 crc kubenswrapper[4901]: I0202 10:59:34.935889 4901 scope.go:117] "RemoveContainer" containerID="13d03bbc6533e506ab4b10275c3b14f6a3dab9ddc41162e02fa1fac0e5cbb3fa" Feb 02 10:59:34 crc kubenswrapper[4901]: I0202 10:59:34.938376 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c039a87a-6a86-4dc2-8760-363c1e2f62d8","Type":"ContainerStarted","Data":"6fa0033d8ab3fdaaeacefa14a4ac78781f653ac8d6d240541f368ca920a67e27"} Feb 02 10:59:34 crc kubenswrapper[4901]: I0202 10:59:34.967237 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.967214464 podStartE2EDuration="2.967214464s" podCreationTimestamp="2026-02-02 10:59:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:59:34.956387408 +0000 UTC m=+1261.974727514" watchObservedRunningTime="2026-02-02 10:59:34.967214464 +0000 UTC m=+1261.985554560" Feb 02 10:59:34 crc kubenswrapper[4901]: I0202 10:59:34.970280 4901 scope.go:117] "RemoveContainer" containerID="5072faf80b0390581b5506d7f66af631c77a96de5d23a7295c6b58ea5c6d31d0" Feb 02 10:59:34 crc kubenswrapper[4901]: I0202 10:59:34.996921 4901 scope.go:117] "RemoveContainer" containerID="13d03bbc6533e506ab4b10275c3b14f6a3dab9ddc41162e02fa1fac0e5cbb3fa" Feb 02 10:59:34 crc kubenswrapper[4901]: E0202 10:59:34.997650 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13d03bbc6533e506ab4b10275c3b14f6a3dab9ddc41162e02fa1fac0e5cbb3fa\": container with ID starting with 13d03bbc6533e506ab4b10275c3b14f6a3dab9ddc41162e02fa1fac0e5cbb3fa not found: ID does not exist" containerID="13d03bbc6533e506ab4b10275c3b14f6a3dab9ddc41162e02fa1fac0e5cbb3fa" Feb 02 10:59:34 crc kubenswrapper[4901]: I0202 10:59:34.997691 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13d03bbc6533e506ab4b10275c3b14f6a3dab9ddc41162e02fa1fac0e5cbb3fa"} err="failed to get container status \"13d03bbc6533e506ab4b10275c3b14f6a3dab9ddc41162e02fa1fac0e5cbb3fa\": rpc error: code = NotFound desc = could not find container \"13d03bbc6533e506ab4b10275c3b14f6a3dab9ddc41162e02fa1fac0e5cbb3fa\": container with ID starting with 13d03bbc6533e506ab4b10275c3b14f6a3dab9ddc41162e02fa1fac0e5cbb3fa not found: ID does not exist" Feb 02 10:59:34 crc kubenswrapper[4901]: I0202 10:59:34.997718 4901 scope.go:117] "RemoveContainer" containerID="5072faf80b0390581b5506d7f66af631c77a96de5d23a7295c6b58ea5c6d31d0" Feb 02 10:59:34 crc kubenswrapper[4901]: E0202 10:59:34.998021 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5072faf80b0390581b5506d7f66af631c77a96de5d23a7295c6b58ea5c6d31d0\": 
container with ID starting with 5072faf80b0390581b5506d7f66af631c77a96de5d23a7295c6b58ea5c6d31d0 not found: ID does not exist" containerID="5072faf80b0390581b5506d7f66af631c77a96de5d23a7295c6b58ea5c6d31d0" Feb 02 10:59:34 crc kubenswrapper[4901]: I0202 10:59:34.998044 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5072faf80b0390581b5506d7f66af631c77a96de5d23a7295c6b58ea5c6d31d0"} err="failed to get container status \"5072faf80b0390581b5506d7f66af631c77a96de5d23a7295c6b58ea5c6d31d0\": rpc error: code = NotFound desc = could not find container \"5072faf80b0390581b5506d7f66af631c77a96de5d23a7295c6b58ea5c6d31d0\": container with ID starting with 5072faf80b0390581b5506d7f66af631c77a96de5d23a7295c6b58ea5c6d31d0 not found: ID does not exist" Feb 02 10:59:35 crc kubenswrapper[4901]: I0202 10:59:35.006606 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdrvp\" (UniqueName: \"kubernetes.io/projected/4825fb51-1602-4350-89ed-9bc8cea66c2c-kube-api-access-kdrvp\") pod \"4825fb51-1602-4350-89ed-9bc8cea66c2c\" (UID: \"4825fb51-1602-4350-89ed-9bc8cea66c2c\") " Feb 02 10:59:35 crc kubenswrapper[4901]: I0202 10:59:35.006891 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4825fb51-1602-4350-89ed-9bc8cea66c2c-nova-metadata-tls-certs\") pod \"4825fb51-1602-4350-89ed-9bc8cea66c2c\" (UID: \"4825fb51-1602-4350-89ed-9bc8cea66c2c\") " Feb 02 10:59:35 crc kubenswrapper[4901]: I0202 10:59:35.007032 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4825fb51-1602-4350-89ed-9bc8cea66c2c-logs\") pod \"4825fb51-1602-4350-89ed-9bc8cea66c2c\" (UID: \"4825fb51-1602-4350-89ed-9bc8cea66c2c\") " Feb 02 10:59:35 crc kubenswrapper[4901]: I0202 10:59:35.007096 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4825fb51-1602-4350-89ed-9bc8cea66c2c-combined-ca-bundle\") pod \"4825fb51-1602-4350-89ed-9bc8cea66c2c\" (UID: \"4825fb51-1602-4350-89ed-9bc8cea66c2c\") " Feb 02 10:59:35 crc kubenswrapper[4901]: I0202 10:59:35.007127 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4825fb51-1602-4350-89ed-9bc8cea66c2c-config-data\") pod \"4825fb51-1602-4350-89ed-9bc8cea66c2c\" (UID: \"4825fb51-1602-4350-89ed-9bc8cea66c2c\") " Feb 02 10:59:35 crc kubenswrapper[4901]: I0202 10:59:35.011244 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4825fb51-1602-4350-89ed-9bc8cea66c2c-logs" (OuterVolumeSpecName: "logs") pod "4825fb51-1602-4350-89ed-9bc8cea66c2c" (UID: "4825fb51-1602-4350-89ed-9bc8cea66c2c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:59:35 crc kubenswrapper[4901]: I0202 10:59:35.016505 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4825fb51-1602-4350-89ed-9bc8cea66c2c-kube-api-access-kdrvp" (OuterVolumeSpecName: "kube-api-access-kdrvp") pod "4825fb51-1602-4350-89ed-9bc8cea66c2c" (UID: "4825fb51-1602-4350-89ed-9bc8cea66c2c"). InnerVolumeSpecName "kube-api-access-kdrvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:59:35 crc kubenswrapper[4901]: I0202 10:59:35.042218 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4825fb51-1602-4350-89ed-9bc8cea66c2c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4825fb51-1602-4350-89ed-9bc8cea66c2c" (UID: "4825fb51-1602-4350-89ed-9bc8cea66c2c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:59:35 crc kubenswrapper[4901]: I0202 10:59:35.049212 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4825fb51-1602-4350-89ed-9bc8cea66c2c-config-data" (OuterVolumeSpecName: "config-data") pod "4825fb51-1602-4350-89ed-9bc8cea66c2c" (UID: "4825fb51-1602-4350-89ed-9bc8cea66c2c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:59:35 crc kubenswrapper[4901]: I0202 10:59:35.071770 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4825fb51-1602-4350-89ed-9bc8cea66c2c-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "4825fb51-1602-4350-89ed-9bc8cea66c2c" (UID: "4825fb51-1602-4350-89ed-9bc8cea66c2c"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:59:35 crc kubenswrapper[4901]: I0202 10:59:35.109754 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4825fb51-1602-4350-89ed-9bc8cea66c2c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:59:35 crc kubenswrapper[4901]: I0202 10:59:35.109784 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4825fb51-1602-4350-89ed-9bc8cea66c2c-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:59:35 crc kubenswrapper[4901]: I0202 10:59:35.109794 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdrvp\" (UniqueName: \"kubernetes.io/projected/4825fb51-1602-4350-89ed-9bc8cea66c2c-kube-api-access-kdrvp\") on node \"crc\" DevicePath \"\"" Feb 02 10:59:35 crc kubenswrapper[4901]: I0202 10:59:35.109804 4901 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4825fb51-1602-4350-89ed-9bc8cea66c2c-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:59:35 crc kubenswrapper[4901]: I0202 10:59:35.109813 4901 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4825fb51-1602-4350-89ed-9bc8cea66c2c-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:59:35 crc kubenswrapper[4901]: I0202 10:59:35.342363 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:59:35 crc kubenswrapper[4901]: I0202 10:59:35.368061 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:59:35 crc kubenswrapper[4901]: I0202 10:59:35.391124 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:59:35 crc kubenswrapper[4901]: E0202 10:59:35.391685 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4825fb51-1602-4350-89ed-9bc8cea66c2c" containerName="nova-metadata-metadata" Feb 02 10:59:35 crc kubenswrapper[4901]: I0202 10:59:35.391706 4901 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4825fb51-1602-4350-89ed-9bc8cea66c2c" containerName="nova-metadata-metadata" Feb 02 10:59:35 crc kubenswrapper[4901]: E0202 10:59:35.391738 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4825fb51-1602-4350-89ed-9bc8cea66c2c" containerName="nova-metadata-log" Feb 02 10:59:35 crc kubenswrapper[4901]: I0202 10:59:35.391745 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="4825fb51-1602-4350-89ed-9bc8cea66c2c" containerName="nova-metadata-log" Feb 02 10:59:35 crc kubenswrapper[4901]: I0202 10:59:35.391969 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="4825fb51-1602-4350-89ed-9bc8cea66c2c" containerName="nova-metadata-metadata" Feb 02 10:59:35 crc kubenswrapper[4901]: I0202 10:59:35.392003 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="4825fb51-1602-4350-89ed-9bc8cea66c2c" containerName="nova-metadata-log" Feb 02 10:59:35 crc kubenswrapper[4901]: I0202 10:59:35.393288 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 10:59:35 crc kubenswrapper[4901]: I0202 10:59:35.396381 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 02 10:59:35 crc kubenswrapper[4901]: I0202 10:59:35.396653 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 02 10:59:35 crc kubenswrapper[4901]: I0202 10:59:35.405051 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:59:35 crc kubenswrapper[4901]: I0202 10:59:35.572210 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4135f49b-1390-4600-8855-c9311c0cdf11-logs\") pod \"nova-metadata-0\" (UID: \"4135f49b-1390-4600-8855-c9311c0cdf11\") " pod="openstack/nova-metadata-0" Feb 02 10:59:35 crc kubenswrapper[4901]: I0202 10:59:35.572317 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4135f49b-1390-4600-8855-c9311c0cdf11-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4135f49b-1390-4600-8855-c9311c0cdf11\") " pod="openstack/nova-metadata-0" Feb 02 10:59:35 crc kubenswrapper[4901]: I0202 10:59:35.572392 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgjbh\" (UniqueName: \"kubernetes.io/projected/4135f49b-1390-4600-8855-c9311c0cdf11-kube-api-access-wgjbh\") pod \"nova-metadata-0\" (UID: \"4135f49b-1390-4600-8855-c9311c0cdf11\") " pod="openstack/nova-metadata-0" Feb 02 10:59:35 crc kubenswrapper[4901]: I0202 10:59:35.572460 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4135f49b-1390-4600-8855-c9311c0cdf11-config-data\") pod \"nova-metadata-0\" (UID: \"4135f49b-1390-4600-8855-c9311c0cdf11\") " pod="openstack/nova-metadata-0" Feb 02 10:59:35 crc kubenswrapper[4901]: I0202 10:59:35.572662 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4135f49b-1390-4600-8855-c9311c0cdf11-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4135f49b-1390-4600-8855-c9311c0cdf11\") " pod="openstack/nova-metadata-0" Feb 02 10:59:35 crc kubenswrapper[4901]: I0202 10:59:35.675097 4901 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4135f49b-1390-4600-8855-c9311c0cdf11-logs\") pod \"nova-metadata-0\" (UID: \"4135f49b-1390-4600-8855-c9311c0cdf11\") " pod="openstack/nova-metadata-0" Feb 02 10:59:35 crc kubenswrapper[4901]: I0202 10:59:35.675182 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4135f49b-1390-4600-8855-c9311c0cdf11-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4135f49b-1390-4600-8855-c9311c0cdf11\") " pod="openstack/nova-metadata-0" Feb 02 10:59:35 crc kubenswrapper[4901]: I0202 10:59:35.675220 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgjbh\" (UniqueName: \"kubernetes.io/projected/4135f49b-1390-4600-8855-c9311c0cdf11-kube-api-access-wgjbh\") pod \"nova-metadata-0\" (UID: \"4135f49b-1390-4600-8855-c9311c0cdf11\") " pod="openstack/nova-metadata-0" Feb 02 10:59:35 crc kubenswrapper[4901]: I0202 10:59:35.675282 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4135f49b-1390-4600-8855-c9311c0cdf11-config-data\") pod \"nova-metadata-0\" (UID: \"4135f49b-1390-4600-8855-c9311c0cdf11\") " pod="openstack/nova-metadata-0" Feb 02 10:59:35 crc kubenswrapper[4901]: I0202 10:59:35.675340 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4135f49b-1390-4600-8855-c9311c0cdf11-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4135f49b-1390-4600-8855-c9311c0cdf11\") " pod="openstack/nova-metadata-0" Feb 02 10:59:35 crc kubenswrapper[4901]: I0202 10:59:35.676925 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4135f49b-1390-4600-8855-c9311c0cdf11-logs\") pod \"nova-metadata-0\" (UID: \"4135f49b-1390-4600-8855-c9311c0cdf11\") " pod="openstack/nova-metadata-0" Feb 02 10:59:35 crc kubenswrapper[4901]: I0202 10:59:35.681146 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4135f49b-1390-4600-8855-c9311c0cdf11-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4135f49b-1390-4600-8855-c9311c0cdf11\") " pod="openstack/nova-metadata-0" Feb 02 10:59:35 crc kubenswrapper[4901]: I0202 10:59:35.682270 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4135f49b-1390-4600-8855-c9311c0cdf11-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4135f49b-1390-4600-8855-c9311c0cdf11\") " pod="openstack/nova-metadata-0" Feb 02 10:59:35 crc kubenswrapper[4901]: I0202 10:59:35.697344 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4825fb51-1602-4350-89ed-9bc8cea66c2c" path="/var/lib/kubelet/pods/4825fb51-1602-4350-89ed-9bc8cea66c2c/volumes" Feb 02 10:59:35 crc kubenswrapper[4901]: I0202 10:59:35.698265 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4135f49b-1390-4600-8855-c9311c0cdf11-config-data\") pod \"nova-metadata-0\" (UID: \"4135f49b-1390-4600-8855-c9311c0cdf11\") " pod="openstack/nova-metadata-0" Feb 02 10:59:35 crc kubenswrapper[4901]: I0202 10:59:35.705814 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wgjbh\" (UniqueName: \"kubernetes.io/projected/4135f49b-1390-4600-8855-c9311c0cdf11-kube-api-access-wgjbh\") pod \"nova-metadata-0\" (UID: \"4135f49b-1390-4600-8855-c9311c0cdf11\") " pod="openstack/nova-metadata-0" Feb 02 10:59:35 crc kubenswrapper[4901]: I0202 10:59:35.762045 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 10:59:36 crc kubenswrapper[4901]: I0202 10:59:36.306957 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:59:36 crc kubenswrapper[4901]: E0202 10:59:36.911786 4901 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8eb5214_26b8_472b_be31_2604d1f0a7a0.slice/crio-conmon-31d6ad075c89d67ba4472968a89231bf702b9b96399061ae9361d92e09272df9.scope\": RecentStats: unable to find data in memory cache]" Feb 02 10:59:36 crc kubenswrapper[4901]: I0202 10:59:36.967536 4901 generic.go:334] "Generic (PLEG): container finished" podID="c8eb5214-26b8-472b-be31-2604d1f0a7a0" containerID="31d6ad075c89d67ba4472968a89231bf702b9b96399061ae9361d92e09272df9" exitCode=0 Feb 02 10:59:36 crc kubenswrapper[4901]: I0202 10:59:36.967612 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c8eb5214-26b8-472b-be31-2604d1f0a7a0","Type":"ContainerDied","Data":"31d6ad075c89d67ba4472968a89231bf702b9b96399061ae9361d92e09272df9"} Feb 02 10:59:36 crc kubenswrapper[4901]: I0202 10:59:36.969577 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4135f49b-1390-4600-8855-c9311c0cdf11","Type":"ContainerStarted","Data":"3cc3c9bdf2e396945cc8da32aba6b4b54046e97ed97688bb15dbda160cc13375"} Feb 02 10:59:36 crc kubenswrapper[4901]: I0202 10:59:36.969654 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4135f49b-1390-4600-8855-c9311c0cdf11","Type":"ContainerStarted","Data":"4ae9841c772e511a5f14a496ba0805c9fd1d9885a4ebfc0162b9ef5af275bcf9"} Feb 02 10:59:36 crc kubenswrapper[4901]: I0202 10:59:36.969666 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4135f49b-1390-4600-8855-c9311c0cdf11","Type":"ContainerStarted","Data":"6fae9409f42b39cdeb75ee6eaf908a63db875e42e10990b9425a0022741bd0eb"} Feb 02 10:59:36 crc kubenswrapper[4901]: I0202 10:59:36.999703 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.999674625 podStartE2EDuration="1.999674625s" podCreationTimestamp="2026-02-02 10:59:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:59:36.99133982 +0000 UTC m=+1264.009679916" watchObservedRunningTime="2026-02-02 10:59:36.999674625 +0000 UTC m=+1264.018014721" Feb 02 10:59:37 crc kubenswrapper[4901]: I0202 10:59:37.066428 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 02 10:59:37 crc kubenswrapper[4901]: I0202 10:59:37.217941 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpxxv\" (UniqueName: \"kubernetes.io/projected/c8eb5214-26b8-472b-be31-2604d1f0a7a0-kube-api-access-bpxxv\") pod \"c8eb5214-26b8-472b-be31-2604d1f0a7a0\" (UID: \"c8eb5214-26b8-472b-be31-2604d1f0a7a0\") " Feb 02 10:59:37 crc kubenswrapper[4901]: I0202 10:59:37.218742 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8eb5214-26b8-472b-be31-2604d1f0a7a0-logs\") pod \"c8eb5214-26b8-472b-be31-2604d1f0a7a0\" (UID: \"c8eb5214-26b8-472b-be31-2604d1f0a7a0\") " Feb 02 10:59:37 crc kubenswrapper[4901]: I0202 10:59:37.218842 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8eb5214-26b8-472b-be31-2604d1f0a7a0-internal-tls-certs\") pod \"c8eb5214-26b8-472b-be31-2604d1f0a7a0\" (UID: \"c8eb5214-26b8-472b-be31-2604d1f0a7a0\") " Feb 02 10:59:37 crc kubenswrapper[4901]: I0202 10:59:37.218980 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8eb5214-26b8-472b-be31-2604d1f0a7a0-config-data\") pod \"c8eb5214-26b8-472b-be31-2604d1f0a7a0\" (UID: \"c8eb5214-26b8-472b-be31-2604d1f0a7a0\") " Feb 02 10:59:37 crc kubenswrapper[4901]: I0202 10:59:37.219261 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8eb5214-26b8-472b-be31-2604d1f0a7a0-logs" (OuterVolumeSpecName: "logs") pod "c8eb5214-26b8-472b-be31-2604d1f0a7a0" (UID: "c8eb5214-26b8-472b-be31-2604d1f0a7a0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:59:37 crc kubenswrapper[4901]: I0202 10:59:37.220379 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8eb5214-26b8-472b-be31-2604d1f0a7a0-public-tls-certs\") pod \"c8eb5214-26b8-472b-be31-2604d1f0a7a0\" (UID: \"c8eb5214-26b8-472b-be31-2604d1f0a7a0\") " Feb 02 10:59:37 crc kubenswrapper[4901]: I0202 10:59:37.220474 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8eb5214-26b8-472b-be31-2604d1f0a7a0-combined-ca-bundle\") pod \"c8eb5214-26b8-472b-be31-2604d1f0a7a0\" (UID: \"c8eb5214-26b8-472b-be31-2604d1f0a7a0\") " Feb 02 10:59:37 crc kubenswrapper[4901]: I0202 10:59:37.221953 4901 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8eb5214-26b8-472b-be31-2604d1f0a7a0-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:59:37 crc kubenswrapper[4901]: I0202 10:59:37.227405 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8eb5214-26b8-472b-be31-2604d1f0a7a0-kube-api-access-bpxxv" (OuterVolumeSpecName: "kube-api-access-bpxxv") pod "c8eb5214-26b8-472b-be31-2604d1f0a7a0" (UID: "c8eb5214-26b8-472b-be31-2604d1f0a7a0"). InnerVolumeSpecName "kube-api-access-bpxxv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:59:37 crc kubenswrapper[4901]: I0202 10:59:37.249391 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8eb5214-26b8-472b-be31-2604d1f0a7a0-config-data" (OuterVolumeSpecName: "config-data") pod "c8eb5214-26b8-472b-be31-2604d1f0a7a0" (UID: "c8eb5214-26b8-472b-be31-2604d1f0a7a0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:59:37 crc kubenswrapper[4901]: I0202 10:59:37.274223 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8eb5214-26b8-472b-be31-2604d1f0a7a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8eb5214-26b8-472b-be31-2604d1f0a7a0" (UID: "c8eb5214-26b8-472b-be31-2604d1f0a7a0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:59:37 crc kubenswrapper[4901]: I0202 10:59:37.286776 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8eb5214-26b8-472b-be31-2604d1f0a7a0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c8eb5214-26b8-472b-be31-2604d1f0a7a0" (UID: "c8eb5214-26b8-472b-be31-2604d1f0a7a0"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:59:37 crc kubenswrapper[4901]: I0202 10:59:37.293620 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8eb5214-26b8-472b-be31-2604d1f0a7a0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c8eb5214-26b8-472b-be31-2604d1f0a7a0" (UID: "c8eb5214-26b8-472b-be31-2604d1f0a7a0"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:59:37 crc kubenswrapper[4901]: I0202 10:59:37.324111 4901 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8eb5214-26b8-472b-be31-2604d1f0a7a0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:59:37 crc kubenswrapper[4901]: I0202 10:59:37.324158 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8eb5214-26b8-472b-be31-2604d1f0a7a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:59:37 crc kubenswrapper[4901]: I0202 10:59:37.324172 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpxxv\" (UniqueName: \"kubernetes.io/projected/c8eb5214-26b8-472b-be31-2604d1f0a7a0-kube-api-access-bpxxv\") on node \"crc\" DevicePath \"\"" Feb 02 10:59:37 crc kubenswrapper[4901]: I0202 10:59:37.324187 4901 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8eb5214-26b8-472b-be31-2604d1f0a7a0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:59:37 crc kubenswrapper[4901]: I0202 10:59:37.324200 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8eb5214-26b8-472b-be31-2604d1f0a7a0-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:59:37 crc kubenswrapper[4901]: I0202 10:59:37.838430 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:59:37 crc 
kubenswrapper[4901]: I0202 10:59:37.838627 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:59:37 crc kubenswrapper[4901]: I0202 10:59:37.838743 4901 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" Feb 02 10:59:37 crc kubenswrapper[4901]: I0202 10:59:37.840286 4901 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f91b0b29e1b9fe682b552667293a0ed2a00f9d11128e3bc32e71c40f28a9f231"} pod="openshift-machine-config-operator/machine-config-daemon-f29d8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 10:59:37 crc kubenswrapper[4901]: I0202 10:59:37.840389 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" containerID="cri-o://f91b0b29e1b9fe682b552667293a0ed2a00f9d11128e3bc32e71c40f28a9f231" gracePeriod=600 Feb 02 10:59:38 crc kubenswrapper[4901]: I0202 10:59:38.010932 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c8eb5214-26b8-472b-be31-2604d1f0a7a0","Type":"ContainerDied","Data":"5f74f7348c9557ddbbafe8f8460e4c009ce19e99dc861821124f910dbb04582c"} Feb 02 10:59:38 crc kubenswrapper[4901]: I0202 10:59:38.011033 4901 util.go:48] "No ready sandbox for pod can be found. 
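The prober.go entries above show a plain HTTP GET against the machine-config-daemon health endpoint; "connection refused" counts as a failure, and once the failure threshold is crossed the kubelet kills the container with its configured grace period (600s here) so it can be restarted. An illustrative stdlib check in the same spirit, not the kubelet's actual prober:

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

// probeOnce is a sketch of one HTTP liveness attempt; name and timeout
// are mine. The kubelet treats 2xx/3xx responses as success.
func probeOnce(url string) error {
	client := &http.Client{Timeout: 1 * time.Second}
	resp, err := client.Get(url)
	if err != nil {
		return err // e.g. "connect: connection refused", as logged above
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		return fmt.Errorf("unexpected status %d", resp.StatusCode)
	}
	return nil
}

func main() {
	if err := probeOnce("http://127.0.0.1:8798/health"); err != nil {
		// After enough consecutive failures the container is killed with
		// its grace period and restarted, as in the entries above.
		fmt.Println("Liveness probe failed:", err)
	}
}
```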
Need to start a new one" pod="openstack/nova-api-0" Feb 02 10:59:38 crc kubenswrapper[4901]: I0202 10:59:38.011124 4901 scope.go:117] "RemoveContainer" containerID="31d6ad075c89d67ba4472968a89231bf702b9b96399061ae9361d92e09272df9" Feb 02 10:59:38 crc kubenswrapper[4901]: I0202 10:59:38.017020 4901 generic.go:334] "Generic (PLEG): container finished" podID="756c113d-5d5e-424e-bdf5-494b7774def6" containerID="f91b0b29e1b9fe682b552667293a0ed2a00f9d11128e3bc32e71c40f28a9f231" exitCode=0 Feb 02 10:59:38 crc kubenswrapper[4901]: I0202 10:59:38.017095 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" event={"ID":"756c113d-5d5e-424e-bdf5-494b7774def6","Type":"ContainerDied","Data":"f91b0b29e1b9fe682b552667293a0ed2a00f9d11128e3bc32e71c40f28a9f231"} Feb 02 10:59:38 crc kubenswrapper[4901]: I0202 10:59:38.074785 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:59:38 crc kubenswrapper[4901]: I0202 10:59:38.090658 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:59:38 crc kubenswrapper[4901]: I0202 10:59:38.093986 4901 scope.go:117] "RemoveContainer" containerID="6a312e1ca26e3585548a8702c20996b52c437d7f455f3cd2e8431b7e2c21ebb9" Feb 02 10:59:38 crc kubenswrapper[4901]: I0202 10:59:38.123745 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 02 10:59:38 crc kubenswrapper[4901]: E0202 10:59:38.124758 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8eb5214-26b8-472b-be31-2604d1f0a7a0" containerName="nova-api-api" Feb 02 10:59:38 crc kubenswrapper[4901]: I0202 10:59:38.124785 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8eb5214-26b8-472b-be31-2604d1f0a7a0" containerName="nova-api-api" Feb 02 10:59:38 crc kubenswrapper[4901]: E0202 10:59:38.124826 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8eb5214-26b8-472b-be31-2604d1f0a7a0" containerName="nova-api-log" Feb 02 10:59:38 crc kubenswrapper[4901]: I0202 10:59:38.124837 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8eb5214-26b8-472b-be31-2604d1f0a7a0" containerName="nova-api-log" Feb 02 10:59:38 crc kubenswrapper[4901]: I0202 10:59:38.125038 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8eb5214-26b8-472b-be31-2604d1f0a7a0" containerName="nova-api-api" Feb 02 10:59:38 crc kubenswrapper[4901]: I0202 10:59:38.125068 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8eb5214-26b8-472b-be31-2604d1f0a7a0" containerName="nova-api-log" Feb 02 10:59:38 crc kubenswrapper[4901]: I0202 10:59:38.126313 4901 util.go:30] "No sandbox for pod can be found. 
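When nova-api-0 is deleted and immediately re-created, it comes back under a new UID, so on the SyncLoop ADD the cpu_manager and memory_manager first drop any state still keyed by the old UID (the RemoveStaleState entries above). A toy sketch of that cleanup; the key struct and map are stand-ins, not kubelet state types:

```go
package main

import "fmt"

// Assignments are keyed by (podUID, containerName), as the log suggests.
type key struct{ podUID, container string }

func main() {
	assignments := map[key]string{
		{"c8eb5214-26b8-472b-be31-2604d1f0a7a0", "nova-api-api"}: "cpuset-0-3",
		{"c8eb5214-26b8-472b-be31-2604d1f0a7a0", "nova-api-log"}: "cpuset-0-3",
	}

	// The old pod UID no longer exists on the node, so every assignment
	// keyed by it is deleted before the replacement pod (new UID
	// 93285f41-...) is admitted.
	stale := "c8eb5214-26b8-472b-be31-2604d1f0a7a0"
	for k := range assignments {
		if k.podUID == stale {
			fmt.Printf("Deleted CPUSet assignment podUID=%s container=%s\n",
				k.podUID, k.container)
			delete(assignments, k) // safe during range in Go
		}
	}
}
```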
Need to start a new one" pod="openstack/nova-api-0" Feb 02 10:59:38 crc kubenswrapper[4901]: I0202 10:59:38.130194 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 02 10:59:38 crc kubenswrapper[4901]: I0202 10:59:38.130452 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 02 10:59:38 crc kubenswrapper[4901]: I0202 10:59:38.130965 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 02 10:59:38 crc kubenswrapper[4901]: I0202 10:59:38.153816 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:59:38 crc kubenswrapper[4901]: I0202 10:59:38.175384 4901 scope.go:117] "RemoveContainer" containerID="0796571e8bb156471bd27d2be38cbccd677937199c84fe164a09e32f6b2adf50" Feb 02 10:59:38 crc kubenswrapper[4901]: I0202 10:59:38.244985 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93285f41-662a-47d2-a011-676f740a2914-logs\") pod \"nova-api-0\" (UID: \"93285f41-662a-47d2-a011-676f740a2914\") " pod="openstack/nova-api-0" Feb 02 10:59:38 crc kubenswrapper[4901]: I0202 10:59:38.245141 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcgkn\" (UniqueName: \"kubernetes.io/projected/93285f41-662a-47d2-a011-676f740a2914-kube-api-access-wcgkn\") pod \"nova-api-0\" (UID: \"93285f41-662a-47d2-a011-676f740a2914\") " pod="openstack/nova-api-0" Feb 02 10:59:38 crc kubenswrapper[4901]: I0202 10:59:38.245197 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93285f41-662a-47d2-a011-676f740a2914-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"93285f41-662a-47d2-a011-676f740a2914\") " pod="openstack/nova-api-0" Feb 02 10:59:38 crc kubenswrapper[4901]: I0202 10:59:38.245265 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93285f41-662a-47d2-a011-676f740a2914-config-data\") pod \"nova-api-0\" (UID: \"93285f41-662a-47d2-a011-676f740a2914\") " pod="openstack/nova-api-0" Feb 02 10:59:38 crc kubenswrapper[4901]: I0202 10:59:38.245386 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/93285f41-662a-47d2-a011-676f740a2914-internal-tls-certs\") pod \"nova-api-0\" (UID: \"93285f41-662a-47d2-a011-676f740a2914\") " pod="openstack/nova-api-0" Feb 02 10:59:38 crc kubenswrapper[4901]: I0202 10:59:38.245588 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93285f41-662a-47d2-a011-676f740a2914-public-tls-certs\") pod \"nova-api-0\" (UID: \"93285f41-662a-47d2-a011-676f740a2914\") " pod="openstack/nova-api-0" Feb 02 10:59:38 crc kubenswrapper[4901]: I0202 10:59:38.327802 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 02 10:59:38 crc kubenswrapper[4901]: I0202 10:59:38.347529 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93285f41-662a-47d2-a011-676f740a2914-logs\") pod \"nova-api-0\" (UID: 
\"93285f41-662a-47d2-a011-676f740a2914\") " pod="openstack/nova-api-0" Feb 02 10:59:38 crc kubenswrapper[4901]: I0202 10:59:38.347608 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcgkn\" (UniqueName: \"kubernetes.io/projected/93285f41-662a-47d2-a011-676f740a2914-kube-api-access-wcgkn\") pod \"nova-api-0\" (UID: \"93285f41-662a-47d2-a011-676f740a2914\") " pod="openstack/nova-api-0" Feb 02 10:59:38 crc kubenswrapper[4901]: I0202 10:59:38.347626 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93285f41-662a-47d2-a011-676f740a2914-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"93285f41-662a-47d2-a011-676f740a2914\") " pod="openstack/nova-api-0" Feb 02 10:59:38 crc kubenswrapper[4901]: I0202 10:59:38.347649 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93285f41-662a-47d2-a011-676f740a2914-config-data\") pod \"nova-api-0\" (UID: \"93285f41-662a-47d2-a011-676f740a2914\") " pod="openstack/nova-api-0" Feb 02 10:59:38 crc kubenswrapper[4901]: I0202 10:59:38.347677 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/93285f41-662a-47d2-a011-676f740a2914-internal-tls-certs\") pod \"nova-api-0\" (UID: \"93285f41-662a-47d2-a011-676f740a2914\") " pod="openstack/nova-api-0" Feb 02 10:59:38 crc kubenswrapper[4901]: I0202 10:59:38.347755 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93285f41-662a-47d2-a011-676f740a2914-public-tls-certs\") pod \"nova-api-0\" (UID: \"93285f41-662a-47d2-a011-676f740a2914\") " pod="openstack/nova-api-0" Feb 02 10:59:38 crc kubenswrapper[4901]: I0202 10:59:38.348076 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93285f41-662a-47d2-a011-676f740a2914-logs\") pod \"nova-api-0\" (UID: \"93285f41-662a-47d2-a011-676f740a2914\") " pod="openstack/nova-api-0" Feb 02 10:59:38 crc kubenswrapper[4901]: I0202 10:59:38.353276 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/93285f41-662a-47d2-a011-676f740a2914-internal-tls-certs\") pod \"nova-api-0\" (UID: \"93285f41-662a-47d2-a011-676f740a2914\") " pod="openstack/nova-api-0" Feb 02 10:59:38 crc kubenswrapper[4901]: I0202 10:59:38.353716 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93285f41-662a-47d2-a011-676f740a2914-public-tls-certs\") pod \"nova-api-0\" (UID: \"93285f41-662a-47d2-a011-676f740a2914\") " pod="openstack/nova-api-0" Feb 02 10:59:38 crc kubenswrapper[4901]: I0202 10:59:38.370937 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93285f41-662a-47d2-a011-676f740a2914-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"93285f41-662a-47d2-a011-676f740a2914\") " pod="openstack/nova-api-0" Feb 02 10:59:38 crc kubenswrapper[4901]: I0202 10:59:38.371323 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93285f41-662a-47d2-a011-676f740a2914-config-data\") pod \"nova-api-0\" (UID: \"93285f41-662a-47d2-a011-676f740a2914\") " 
pod="openstack/nova-api-0" Feb 02 10:59:38 crc kubenswrapper[4901]: I0202 10:59:38.382437 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcgkn\" (UniqueName: \"kubernetes.io/projected/93285f41-662a-47d2-a011-676f740a2914-kube-api-access-wcgkn\") pod \"nova-api-0\" (UID: \"93285f41-662a-47d2-a011-676f740a2914\") " pod="openstack/nova-api-0" Feb 02 10:59:38 crc kubenswrapper[4901]: I0202 10:59:38.497259 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 02 10:59:38 crc kubenswrapper[4901]: I0202 10:59:38.963128 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:59:38 crc kubenswrapper[4901]: W0202 10:59:38.967642 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93285f41_662a_47d2_a011_676f740a2914.slice/crio-747e14da12f35ebeb78d068474666930b09801820907a5643d42cb521104e790 WatchSource:0}: Error finding container 747e14da12f35ebeb78d068474666930b09801820907a5643d42cb521104e790: Status 404 returned error can't find the container with id 747e14da12f35ebeb78d068474666930b09801820907a5643d42cb521104e790 Feb 02 10:59:39 crc kubenswrapper[4901]: I0202 10:59:39.029641 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"93285f41-662a-47d2-a011-676f740a2914","Type":"ContainerStarted","Data":"747e14da12f35ebeb78d068474666930b09801820907a5643d42cb521104e790"} Feb 02 10:59:39 crc kubenswrapper[4901]: I0202 10:59:39.032437 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" event={"ID":"756c113d-5d5e-424e-bdf5-494b7774def6","Type":"ContainerStarted","Data":"d12b3f34b51be08e2afea2c530360f371fc81fc3d568189c03f49b623affb0ff"} Feb 02 10:59:39 crc kubenswrapper[4901]: I0202 10:59:39.689237 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8eb5214-26b8-472b-be31-2604d1f0a7a0" path="/var/lib/kubelet/pods/c8eb5214-26b8-472b-be31-2604d1f0a7a0/volumes" Feb 02 10:59:40 crc kubenswrapper[4901]: I0202 10:59:40.055789 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"93285f41-662a-47d2-a011-676f740a2914","Type":"ContainerStarted","Data":"db35e94d97277014b71a9ad1ea39b1677c3f8291e38e8e323fd861a4266acd66"} Feb 02 10:59:40 crc kubenswrapper[4901]: I0202 10:59:40.056288 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"93285f41-662a-47d2-a011-676f740a2914","Type":"ContainerStarted","Data":"bebe7e35ed4ecdc6364dfc7011920c311b1a72d11f68bec5065f42b0b909a390"} Feb 02 10:59:40 crc kubenswrapper[4901]: I0202 10:59:40.089861 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.089822388 podStartE2EDuration="2.089822388s" podCreationTimestamp="2026-02-02 10:59:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:59:40.078824908 +0000 UTC m=+1267.097165014" watchObservedRunningTime="2026-02-02 10:59:40.089822388 +0000 UTC m=+1267.108162514" Feb 02 10:59:40 crc kubenswrapper[4901]: I0202 10:59:40.762172 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 02 10:59:40 crc kubenswrapper[4901]: I0202 10:59:40.762235 4901 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 02 10:59:43 crc kubenswrapper[4901]: I0202 10:59:43.328008 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 02 10:59:43 crc kubenswrapper[4901]: I0202 10:59:43.363164 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 02 10:59:44 crc kubenswrapper[4901]: I0202 10:59:44.149028 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 02 10:59:45 crc kubenswrapper[4901]: I0202 10:59:45.762710 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 02 10:59:45 crc kubenswrapper[4901]: I0202 10:59:45.763075 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 02 10:59:46 crc kubenswrapper[4901]: I0202 10:59:46.781092 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4135f49b-1390-4600-8855-c9311c0cdf11" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.217:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 02 10:59:46 crc kubenswrapper[4901]: I0202 10:59:46.781699 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4135f49b-1390-4600-8855-c9311c0cdf11" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.217:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 02 10:59:48 crc kubenswrapper[4901]: I0202 10:59:48.497907 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 02 10:59:48 crc kubenswrapper[4901]: I0202 10:59:48.498535 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 02 10:59:49 crc kubenswrapper[4901]: I0202 10:59:49.515868 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="93285f41-662a-47d2-a011-676f740a2914" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.218:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 02 10:59:49 crc kubenswrapper[4901]: I0202 10:59:49.515943 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="93285f41-662a-47d2-a011-676f740a2914" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.218:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 02 10:59:50 crc kubenswrapper[4901]: I0202 10:59:50.164815 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 02 10:59:55 crc kubenswrapper[4901]: I0202 10:59:55.773337 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 02 10:59:55 crc kubenswrapper[4901]: I0202 10:59:55.777690 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 02 10:59:55 crc kubenswrapper[4901]: I0202 10:59:55.786406 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 02 10:59:56 crc kubenswrapper[4901]: I0202 10:59:56.256110 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-metadata-0" Feb 02 10:59:58 crc kubenswrapper[4901]: I0202 10:59:58.510825 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 02 10:59:58 crc kubenswrapper[4901]: I0202 10:59:58.512298 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 02 10:59:58 crc kubenswrapper[4901]: I0202 10:59:58.514243 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 02 10:59:58 crc kubenswrapper[4901]: I0202 10:59:58.524225 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 02 10:59:59 crc kubenswrapper[4901]: I0202 10:59:59.280212 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 02 10:59:59 crc kubenswrapper[4901]: I0202 10:59:59.286559 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 02 11:00:00 crc kubenswrapper[4901]: I0202 11:00:00.176473 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500500-7xwql"] Feb 02 11:00:00 crc kubenswrapper[4901]: I0202 11:00:00.178730 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-7xwql" Feb 02 11:00:00 crc kubenswrapper[4901]: I0202 11:00:00.182674 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 02 11:00:00 crc kubenswrapper[4901]: I0202 11:00:00.183076 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 02 11:00:00 crc kubenswrapper[4901]: I0202 11:00:00.196894 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500500-7xwql"] Feb 02 11:00:00 crc kubenswrapper[4901]: I0202 11:00:00.267213 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3db2272-ebdc-4bd3-8652-d0d46be00772-config-volume\") pod \"collect-profiles-29500500-7xwql\" (UID: \"e3db2272-ebdc-4bd3-8652-d0d46be00772\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-7xwql" Feb 02 11:00:00 crc kubenswrapper[4901]: I0202 11:00:00.267618 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e3db2272-ebdc-4bd3-8652-d0d46be00772-secret-volume\") pod \"collect-profiles-29500500-7xwql\" (UID: \"e3db2272-ebdc-4bd3-8652-d0d46be00772\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-7xwql" Feb 02 11:00:00 crc kubenswrapper[4901]: I0202 11:00:00.267775 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxq7c\" (UniqueName: \"kubernetes.io/projected/e3db2272-ebdc-4bd3-8652-d0d46be00772-kube-api-access-lxq7c\") pod \"collect-profiles-29500500-7xwql\" (UID: \"e3db2272-ebdc-4bd3-8652-d0d46be00772\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-7xwql" Feb 02 11:00:00 crc kubenswrapper[4901]: I0202 11:00:00.369971 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/e3db2272-ebdc-4bd3-8652-d0d46be00772-config-volume\") pod \"collect-profiles-29500500-7xwql\" (UID: \"e3db2272-ebdc-4bd3-8652-d0d46be00772\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-7xwql" Feb 02 11:00:00 crc kubenswrapper[4901]: I0202 11:00:00.370083 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e3db2272-ebdc-4bd3-8652-d0d46be00772-secret-volume\") pod \"collect-profiles-29500500-7xwql\" (UID: \"e3db2272-ebdc-4bd3-8652-d0d46be00772\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-7xwql" Feb 02 11:00:00 crc kubenswrapper[4901]: I0202 11:00:00.370128 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxq7c\" (UniqueName: \"kubernetes.io/projected/e3db2272-ebdc-4bd3-8652-d0d46be00772-kube-api-access-lxq7c\") pod \"collect-profiles-29500500-7xwql\" (UID: \"e3db2272-ebdc-4bd3-8652-d0d46be00772\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-7xwql" Feb 02 11:00:00 crc kubenswrapper[4901]: I0202 11:00:00.371365 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3db2272-ebdc-4bd3-8652-d0d46be00772-config-volume\") pod \"collect-profiles-29500500-7xwql\" (UID: \"e3db2272-ebdc-4bd3-8652-d0d46be00772\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-7xwql" Feb 02 11:00:00 crc kubenswrapper[4901]: I0202 11:00:00.376654 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e3db2272-ebdc-4bd3-8652-d0d46be00772-secret-volume\") pod \"collect-profiles-29500500-7xwql\" (UID: \"e3db2272-ebdc-4bd3-8652-d0d46be00772\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-7xwql" Feb 02 11:00:00 crc kubenswrapper[4901]: I0202 11:00:00.391495 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxq7c\" (UniqueName: \"kubernetes.io/projected/e3db2272-ebdc-4bd3-8652-d0d46be00772-kube-api-access-lxq7c\") pod \"collect-profiles-29500500-7xwql\" (UID: \"e3db2272-ebdc-4bd3-8652-d0d46be00772\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-7xwql" Feb 02 11:00:00 crc kubenswrapper[4901]: I0202 11:00:00.516035 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-7xwql" Feb 02 11:00:00 crc kubenswrapper[4901]: I0202 11:00:00.986278 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500500-7xwql"] Feb 02 11:00:01 crc kubenswrapper[4901]: I0202 11:00:01.299856 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-7xwql" event={"ID":"e3db2272-ebdc-4bd3-8652-d0d46be00772","Type":"ContainerStarted","Data":"ac6053f444c70e8772050002e1dc959128fa29aa137594e0ae65c27dbdbc577a"} Feb 02 11:00:01 crc kubenswrapper[4901]: I0202 11:00:01.300193 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-7xwql" event={"ID":"e3db2272-ebdc-4bd3-8652-d0d46be00772","Type":"ContainerStarted","Data":"c61ccfc33f92514b64e5afe856532dcc5571d784a4ddcd5738acbbca08dec1c1"} Feb 02 11:00:01 crc kubenswrapper[4901]: I0202 11:00:01.324694 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-7xwql" podStartSLOduration=1.3246701619999999 podStartE2EDuration="1.324670162s" podCreationTimestamp="2026-02-02 11:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 11:00:01.317545497 +0000 UTC m=+1288.335885593" watchObservedRunningTime="2026-02-02 11:00:01.324670162 +0000 UTC m=+1288.343010258" Feb 02 11:00:02 crc kubenswrapper[4901]: I0202 11:00:02.329005 4901 generic.go:334] "Generic (PLEG): container finished" podID="e3db2272-ebdc-4bd3-8652-d0d46be00772" containerID="ac6053f444c70e8772050002e1dc959128fa29aa137594e0ae65c27dbdbc577a" exitCode=0 Feb 02 11:00:02 crc kubenswrapper[4901]: I0202 11:00:02.329062 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-7xwql" event={"ID":"e3db2272-ebdc-4bd3-8652-d0d46be00772","Type":"ContainerDied","Data":"ac6053f444c70e8772050002e1dc959128fa29aa137594e0ae65c27dbdbc577a"} Feb 02 11:00:03 crc kubenswrapper[4901]: I0202 11:00:03.702681 4901 util.go:48] "No ready sandbox for pod can be found. 
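The m=+1288.335885593 suffixes in the timestamps above are Go's monotonic clock reading: seconds elapsed since the kubelet process captured its start time. Durations computed between two such values use the monotonic part, so they are immune to wall-clock jumps. The same mechanism in miniature:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// time.Now() carries a monotonic reading; printing it shows the same
	// "m=+..." suffix seen in the kubelet log, e.g. "... UTC m=+0.000001".
	start := time.Now()
	fmt.Println(start)

	time.Sleep(10 * time.Millisecond)
	// time.Since subtracts the monotonic readings, not the wall clocks.
	fmt.Println(time.Since(start))
}
```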
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-7xwql" Feb 02 11:00:03 crc kubenswrapper[4901]: I0202 11:00:03.739158 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxq7c\" (UniqueName: \"kubernetes.io/projected/e3db2272-ebdc-4bd3-8652-d0d46be00772-kube-api-access-lxq7c\") pod \"e3db2272-ebdc-4bd3-8652-d0d46be00772\" (UID: \"e3db2272-ebdc-4bd3-8652-d0d46be00772\") " Feb 02 11:00:03 crc kubenswrapper[4901]: I0202 11:00:03.739431 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e3db2272-ebdc-4bd3-8652-d0d46be00772-secret-volume\") pod \"e3db2272-ebdc-4bd3-8652-d0d46be00772\" (UID: \"e3db2272-ebdc-4bd3-8652-d0d46be00772\") " Feb 02 11:00:03 crc kubenswrapper[4901]: I0202 11:00:03.739503 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3db2272-ebdc-4bd3-8652-d0d46be00772-config-volume\") pod \"e3db2272-ebdc-4bd3-8652-d0d46be00772\" (UID: \"e3db2272-ebdc-4bd3-8652-d0d46be00772\") " Feb 02 11:00:03 crc kubenswrapper[4901]: I0202 11:00:03.740086 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3db2272-ebdc-4bd3-8652-d0d46be00772-config-volume" (OuterVolumeSpecName: "config-volume") pod "e3db2272-ebdc-4bd3-8652-d0d46be00772" (UID: "e3db2272-ebdc-4bd3-8652-d0d46be00772"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:00:03 crc kubenswrapper[4901]: I0202 11:00:03.744546 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3db2272-ebdc-4bd3-8652-d0d46be00772-kube-api-access-lxq7c" (OuterVolumeSpecName: "kube-api-access-lxq7c") pod "e3db2272-ebdc-4bd3-8652-d0d46be00772" (UID: "e3db2272-ebdc-4bd3-8652-d0d46be00772"). InnerVolumeSpecName "kube-api-access-lxq7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:00:03 crc kubenswrapper[4901]: I0202 11:00:03.744546 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3db2272-ebdc-4bd3-8652-d0d46be00772-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e3db2272-ebdc-4bd3-8652-d0d46be00772" (UID: "e3db2272-ebdc-4bd3-8652-d0d46be00772"). InnerVolumeSpecName "secret-volume". 
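The collect-profiles container finished with exitCode=0 (the ContainerDied event above), which is what lets this CronJob-spawned pod complete successfully and have its volumes torn down rather than be restarted. The same exit-status observation in miniature, assuming a POSIX `true` binary on PATH:

```go
package main

import (
	"fmt"
	"os/exec"
)

func main() {
	cmd := exec.Command("true") // any command that exits 0
	if err := cmd.Run(); err != nil {
		fmt.Println("run failed:", err)
		return
	}
	// Exit code 0 is the success signal; for a Job pod it means Succeeded.
	fmt.Println("exit code:", cmd.ProcessState.ExitCode()) // 0
}
```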
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:00:03 crc kubenswrapper[4901]: I0202 11:00:03.841727 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxq7c\" (UniqueName: \"kubernetes.io/projected/e3db2272-ebdc-4bd3-8652-d0d46be00772-kube-api-access-lxq7c\") on node \"crc\" DevicePath \"\"" Feb 02 11:00:03 crc kubenswrapper[4901]: I0202 11:00:03.842011 4901 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e3db2272-ebdc-4bd3-8652-d0d46be00772-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 02 11:00:03 crc kubenswrapper[4901]: I0202 11:00:03.842022 4901 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3db2272-ebdc-4bd3-8652-d0d46be00772-config-volume\") on node \"crc\" DevicePath \"\"" Feb 02 11:00:04 crc kubenswrapper[4901]: I0202 11:00:04.352958 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-7xwql" event={"ID":"e3db2272-ebdc-4bd3-8652-d0d46be00772","Type":"ContainerDied","Data":"c61ccfc33f92514b64e5afe856532dcc5571d784a4ddcd5738acbbca08dec1c1"} Feb 02 11:00:04 crc kubenswrapper[4901]: I0202 11:00:04.353301 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c61ccfc33f92514b64e5afe856532dcc5571d784a4ddcd5738acbbca08dec1c1" Feb 02 11:00:04 crc kubenswrapper[4901]: I0202 11:00:04.353067 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-7xwql" Feb 02 11:00:07 crc kubenswrapper[4901]: I0202 11:00:07.444041 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 02 11:00:08 crc kubenswrapper[4901]: I0202 11:00:08.168928 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 02 11:00:11 crc kubenswrapper[4901]: I0202 11:00:11.668024 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba" containerName="rabbitmq" containerID="cri-o://2a467b9aafc4d75aab9dae67fe89328c83b39e187a996d2aba454a542a47ac2d" gracePeriod=604796 Feb 02 11:00:12 crc kubenswrapper[4901]: I0202 11:00:12.534272 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="942c6932-383e-432a-b927-ff9ec4ac81cb" containerName="rabbitmq" containerID="cri-o://dd84a1510ee1d58f8154205305c524591576ab844656c2e03acf69b375038a4f" gracePeriod=604796 Feb 02 11:00:14 crc kubenswrapper[4901]: I0202 11:00:14.589190 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.96:5671: connect: connection refused" Feb 02 11:00:14 crc kubenswrapper[4901]: I0202 11:00:14.986712 4901 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="942c6932-383e-432a-b927-ff9ec4ac81cb" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.97:5671: connect: connection refused" Feb 02 11:00:18 crc kubenswrapper[4901]: E0202 11:00:18.024765 4901 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24aaea2c_a1d1_41ff_a7bc_a1ebfabf06ba.slice/crio-conmon-2a467b9aafc4d75aab9dae67fe89328c83b39e187a996d2aba454a542a47ac2d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24aaea2c_a1d1_41ff_a7bc_a1ebfabf06ba.slice/crio-2a467b9aafc4d75aab9dae67fe89328c83b39e187a996d2aba454a542a47ac2d.scope\": RecentStats: unable to find data in memory cache]" Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.320878 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.432771 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-rabbitmq-plugins\") pod \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.433057 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-server-conf\") pod \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.433108 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-pod-info\") pod \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.433146 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.433207 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-rabbitmq-tls\") pod \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.433229 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-plugins-conf\") pod \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.433298 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-rabbitmq-confd\") pod \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.433333 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-rabbitmq-erlang-cookie\") pod \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.433405 4901 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-zhr9n\" (UniqueName: \"kubernetes.io/projected/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-kube-api-access-zhr9n\") pod \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.433424 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-config-data\") pod \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.433538 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-erlang-cookie-secret\") pod \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\" (UID: \"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba\") " Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.433939 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba" (UID: "24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.434025 4901 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.434550 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba" (UID: "24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.434975 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba" (UID: "24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.440108 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba" (UID: "24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.443321 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-kube-api-access-zhr9n" (OuterVolumeSpecName: "kube-api-access-zhr9n") pod "24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba" (UID: "24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba"). InnerVolumeSpecName "kube-api-access-zhr9n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.464686 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba" (UID: "24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.469069 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-pod-info" (OuterVolumeSpecName: "pod-info") pod "24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba" (UID: "24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.470845 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba" (UID: "24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.490029 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-config-data" (OuterVolumeSpecName: "config-data") pod "24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba" (UID: "24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.525525 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-server-conf" (OuterVolumeSpecName: "server-conf") pod "24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba" (UID: "24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.538158 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhr9n\" (UniqueName: \"kubernetes.io/projected/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-kube-api-access-zhr9n\") on node \"crc\" DevicePath \"\"" Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.538194 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.538213 4901 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.538222 4901 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-server-conf\") on node \"crc\" DevicePath \"\"" Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.538230 4901 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-pod-info\") on node \"crc\" DevicePath \"\"" Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.538261 4901 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.538270 4901 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.538279 4901 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.538293 4901 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.575796 4901 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.577721 4901 generic.go:334] "Generic (PLEG): container finished" podID="24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba" containerID="2a467b9aafc4d75aab9dae67fe89328c83b39e187a996d2aba454a542a47ac2d" exitCode=0 Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.577876 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.577937 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba","Type":"ContainerDied","Data":"2a467b9aafc4d75aab9dae67fe89328c83b39e187a996d2aba454a542a47ac2d"} Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.577964 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba","Type":"ContainerDied","Data":"4f4405fae4718e8fe058b899df307f15ba42dffc224ee3c84fbfec9dc833150f"} Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.577981 4901 scope.go:117] "RemoveContainer" containerID="2a467b9aafc4d75aab9dae67fe89328c83b39e187a996d2aba454a542a47ac2d" Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.603658 4901 scope.go:117] "RemoveContainer" containerID="a9e601667ecf3eb63fc82c38dd3226c81472105dfa7af588e3ddfba7431490ab" Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.634259 4901 scope.go:117] "RemoveContainer" containerID="2a467b9aafc4d75aab9dae67fe89328c83b39e187a996d2aba454a542a47ac2d" Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.634712 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba" (UID: "24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:00:18 crc kubenswrapper[4901]: E0202 11:00:18.634736 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a467b9aafc4d75aab9dae67fe89328c83b39e187a996d2aba454a542a47ac2d\": container with ID starting with 2a467b9aafc4d75aab9dae67fe89328c83b39e187a996d2aba454a542a47ac2d not found: ID does not exist" containerID="2a467b9aafc4d75aab9dae67fe89328c83b39e187a996d2aba454a542a47ac2d" Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.634773 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a467b9aafc4d75aab9dae67fe89328c83b39e187a996d2aba454a542a47ac2d"} err="failed to get container status \"2a467b9aafc4d75aab9dae67fe89328c83b39e187a996d2aba454a542a47ac2d\": rpc error: code = NotFound desc = could not find container \"2a467b9aafc4d75aab9dae67fe89328c83b39e187a996d2aba454a542a47ac2d\": container with ID starting with 2a467b9aafc4d75aab9dae67fe89328c83b39e187a996d2aba454a542a47ac2d not found: ID does not exist" Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.634804 4901 scope.go:117] "RemoveContainer" containerID="a9e601667ecf3eb63fc82c38dd3226c81472105dfa7af588e3ddfba7431490ab" Feb 02 11:00:18 crc kubenswrapper[4901]: E0202 11:00:18.635269 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9e601667ecf3eb63fc82c38dd3226c81472105dfa7af588e3ddfba7431490ab\": container with ID starting with a9e601667ecf3eb63fc82c38dd3226c81472105dfa7af588e3ddfba7431490ab not found: ID does not exist" containerID="a9e601667ecf3eb63fc82c38dd3226c81472105dfa7af588e3ddfba7431490ab" Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.635288 4901 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a9e601667ecf3eb63fc82c38dd3226c81472105dfa7af588e3ddfba7431490ab"} err="failed to get container status \"a9e601667ecf3eb63fc82c38dd3226c81472105dfa7af588e3ddfba7431490ab\": rpc error: code = NotFound desc = could not find container \"a9e601667ecf3eb63fc82c38dd3226c81472105dfa7af588e3ddfba7431490ab\": container with ID starting with a9e601667ecf3eb63fc82c38dd3226c81472105dfa7af588e3ddfba7431490ab not found: ID does not exist" Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.640296 4901 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.640435 4901 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 02 11:00:18 crc kubenswrapper[4901]: I0202 11:00:18.984416 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.021025 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.043674 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 02 11:00:19 crc kubenswrapper[4901]: E0202 11:00:19.044485 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba" containerName="setup-container" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.044507 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba" containerName="setup-container" Feb 02 11:00:19 crc kubenswrapper[4901]: E0202 11:00:19.044528 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3db2272-ebdc-4bd3-8652-d0d46be00772" containerName="collect-profiles" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.044537 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3db2272-ebdc-4bd3-8652-d0d46be00772" containerName="collect-profiles" Feb 02 11:00:19 crc kubenswrapper[4901]: E0202 11:00:19.044610 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba" containerName="rabbitmq" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.044620 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba" containerName="rabbitmq" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.044864 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3db2272-ebdc-4bd3-8652-d0d46be00772" containerName="collect-profiles" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.044890 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba" containerName="rabbitmq" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.046302 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.050260 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.050350 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.050282 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.050572 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.050821 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.051065 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.051228 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-brdmq" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.076675 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.123438 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.157262 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"cacb5793-beb9-49f9-9438-9613ad472c15\") " pod="openstack/rabbitmq-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.157417 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cacb5793-beb9-49f9-9438-9613ad472c15-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cacb5793-beb9-49f9-9438-9613ad472c15\") " pod="openstack/rabbitmq-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.157541 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cacb5793-beb9-49f9-9438-9613ad472c15-config-data\") pod \"rabbitmq-server-0\" (UID: \"cacb5793-beb9-49f9-9438-9613ad472c15\") " pod="openstack/rabbitmq-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.157627 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cacb5793-beb9-49f9-9438-9613ad472c15-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cacb5793-beb9-49f9-9438-9613ad472c15\") " pod="openstack/rabbitmq-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.157784 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cacb5793-beb9-49f9-9438-9613ad472c15-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cacb5793-beb9-49f9-9438-9613ad472c15\") " pod="openstack/rabbitmq-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.157838 
4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cacb5793-beb9-49f9-9438-9613ad472c15-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cacb5793-beb9-49f9-9438-9613ad472c15\") " pod="openstack/rabbitmq-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.157881 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cacb5793-beb9-49f9-9438-9613ad472c15-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cacb5793-beb9-49f9-9438-9613ad472c15\") " pod="openstack/rabbitmq-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.157937 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cacb5793-beb9-49f9-9438-9613ad472c15-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cacb5793-beb9-49f9-9438-9613ad472c15\") " pod="openstack/rabbitmq-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.157972 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cacb5793-beb9-49f9-9438-9613ad472c15-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cacb5793-beb9-49f9-9438-9613ad472c15\") " pod="openstack/rabbitmq-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.158210 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cacb5793-beb9-49f9-9438-9613ad472c15-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cacb5793-beb9-49f9-9438-9613ad472c15\") " pod="openstack/rabbitmq-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.158531 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7bnq\" (UniqueName: \"kubernetes.io/projected/cacb5793-beb9-49f9-9438-9613ad472c15-kube-api-access-q7bnq\") pod \"rabbitmq-server-0\" (UID: \"cacb5793-beb9-49f9-9438-9613ad472c15\") " pod="openstack/rabbitmq-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.260075 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/942c6932-383e-432a-b927-ff9ec4ac81cb-rabbitmq-tls\") pod \"942c6932-383e-432a-b927-ff9ec4ac81cb\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") " Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.260252 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/942c6932-383e-432a-b927-ff9ec4ac81cb-pod-info\") pod \"942c6932-383e-432a-b927-ff9ec4ac81cb\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") " Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.260292 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/942c6932-383e-432a-b927-ff9ec4ac81cb-rabbitmq-plugins\") pod \"942c6932-383e-432a-b927-ff9ec4ac81cb\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") " Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.260457 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/942c6932-383e-432a-b927-ff9ec4ac81cb-config-data\") pod \"942c6932-383e-432a-b927-ff9ec4ac81cb\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") " Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.260502 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/942c6932-383e-432a-b927-ff9ec4ac81cb-rabbitmq-confd\") pod \"942c6932-383e-432a-b927-ff9ec4ac81cb\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") " Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.260607 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"942c6932-383e-432a-b927-ff9ec4ac81cb\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") " Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.260700 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/942c6932-383e-432a-b927-ff9ec4ac81cb-plugins-conf\") pod \"942c6932-383e-432a-b927-ff9ec4ac81cb\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") " Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.260737 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/942c6932-383e-432a-b927-ff9ec4ac81cb-erlang-cookie-secret\") pod \"942c6932-383e-432a-b927-ff9ec4ac81cb\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") " Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.260744 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/942c6932-383e-432a-b927-ff9ec4ac81cb-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "942c6932-383e-432a-b927-ff9ec4ac81cb" (UID: "942c6932-383e-432a-b927-ff9ec4ac81cb"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.260833 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/942c6932-383e-432a-b927-ff9ec4ac81cb-rabbitmq-erlang-cookie\") pod \"942c6932-383e-432a-b927-ff9ec4ac81cb\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") " Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.260865 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqnvw\" (UniqueName: \"kubernetes.io/projected/942c6932-383e-432a-b927-ff9ec4ac81cb-kube-api-access-mqnvw\") pod \"942c6932-383e-432a-b927-ff9ec4ac81cb\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") " Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.260960 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/942c6932-383e-432a-b927-ff9ec4ac81cb-server-conf\") pod \"942c6932-383e-432a-b927-ff9ec4ac81cb\" (UID: \"942c6932-383e-432a-b927-ff9ec4ac81cb\") " Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.261368 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cacb5793-beb9-49f9-9438-9613ad472c15-config-data\") pod \"rabbitmq-server-0\" (UID: \"cacb5793-beb9-49f9-9438-9613ad472c15\") " pod="openstack/rabbitmq-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.261421 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cacb5793-beb9-49f9-9438-9613ad472c15-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cacb5793-beb9-49f9-9438-9613ad472c15\") " pod="openstack/rabbitmq-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.261491 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cacb5793-beb9-49f9-9438-9613ad472c15-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cacb5793-beb9-49f9-9438-9613ad472c15\") " pod="openstack/rabbitmq-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.261526 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cacb5793-beb9-49f9-9438-9613ad472c15-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cacb5793-beb9-49f9-9438-9613ad472c15\") " pod="openstack/rabbitmq-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.261611 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cacb5793-beb9-49f9-9438-9613ad472c15-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cacb5793-beb9-49f9-9438-9613ad472c15\") " pod="openstack/rabbitmq-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.261659 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cacb5793-beb9-49f9-9438-9613ad472c15-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cacb5793-beb9-49f9-9438-9613ad472c15\") " pod="openstack/rabbitmq-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.261700 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/cacb5793-beb9-49f9-9438-9613ad472c15-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cacb5793-beb9-49f9-9438-9613ad472c15\") " pod="openstack/rabbitmq-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.261807 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cacb5793-beb9-49f9-9438-9613ad472c15-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cacb5793-beb9-49f9-9438-9613ad472c15\") " pod="openstack/rabbitmq-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.261880 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7bnq\" (UniqueName: \"kubernetes.io/projected/cacb5793-beb9-49f9-9438-9613ad472c15-kube-api-access-q7bnq\") pod \"rabbitmq-server-0\" (UID: \"cacb5793-beb9-49f9-9438-9613ad472c15\") " pod="openstack/rabbitmq-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.261975 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"cacb5793-beb9-49f9-9438-9613ad472c15\") " pod="openstack/rabbitmq-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.262007 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cacb5793-beb9-49f9-9438-9613ad472c15-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cacb5793-beb9-49f9-9438-9613ad472c15\") " pod="openstack/rabbitmq-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.262109 4901 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/942c6932-383e-432a-b927-ff9ec4ac81cb-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.263000 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cacb5793-beb9-49f9-9438-9613ad472c15-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cacb5793-beb9-49f9-9438-9613ad472c15\") " pod="openstack/rabbitmq-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.263067 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/942c6932-383e-432a-b927-ff9ec4ac81cb-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "942c6932-383e-432a-b927-ff9ec4ac81cb" (UID: "942c6932-383e-432a-b927-ff9ec4ac81cb"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.263171 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/942c6932-383e-432a-b927-ff9ec4ac81cb-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "942c6932-383e-432a-b927-ff9ec4ac81cb" (UID: "942c6932-383e-432a-b927-ff9ec4ac81cb"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.264024 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cacb5793-beb9-49f9-9438-9613ad472c15-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cacb5793-beb9-49f9-9438-9613ad472c15\") " pod="openstack/rabbitmq-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.264098 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cacb5793-beb9-49f9-9438-9613ad472c15-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cacb5793-beb9-49f9-9438-9613ad472c15\") " pod="openstack/rabbitmq-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.264882 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"cacb5793-beb9-49f9-9438-9613ad472c15\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.266213 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cacb5793-beb9-49f9-9438-9613ad472c15-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cacb5793-beb9-49f9-9438-9613ad472c15\") " pod="openstack/rabbitmq-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.268128 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cacb5793-beb9-49f9-9438-9613ad472c15-config-data\") pod \"rabbitmq-server-0\" (UID: \"cacb5793-beb9-49f9-9438-9613ad472c15\") " pod="openstack/rabbitmq-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.269280 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/942c6932-383e-432a-b927-ff9ec4ac81cb-kube-api-access-mqnvw" (OuterVolumeSpecName: "kube-api-access-mqnvw") pod "942c6932-383e-432a-b927-ff9ec4ac81cb" (UID: "942c6932-383e-432a-b927-ff9ec4ac81cb"). InnerVolumeSpecName "kube-api-access-mqnvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.269404 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "persistence") pod "942c6932-383e-432a-b927-ff9ec4ac81cb" (UID: "942c6932-383e-432a-b927-ff9ec4ac81cb"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.269490 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/942c6932-383e-432a-b927-ff9ec4ac81cb-pod-info" (OuterVolumeSpecName: "pod-info") pod "942c6932-383e-432a-b927-ff9ec4ac81cb" (UID: "942c6932-383e-432a-b927-ff9ec4ac81cb"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.269801 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/942c6932-383e-432a-b927-ff9ec4ac81cb-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "942c6932-383e-432a-b927-ff9ec4ac81cb" (UID: "942c6932-383e-432a-b927-ff9ec4ac81cb"). 
InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.269988 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cacb5793-beb9-49f9-9438-9613ad472c15-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cacb5793-beb9-49f9-9438-9613ad472c15\") " pod="openstack/rabbitmq-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.270436 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cacb5793-beb9-49f9-9438-9613ad472c15-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cacb5793-beb9-49f9-9438-9613ad472c15\") " pod="openstack/rabbitmq-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.283193 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/942c6932-383e-432a-b927-ff9ec4ac81cb-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "942c6932-383e-432a-b927-ff9ec4ac81cb" (UID: "942c6932-383e-432a-b927-ff9ec4ac81cb"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.283855 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cacb5793-beb9-49f9-9438-9613ad472c15-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cacb5793-beb9-49f9-9438-9613ad472c15\") " pod="openstack/rabbitmq-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.284756 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cacb5793-beb9-49f9-9438-9613ad472c15-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cacb5793-beb9-49f9-9438-9613ad472c15\") " pod="openstack/rabbitmq-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.302373 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7bnq\" (UniqueName: \"kubernetes.io/projected/cacb5793-beb9-49f9-9438-9613ad472c15-kube-api-access-q7bnq\") pod \"rabbitmq-server-0\" (UID: \"cacb5793-beb9-49f9-9438-9613ad472c15\") " pod="openstack/rabbitmq-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.320812 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/942c6932-383e-432a-b927-ff9ec4ac81cb-config-data" (OuterVolumeSpecName: "config-data") pod "942c6932-383e-432a-b927-ff9ec4ac81cb" (UID: "942c6932-383e-432a-b927-ff9ec4ac81cb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.362685 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"cacb5793-beb9-49f9-9438-9613ad472c15\") " pod="openstack/rabbitmq-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.364443 4901 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/942c6932-383e-432a-b927-ff9ec4ac81cb-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.364481 4901 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/942c6932-383e-432a-b927-ff9ec4ac81cb-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.364492 4901 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/942c6932-383e-432a-b927-ff9ec4ac81cb-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.364503 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqnvw\" (UniqueName: \"kubernetes.io/projected/942c6932-383e-432a-b927-ff9ec4ac81cb-kube-api-access-mqnvw\") on node \"crc\" DevicePath \"\"" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.364514 4901 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/942c6932-383e-432a-b927-ff9ec4ac81cb-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.364523 4901 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/942c6932-383e-432a-b927-ff9ec4ac81cb-pod-info\") on node \"crc\" DevicePath \"\"" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.364531 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/942c6932-383e-432a-b927-ff9ec4ac81cb-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.364579 4901 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.378414 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/942c6932-383e-432a-b927-ff9ec4ac81cb-server-conf" (OuterVolumeSpecName: "server-conf") pod "942c6932-383e-432a-b927-ff9ec4ac81cb" (UID: "942c6932-383e-432a-b927-ff9ec4ac81cb"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.401590 4901 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.434647 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.466283 4901 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/942c6932-383e-432a-b927-ff9ec4ac81cb-server-conf\") on node \"crc\" DevicePath \"\"" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.466320 4901 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.501326 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/942c6932-383e-432a-b927-ff9ec4ac81cb-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "942c6932-383e-432a-b927-ff9ec4ac81cb" (UID: "942c6932-383e-432a-b927-ff9ec4ac81cb"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.570453 4901 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/942c6932-383e-432a-b927-ff9ec4ac81cb-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.621632 4901 generic.go:334] "Generic (PLEG): container finished" podID="942c6932-383e-432a-b927-ff9ec4ac81cb" containerID="dd84a1510ee1d58f8154205305c524591576ab844656c2e03acf69b375038a4f" exitCode=0 Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.621679 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"942c6932-383e-432a-b927-ff9ec4ac81cb","Type":"ContainerDied","Data":"dd84a1510ee1d58f8154205305c524591576ab844656c2e03acf69b375038a4f"} Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.621711 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"942c6932-383e-432a-b927-ff9ec4ac81cb","Type":"ContainerDied","Data":"42360ba26c45137b286d35f556b3f1f16ab183c27aa68712ac5f83e021c40657"} Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.621729 4901 scope.go:117] "RemoveContainer" containerID="dd84a1510ee1d58f8154205305c524591576ab844656c2e03acf69b375038a4f" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.621802 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.656630 4901 scope.go:117] "RemoveContainer" containerID="db41cd112888fcb37424d3a85cf77cf5ae370c5f0dce98ed34bcb98fcdcecad2" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.721875 4901 scope.go:117] "RemoveContainer" containerID="dd84a1510ee1d58f8154205305c524591576ab844656c2e03acf69b375038a4f" Feb 02 11:00:19 crc kubenswrapper[4901]: E0202 11:00:19.722356 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd84a1510ee1d58f8154205305c524591576ab844656c2e03acf69b375038a4f\": container with ID starting with dd84a1510ee1d58f8154205305c524591576ab844656c2e03acf69b375038a4f not found: ID does not exist" containerID="dd84a1510ee1d58f8154205305c524591576ab844656c2e03acf69b375038a4f" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.722389 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd84a1510ee1d58f8154205305c524591576ab844656c2e03acf69b375038a4f"} err="failed to get container status \"dd84a1510ee1d58f8154205305c524591576ab844656c2e03acf69b375038a4f\": rpc error: code = NotFound desc = could not find container \"dd84a1510ee1d58f8154205305c524591576ab844656c2e03acf69b375038a4f\": container with ID starting with dd84a1510ee1d58f8154205305c524591576ab844656c2e03acf69b375038a4f not found: ID does not exist" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.722407 4901 scope.go:117] "RemoveContainer" containerID="db41cd112888fcb37424d3a85cf77cf5ae370c5f0dce98ed34bcb98fcdcecad2" Feb 02 11:00:19 crc kubenswrapper[4901]: E0202 11:00:19.722736 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db41cd112888fcb37424d3a85cf77cf5ae370c5f0dce98ed34bcb98fcdcecad2\": container with ID starting with db41cd112888fcb37424d3a85cf77cf5ae370c5f0dce98ed34bcb98fcdcecad2 not found: ID does not exist" containerID="db41cd112888fcb37424d3a85cf77cf5ae370c5f0dce98ed34bcb98fcdcecad2" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.722774 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db41cd112888fcb37424d3a85cf77cf5ae370c5f0dce98ed34bcb98fcdcecad2"} err="failed to get container status \"db41cd112888fcb37424d3a85cf77cf5ae370c5f0dce98ed34bcb98fcdcecad2\": rpc error: code = NotFound desc = could not find container \"db41cd112888fcb37424d3a85cf77cf5ae370c5f0dce98ed34bcb98fcdcecad2\": container with ID starting with db41cd112888fcb37424d3a85cf77cf5ae370c5f0dce98ed34bcb98fcdcecad2 not found: ID does not exist" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.723823 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba" path="/var/lib/kubelet/pods/24aaea2c-a1d1-41ff-a7bc-a1ebfabf06ba/volumes" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.724673 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.724711 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.736640 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 02 11:00:19 crc kubenswrapper[4901]: E0202 11:00:19.737298 4901 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="942c6932-383e-432a-b927-ff9ec4ac81cb" containerName="rabbitmq" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.737314 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="942c6932-383e-432a-b927-ff9ec4ac81cb" containerName="rabbitmq" Feb 02 11:00:19 crc kubenswrapper[4901]: E0202 11:00:19.737349 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="942c6932-383e-432a-b927-ff9ec4ac81cb" containerName="setup-container" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.737356 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="942c6932-383e-432a-b927-ff9ec4ac81cb" containerName="setup-container" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.737620 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="942c6932-383e-432a-b927-ff9ec4ac81cb" containerName="rabbitmq" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.739030 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.745853 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.745922 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mm74n" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.745979 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.746219 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.746312 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.746409 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.746545 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.771586 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.775250 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e36bedd2-6698-4981-b0cf-a278a9ce7258-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e36bedd2-6698-4981-b0cf-a278a9ce7258\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.775314 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e36bedd2-6698-4981-b0cf-a278a9ce7258-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e36bedd2-6698-4981-b0cf-a278a9ce7258\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.775353 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e36bedd2-6698-4981-b0cf-a278a9ce7258-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"e36bedd2-6698-4981-b0cf-a278a9ce7258\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.775378 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfvdg\" (UniqueName: \"kubernetes.io/projected/e36bedd2-6698-4981-b0cf-a278a9ce7258-kube-api-access-bfvdg\") pod \"rabbitmq-cell1-server-0\" (UID: \"e36bedd2-6698-4981-b0cf-a278a9ce7258\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.775436 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e36bedd2-6698-4981-b0cf-a278a9ce7258-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e36bedd2-6698-4981-b0cf-a278a9ce7258\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.775468 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e36bedd2-6698-4981-b0cf-a278a9ce7258-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e36bedd2-6698-4981-b0cf-a278a9ce7258\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.775500 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e36bedd2-6698-4981-b0cf-a278a9ce7258-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e36bedd2-6698-4981-b0cf-a278a9ce7258\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.775573 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e36bedd2-6698-4981-b0cf-a278a9ce7258-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e36bedd2-6698-4981-b0cf-a278a9ce7258\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.775623 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e36bedd2-6698-4981-b0cf-a278a9ce7258\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.775654 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e36bedd2-6698-4981-b0cf-a278a9ce7258-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e36bedd2-6698-4981-b0cf-a278a9ce7258\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.775707 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e36bedd2-6698-4981-b0cf-a278a9ce7258-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e36bedd2-6698-4981-b0cf-a278a9ce7258\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.877423 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e36bedd2-6698-4981-b0cf-a278a9ce7258-erlang-cookie-secret\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"e36bedd2-6698-4981-b0cf-a278a9ce7258\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.877468 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e36bedd2-6698-4981-b0cf-a278a9ce7258-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e36bedd2-6698-4981-b0cf-a278a9ce7258\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.877485 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e36bedd2-6698-4981-b0cf-a278a9ce7258-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e36bedd2-6698-4981-b0cf-a278a9ce7258\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.877506 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfvdg\" (UniqueName: \"kubernetes.io/projected/e36bedd2-6698-4981-b0cf-a278a9ce7258-kube-api-access-bfvdg\") pod \"rabbitmq-cell1-server-0\" (UID: \"e36bedd2-6698-4981-b0cf-a278a9ce7258\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.877540 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e36bedd2-6698-4981-b0cf-a278a9ce7258-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e36bedd2-6698-4981-b0cf-a278a9ce7258\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.877587 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e36bedd2-6698-4981-b0cf-a278a9ce7258-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e36bedd2-6698-4981-b0cf-a278a9ce7258\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.877605 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e36bedd2-6698-4981-b0cf-a278a9ce7258-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e36bedd2-6698-4981-b0cf-a278a9ce7258\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.877645 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e36bedd2-6698-4981-b0cf-a278a9ce7258-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e36bedd2-6698-4981-b0cf-a278a9ce7258\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.877666 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e36bedd2-6698-4981-b0cf-a278a9ce7258\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.877694 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e36bedd2-6698-4981-b0cf-a278a9ce7258-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e36bedd2-6698-4981-b0cf-a278a9ce7258\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: 
I0202 11:00:19.877724 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e36bedd2-6698-4981-b0cf-a278a9ce7258-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e36bedd2-6698-4981-b0cf-a278a9ce7258\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.878236 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e36bedd2-6698-4981-b0cf-a278a9ce7258\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.878749 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e36bedd2-6698-4981-b0cf-a278a9ce7258-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e36bedd2-6698-4981-b0cf-a278a9ce7258\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.879017 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e36bedd2-6698-4981-b0cf-a278a9ce7258-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e36bedd2-6698-4981-b0cf-a278a9ce7258\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.879313 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e36bedd2-6698-4981-b0cf-a278a9ce7258-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e36bedd2-6698-4981-b0cf-a278a9ce7258\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.882672 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e36bedd2-6698-4981-b0cf-a278a9ce7258-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e36bedd2-6698-4981-b0cf-a278a9ce7258\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.883643 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e36bedd2-6698-4981-b0cf-a278a9ce7258-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e36bedd2-6698-4981-b0cf-a278a9ce7258\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.888902 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e36bedd2-6698-4981-b0cf-a278a9ce7258-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e36bedd2-6698-4981-b0cf-a278a9ce7258\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.889344 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e36bedd2-6698-4981-b0cf-a278a9ce7258-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e36bedd2-6698-4981-b0cf-a278a9ce7258\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.889978 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/e36bedd2-6698-4981-b0cf-a278a9ce7258-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e36bedd2-6698-4981-b0cf-a278a9ce7258\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.896860 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e36bedd2-6698-4981-b0cf-a278a9ce7258-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e36bedd2-6698-4981-b0cf-a278a9ce7258\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.907355 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfvdg\" (UniqueName: \"kubernetes.io/projected/e36bedd2-6698-4981-b0cf-a278a9ce7258-kube-api-access-bfvdg\") pod \"rabbitmq-cell1-server-0\" (UID: \"e36bedd2-6698-4981-b0cf-a278a9ce7258\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:00:19 crc kubenswrapper[4901]: I0202 11:00:19.917350 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e36bedd2-6698-4981-b0cf-a278a9ce7258\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:00:20 crc kubenswrapper[4901]: I0202 11:00:20.049230 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 02 11:00:20 crc kubenswrapper[4901]: I0202 11:00:20.068875 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:00:20 crc kubenswrapper[4901]: I0202 11:00:20.638230 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 02 11:00:20 crc kubenswrapper[4901]: I0202 11:00:20.643087 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e36bedd2-6698-4981-b0cf-a278a9ce7258","Type":"ContainerStarted","Data":"4be0992b67af79f43a2397cbd8ada7140340e7bbbfcc82ae09a07bcd72c57d10"} Feb 02 11:00:20 crc kubenswrapper[4901]: I0202 11:00:20.645244 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cacb5793-beb9-49f9-9438-9613ad472c15","Type":"ContainerStarted","Data":"46cddeae13e2d00b04385c0df3063189c5f570f53168ad6b9b08b3dadff8fe63"} Feb 02 11:00:21 crc kubenswrapper[4901]: I0202 11:00:21.700722 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="942c6932-383e-432a-b927-ff9ec4ac81cb" path="/var/lib/kubelet/pods/942c6932-383e-432a-b927-ff9ec4ac81cb/volumes" Feb 02 11:00:22 crc kubenswrapper[4901]: I0202 11:00:22.120666 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-kfhrg"] Feb 02 11:00:22 crc kubenswrapper[4901]: I0202 11:00:22.122292 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b75489c6f-kfhrg" Feb 02 11:00:22 crc kubenswrapper[4901]: I0202 11:00:22.124480 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Feb 02 11:00:22 crc kubenswrapper[4901]: I0202 11:00:22.141363 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-kfhrg"] Feb 02 11:00:22 crc kubenswrapper[4901]: I0202 11:00:22.241910 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7feaaf6f-027e-46c0-8030-fe3c3e358b73-ovsdbserver-nb\") pod \"dnsmasq-dns-5b75489c6f-kfhrg\" (UID: \"7feaaf6f-027e-46c0-8030-fe3c3e358b73\") " pod="openstack/dnsmasq-dns-5b75489c6f-kfhrg" Feb 02 11:00:22 crc kubenswrapper[4901]: I0202 11:00:22.242097 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7feaaf6f-027e-46c0-8030-fe3c3e358b73-dns-swift-storage-0\") pod \"dnsmasq-dns-5b75489c6f-kfhrg\" (UID: \"7feaaf6f-027e-46c0-8030-fe3c3e358b73\") " pod="openstack/dnsmasq-dns-5b75489c6f-kfhrg" Feb 02 11:00:22 crc kubenswrapper[4901]: I0202 11:00:22.242156 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhgf4\" (UniqueName: \"kubernetes.io/projected/7feaaf6f-027e-46c0-8030-fe3c3e358b73-kube-api-access-hhgf4\") pod \"dnsmasq-dns-5b75489c6f-kfhrg\" (UID: \"7feaaf6f-027e-46c0-8030-fe3c3e358b73\") " pod="openstack/dnsmasq-dns-5b75489c6f-kfhrg" Feb 02 11:00:22 crc kubenswrapper[4901]: I0202 11:00:22.242300 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7feaaf6f-027e-46c0-8030-fe3c3e358b73-ovsdbserver-sb\") pod \"dnsmasq-dns-5b75489c6f-kfhrg\" (UID: \"7feaaf6f-027e-46c0-8030-fe3c3e358b73\") " pod="openstack/dnsmasq-dns-5b75489c6f-kfhrg" Feb 02 11:00:22 crc kubenswrapper[4901]: I0202 11:00:22.242360 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7feaaf6f-027e-46c0-8030-fe3c3e358b73-config\") pod \"dnsmasq-dns-5b75489c6f-kfhrg\" (UID: \"7feaaf6f-027e-46c0-8030-fe3c3e358b73\") " pod="openstack/dnsmasq-dns-5b75489c6f-kfhrg" Feb 02 11:00:22 crc kubenswrapper[4901]: I0202 11:00:22.242474 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7feaaf6f-027e-46c0-8030-fe3c3e358b73-openstack-edpm-ipam\") pod \"dnsmasq-dns-5b75489c6f-kfhrg\" (UID: \"7feaaf6f-027e-46c0-8030-fe3c3e358b73\") " pod="openstack/dnsmasq-dns-5b75489c6f-kfhrg" Feb 02 11:00:22 crc kubenswrapper[4901]: I0202 11:00:22.242604 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7feaaf6f-027e-46c0-8030-fe3c3e358b73-dns-svc\") pod \"dnsmasq-dns-5b75489c6f-kfhrg\" (UID: \"7feaaf6f-027e-46c0-8030-fe3c3e358b73\") " pod="openstack/dnsmasq-dns-5b75489c6f-kfhrg" Feb 02 11:00:22 crc kubenswrapper[4901]: I0202 11:00:22.344234 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7feaaf6f-027e-46c0-8030-fe3c3e358b73-ovsdbserver-nb\") pod \"dnsmasq-dns-5b75489c6f-kfhrg\" 
(UID: \"7feaaf6f-027e-46c0-8030-fe3c3e358b73\") " pod="openstack/dnsmasq-dns-5b75489c6f-kfhrg" Feb 02 11:00:22 crc kubenswrapper[4901]: I0202 11:00:22.344328 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7feaaf6f-027e-46c0-8030-fe3c3e358b73-dns-swift-storage-0\") pod \"dnsmasq-dns-5b75489c6f-kfhrg\" (UID: \"7feaaf6f-027e-46c0-8030-fe3c3e358b73\") " pod="openstack/dnsmasq-dns-5b75489c6f-kfhrg" Feb 02 11:00:22 crc kubenswrapper[4901]: I0202 11:00:22.344372 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhgf4\" (UniqueName: \"kubernetes.io/projected/7feaaf6f-027e-46c0-8030-fe3c3e358b73-kube-api-access-hhgf4\") pod \"dnsmasq-dns-5b75489c6f-kfhrg\" (UID: \"7feaaf6f-027e-46c0-8030-fe3c3e358b73\") " pod="openstack/dnsmasq-dns-5b75489c6f-kfhrg" Feb 02 11:00:22 crc kubenswrapper[4901]: I0202 11:00:22.344405 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7feaaf6f-027e-46c0-8030-fe3c3e358b73-ovsdbserver-sb\") pod \"dnsmasq-dns-5b75489c6f-kfhrg\" (UID: \"7feaaf6f-027e-46c0-8030-fe3c3e358b73\") " pod="openstack/dnsmasq-dns-5b75489c6f-kfhrg" Feb 02 11:00:22 crc kubenswrapper[4901]: I0202 11:00:22.344427 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7feaaf6f-027e-46c0-8030-fe3c3e358b73-config\") pod \"dnsmasq-dns-5b75489c6f-kfhrg\" (UID: \"7feaaf6f-027e-46c0-8030-fe3c3e358b73\") " pod="openstack/dnsmasq-dns-5b75489c6f-kfhrg" Feb 02 11:00:22 crc kubenswrapper[4901]: I0202 11:00:22.344454 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7feaaf6f-027e-46c0-8030-fe3c3e358b73-openstack-edpm-ipam\") pod \"dnsmasq-dns-5b75489c6f-kfhrg\" (UID: \"7feaaf6f-027e-46c0-8030-fe3c3e358b73\") " pod="openstack/dnsmasq-dns-5b75489c6f-kfhrg" Feb 02 11:00:22 crc kubenswrapper[4901]: I0202 11:00:22.344474 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7feaaf6f-027e-46c0-8030-fe3c3e358b73-dns-svc\") pod \"dnsmasq-dns-5b75489c6f-kfhrg\" (UID: \"7feaaf6f-027e-46c0-8030-fe3c3e358b73\") " pod="openstack/dnsmasq-dns-5b75489c6f-kfhrg" Feb 02 11:00:22 crc kubenswrapper[4901]: I0202 11:00:22.345494 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7feaaf6f-027e-46c0-8030-fe3c3e358b73-config\") pod \"dnsmasq-dns-5b75489c6f-kfhrg\" (UID: \"7feaaf6f-027e-46c0-8030-fe3c3e358b73\") " pod="openstack/dnsmasq-dns-5b75489c6f-kfhrg" Feb 02 11:00:22 crc kubenswrapper[4901]: I0202 11:00:22.345529 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7feaaf6f-027e-46c0-8030-fe3c3e358b73-ovsdbserver-nb\") pod \"dnsmasq-dns-5b75489c6f-kfhrg\" (UID: \"7feaaf6f-027e-46c0-8030-fe3c3e358b73\") " pod="openstack/dnsmasq-dns-5b75489c6f-kfhrg" Feb 02 11:00:22 crc kubenswrapper[4901]: I0202 11:00:22.345637 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7feaaf6f-027e-46c0-8030-fe3c3e358b73-ovsdbserver-sb\") pod \"dnsmasq-dns-5b75489c6f-kfhrg\" (UID: \"7feaaf6f-027e-46c0-8030-fe3c3e358b73\") " 
pod="openstack/dnsmasq-dns-5b75489c6f-kfhrg" Feb 02 11:00:22 crc kubenswrapper[4901]: I0202 11:00:22.345987 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7feaaf6f-027e-46c0-8030-fe3c3e358b73-openstack-edpm-ipam\") pod \"dnsmasq-dns-5b75489c6f-kfhrg\" (UID: \"7feaaf6f-027e-46c0-8030-fe3c3e358b73\") " pod="openstack/dnsmasq-dns-5b75489c6f-kfhrg" Feb 02 11:00:22 crc kubenswrapper[4901]: I0202 11:00:22.346152 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7feaaf6f-027e-46c0-8030-fe3c3e358b73-dns-swift-storage-0\") pod \"dnsmasq-dns-5b75489c6f-kfhrg\" (UID: \"7feaaf6f-027e-46c0-8030-fe3c3e358b73\") " pod="openstack/dnsmasq-dns-5b75489c6f-kfhrg" Feb 02 11:00:22 crc kubenswrapper[4901]: I0202 11:00:22.346156 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7feaaf6f-027e-46c0-8030-fe3c3e358b73-dns-svc\") pod \"dnsmasq-dns-5b75489c6f-kfhrg\" (UID: \"7feaaf6f-027e-46c0-8030-fe3c3e358b73\") " pod="openstack/dnsmasq-dns-5b75489c6f-kfhrg" Feb 02 11:00:22 crc kubenswrapper[4901]: I0202 11:00:22.380066 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhgf4\" (UniqueName: \"kubernetes.io/projected/7feaaf6f-027e-46c0-8030-fe3c3e358b73-kube-api-access-hhgf4\") pod \"dnsmasq-dns-5b75489c6f-kfhrg\" (UID: \"7feaaf6f-027e-46c0-8030-fe3c3e358b73\") " pod="openstack/dnsmasq-dns-5b75489c6f-kfhrg" Feb 02 11:00:22 crc kubenswrapper[4901]: I0202 11:00:22.447140 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b75489c6f-kfhrg" Feb 02 11:00:22 crc kubenswrapper[4901]: I0202 11:00:22.733551 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cacb5793-beb9-49f9-9438-9613ad472c15","Type":"ContainerStarted","Data":"0a34a502d00369275525a7a835f3c4ccfbf1c0119e9953c35e814877661c155c"} Feb 02 11:00:22 crc kubenswrapper[4901]: I0202 11:00:22.907377 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-kfhrg"] Feb 02 11:00:23 crc kubenswrapper[4901]: I0202 11:00:23.768110 4901 generic.go:334] "Generic (PLEG): container finished" podID="7feaaf6f-027e-46c0-8030-fe3c3e358b73" containerID="98ebc78c45fc3856405d7b829279ba0fa20a590315f68597dd890ccbb51d17a6" exitCode=0 Feb 02 11:00:23 crc kubenswrapper[4901]: I0202 11:00:23.768494 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b75489c6f-kfhrg" event={"ID":"7feaaf6f-027e-46c0-8030-fe3c3e358b73","Type":"ContainerDied","Data":"98ebc78c45fc3856405d7b829279ba0fa20a590315f68597dd890ccbb51d17a6"} Feb 02 11:00:23 crc kubenswrapper[4901]: I0202 11:00:23.768527 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b75489c6f-kfhrg" event={"ID":"7feaaf6f-027e-46c0-8030-fe3c3e358b73","Type":"ContainerStarted","Data":"7a7fa245ef884a2f7ed50553271c0dff14f5ee1b03f15399169bf192e982ef04"} Feb 02 11:00:23 crc kubenswrapper[4901]: I0202 11:00:23.785245 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e36bedd2-6698-4981-b0cf-a278a9ce7258","Type":"ContainerStarted","Data":"78a1eec6c6ed07e47809e332a6ced1abf91df411b06868d1d2467ccb392b568c"} Feb 02 11:00:24 crc kubenswrapper[4901]: I0202 11:00:24.796847 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5b75489c6f-kfhrg" event={"ID":"7feaaf6f-027e-46c0-8030-fe3c3e358b73","Type":"ContainerStarted","Data":"ca379196ebb0520e751e51b4f7730ecab0b905f17801bfa874c4059f6081f5f5"} Feb 02 11:00:24 crc kubenswrapper[4901]: I0202 11:00:24.830379 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b75489c6f-kfhrg" podStartSLOduration=2.830351937 podStartE2EDuration="2.830351937s" podCreationTimestamp="2026-02-02 11:00:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 11:00:24.818365842 +0000 UTC m=+1311.836705958" watchObservedRunningTime="2026-02-02 11:00:24.830351937 +0000 UTC m=+1311.848692063" Feb 02 11:00:25 crc kubenswrapper[4901]: I0202 11:00:25.805015 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b75489c6f-kfhrg" Feb 02 11:00:32 crc kubenswrapper[4901]: I0202 11:00:32.450004 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b75489c6f-kfhrg" Feb 02 11:00:32 crc kubenswrapper[4901]: I0202 11:00:32.505712 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-8p6jl"] Feb 02 11:00:32 crc kubenswrapper[4901]: I0202 11:00:32.506078 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-f84f9ccf-8p6jl" podUID="3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66" containerName="dnsmasq-dns" containerID="cri-o://a1d82a38a53c6876ac9b06cdddf10117c78cf4b093c7c0f3e06ab4e6613680a2" gracePeriod=10 Feb 02 11:00:32 crc kubenswrapper[4901]: I0202 11:00:32.691835 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d75f767dc-6pmsr"] Feb 02 11:00:32 crc kubenswrapper[4901]: I0202 11:00:32.693713 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d75f767dc-6pmsr" Feb 02 11:00:32 crc kubenswrapper[4901]: I0202 11:00:32.733434 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d75f767dc-6pmsr"] Feb 02 11:00:32 crc kubenswrapper[4901]: I0202 11:00:32.779883 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0c42695-0f2a-43f2-925b-90f704255c79-ovsdbserver-sb\") pod \"dnsmasq-dns-5d75f767dc-6pmsr\" (UID: \"b0c42695-0f2a-43f2-925b-90f704255c79\") " pod="openstack/dnsmasq-dns-5d75f767dc-6pmsr" Feb 02 11:00:32 crc kubenswrapper[4901]: I0202 11:00:32.780425 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0c42695-0f2a-43f2-925b-90f704255c79-config\") pod \"dnsmasq-dns-5d75f767dc-6pmsr\" (UID: \"b0c42695-0f2a-43f2-925b-90f704255c79\") " pod="openstack/dnsmasq-dns-5d75f767dc-6pmsr" Feb 02 11:00:32 crc kubenswrapper[4901]: I0202 11:00:32.780500 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcjtk\" (UniqueName: \"kubernetes.io/projected/b0c42695-0f2a-43f2-925b-90f704255c79-kube-api-access-wcjtk\") pod \"dnsmasq-dns-5d75f767dc-6pmsr\" (UID: \"b0c42695-0f2a-43f2-925b-90f704255c79\") " pod="openstack/dnsmasq-dns-5d75f767dc-6pmsr" Feb 02 11:00:32 crc kubenswrapper[4901]: I0202 11:00:32.780521 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b0c42695-0f2a-43f2-925b-90f704255c79-openstack-edpm-ipam\") pod \"dnsmasq-dns-5d75f767dc-6pmsr\" (UID: \"b0c42695-0f2a-43f2-925b-90f704255c79\") " pod="openstack/dnsmasq-dns-5d75f767dc-6pmsr" Feb 02 11:00:32 crc kubenswrapper[4901]: I0202 11:00:32.780549 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0c42695-0f2a-43f2-925b-90f704255c79-ovsdbserver-nb\") pod \"dnsmasq-dns-5d75f767dc-6pmsr\" (UID: \"b0c42695-0f2a-43f2-925b-90f704255c79\") " pod="openstack/dnsmasq-dns-5d75f767dc-6pmsr" Feb 02 11:00:32 crc kubenswrapper[4901]: I0202 11:00:32.780689 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0c42695-0f2a-43f2-925b-90f704255c79-dns-svc\") pod \"dnsmasq-dns-5d75f767dc-6pmsr\" (UID: \"b0c42695-0f2a-43f2-925b-90f704255c79\") " pod="openstack/dnsmasq-dns-5d75f767dc-6pmsr" Feb 02 11:00:32 crc kubenswrapper[4901]: I0202 11:00:32.780745 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b0c42695-0f2a-43f2-925b-90f704255c79-dns-swift-storage-0\") pod \"dnsmasq-dns-5d75f767dc-6pmsr\" (UID: \"b0c42695-0f2a-43f2-925b-90f704255c79\") " pod="openstack/dnsmasq-dns-5d75f767dc-6pmsr" Feb 02 11:00:32 crc kubenswrapper[4901]: I0202 11:00:32.880197 4901 generic.go:334] "Generic (PLEG): container finished" podID="3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66" containerID="a1d82a38a53c6876ac9b06cdddf10117c78cf4b093c7c0f3e06ab4e6613680a2" exitCode=0 Feb 02 11:00:32 crc kubenswrapper[4901]: I0202 11:00:32.880279 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-8p6jl" 
event={"ID":"3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66","Type":"ContainerDied","Data":"a1d82a38a53c6876ac9b06cdddf10117c78cf4b093c7c0f3e06ab4e6613680a2"} Feb 02 11:00:32 crc kubenswrapper[4901]: I0202 11:00:32.883488 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b0c42695-0f2a-43f2-925b-90f704255c79-dns-swift-storage-0\") pod \"dnsmasq-dns-5d75f767dc-6pmsr\" (UID: \"b0c42695-0f2a-43f2-925b-90f704255c79\") " pod="openstack/dnsmasq-dns-5d75f767dc-6pmsr" Feb 02 11:00:32 crc kubenswrapper[4901]: I0202 11:00:32.883698 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0c42695-0f2a-43f2-925b-90f704255c79-ovsdbserver-sb\") pod \"dnsmasq-dns-5d75f767dc-6pmsr\" (UID: \"b0c42695-0f2a-43f2-925b-90f704255c79\") " pod="openstack/dnsmasq-dns-5d75f767dc-6pmsr" Feb 02 11:00:32 crc kubenswrapper[4901]: I0202 11:00:32.883734 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0c42695-0f2a-43f2-925b-90f704255c79-config\") pod \"dnsmasq-dns-5d75f767dc-6pmsr\" (UID: \"b0c42695-0f2a-43f2-925b-90f704255c79\") " pod="openstack/dnsmasq-dns-5d75f767dc-6pmsr" Feb 02 11:00:32 crc kubenswrapper[4901]: I0202 11:00:32.883849 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcjtk\" (UniqueName: \"kubernetes.io/projected/b0c42695-0f2a-43f2-925b-90f704255c79-kube-api-access-wcjtk\") pod \"dnsmasq-dns-5d75f767dc-6pmsr\" (UID: \"b0c42695-0f2a-43f2-925b-90f704255c79\") " pod="openstack/dnsmasq-dns-5d75f767dc-6pmsr" Feb 02 11:00:32 crc kubenswrapper[4901]: I0202 11:00:32.883883 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b0c42695-0f2a-43f2-925b-90f704255c79-openstack-edpm-ipam\") pod \"dnsmasq-dns-5d75f767dc-6pmsr\" (UID: \"b0c42695-0f2a-43f2-925b-90f704255c79\") " pod="openstack/dnsmasq-dns-5d75f767dc-6pmsr" Feb 02 11:00:32 crc kubenswrapper[4901]: I0202 11:00:32.883933 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0c42695-0f2a-43f2-925b-90f704255c79-ovsdbserver-nb\") pod \"dnsmasq-dns-5d75f767dc-6pmsr\" (UID: \"b0c42695-0f2a-43f2-925b-90f704255c79\") " pod="openstack/dnsmasq-dns-5d75f767dc-6pmsr" Feb 02 11:00:32 crc kubenswrapper[4901]: I0202 11:00:32.883972 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0c42695-0f2a-43f2-925b-90f704255c79-dns-svc\") pod \"dnsmasq-dns-5d75f767dc-6pmsr\" (UID: \"b0c42695-0f2a-43f2-925b-90f704255c79\") " pod="openstack/dnsmasq-dns-5d75f767dc-6pmsr" Feb 02 11:00:32 crc kubenswrapper[4901]: I0202 11:00:32.885420 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0c42695-0f2a-43f2-925b-90f704255c79-dns-svc\") pod \"dnsmasq-dns-5d75f767dc-6pmsr\" (UID: \"b0c42695-0f2a-43f2-925b-90f704255c79\") " pod="openstack/dnsmasq-dns-5d75f767dc-6pmsr" Feb 02 11:00:32 crc kubenswrapper[4901]: I0202 11:00:32.885916 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b0c42695-0f2a-43f2-925b-90f704255c79-dns-swift-storage-0\") pod \"dnsmasq-dns-5d75f767dc-6pmsr\" (UID: 
\"b0c42695-0f2a-43f2-925b-90f704255c79\") " pod="openstack/dnsmasq-dns-5d75f767dc-6pmsr" Feb 02 11:00:32 crc kubenswrapper[4901]: I0202 11:00:32.886319 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b0c42695-0f2a-43f2-925b-90f704255c79-openstack-edpm-ipam\") pod \"dnsmasq-dns-5d75f767dc-6pmsr\" (UID: \"b0c42695-0f2a-43f2-925b-90f704255c79\") " pod="openstack/dnsmasq-dns-5d75f767dc-6pmsr" Feb 02 11:00:32 crc kubenswrapper[4901]: I0202 11:00:32.886704 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0c42695-0f2a-43f2-925b-90f704255c79-ovsdbserver-sb\") pod \"dnsmasq-dns-5d75f767dc-6pmsr\" (UID: \"b0c42695-0f2a-43f2-925b-90f704255c79\") " pod="openstack/dnsmasq-dns-5d75f767dc-6pmsr" Feb 02 11:00:32 crc kubenswrapper[4901]: I0202 11:00:32.886757 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0c42695-0f2a-43f2-925b-90f704255c79-ovsdbserver-nb\") pod \"dnsmasq-dns-5d75f767dc-6pmsr\" (UID: \"b0c42695-0f2a-43f2-925b-90f704255c79\") " pod="openstack/dnsmasq-dns-5d75f767dc-6pmsr" Feb 02 11:00:32 crc kubenswrapper[4901]: I0202 11:00:32.887214 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0c42695-0f2a-43f2-925b-90f704255c79-config\") pod \"dnsmasq-dns-5d75f767dc-6pmsr\" (UID: \"b0c42695-0f2a-43f2-925b-90f704255c79\") " pod="openstack/dnsmasq-dns-5d75f767dc-6pmsr" Feb 02 11:00:32 crc kubenswrapper[4901]: I0202 11:00:32.907463 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcjtk\" (UniqueName: \"kubernetes.io/projected/b0c42695-0f2a-43f2-925b-90f704255c79-kube-api-access-wcjtk\") pod \"dnsmasq-dns-5d75f767dc-6pmsr\" (UID: \"b0c42695-0f2a-43f2-925b-90f704255c79\") " pod="openstack/dnsmasq-dns-5d75f767dc-6pmsr" Feb 02 11:00:33 crc kubenswrapper[4901]: I0202 11:00:33.021234 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d75f767dc-6pmsr" Feb 02 11:00:33 crc kubenswrapper[4901]: I0202 11:00:33.141845 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f84f9ccf-8p6jl" Feb 02 11:00:33 crc kubenswrapper[4901]: I0202 11:00:33.189241 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmvw7\" (UniqueName: \"kubernetes.io/projected/3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66-kube-api-access-bmvw7\") pod \"3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66\" (UID: \"3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66\") " Feb 02 11:00:33 crc kubenswrapper[4901]: I0202 11:00:33.189328 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66-dns-swift-storage-0\") pod \"3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66\" (UID: \"3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66\") " Feb 02 11:00:33 crc kubenswrapper[4901]: I0202 11:00:33.189379 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66-ovsdbserver-sb\") pod \"3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66\" (UID: \"3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66\") " Feb 02 11:00:33 crc kubenswrapper[4901]: I0202 11:00:33.189414 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66-config\") pod \"3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66\" (UID: \"3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66\") " Feb 02 11:00:33 crc kubenswrapper[4901]: I0202 11:00:33.189498 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66-dns-svc\") pod \"3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66\" (UID: \"3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66\") " Feb 02 11:00:33 crc kubenswrapper[4901]: I0202 11:00:33.189619 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66-ovsdbserver-nb\") pod \"3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66\" (UID: \"3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66\") " Feb 02 11:00:33 crc kubenswrapper[4901]: I0202 11:00:33.203989 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66-kube-api-access-bmvw7" (OuterVolumeSpecName: "kube-api-access-bmvw7") pod "3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66" (UID: "3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66"). InnerVolumeSpecName "kube-api-access-bmvw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:00:33 crc kubenswrapper[4901]: I0202 11:00:33.260282 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66-config" (OuterVolumeSpecName: "config") pod "3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66" (UID: "3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:00:33 crc kubenswrapper[4901]: I0202 11:00:33.268167 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66" (UID: "3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:00:33 crc kubenswrapper[4901]: I0202 11:00:33.291797 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmvw7\" (UniqueName: \"kubernetes.io/projected/3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66-kube-api-access-bmvw7\") on node \"crc\" DevicePath \"\"" Feb 02 11:00:33 crc kubenswrapper[4901]: I0202 11:00:33.291822 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66-config\") on node \"crc\" DevicePath \"\"" Feb 02 11:00:33 crc kubenswrapper[4901]: I0202 11:00:33.291831 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 11:00:33 crc kubenswrapper[4901]: I0202 11:00:33.308734 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66" (UID: "3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:00:33 crc kubenswrapper[4901]: I0202 11:00:33.310151 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66" (UID: "3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:00:33 crc kubenswrapper[4901]: I0202 11:00:33.325189 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66" (UID: "3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:00:33 crc kubenswrapper[4901]: I0202 11:00:33.393337 4901 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 02 11:00:33 crc kubenswrapper[4901]: I0202 11:00:33.393385 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 11:00:33 crc kubenswrapper[4901]: I0202 11:00:33.393400 4901 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 11:00:33 crc kubenswrapper[4901]: I0202 11:00:33.541986 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d75f767dc-6pmsr"] Feb 02 11:00:33 crc kubenswrapper[4901]: W0202 11:00:33.553169 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0c42695_0f2a_43f2_925b_90f704255c79.slice/crio-148e5a75fec2675803d48ed429de3708493dc7d5d248ab73ef355b2676a5ecfb WatchSource:0}: Error finding container 148e5a75fec2675803d48ed429de3708493dc7d5d248ab73ef355b2676a5ecfb: Status 404 returned error can't find the container with id 148e5a75fec2675803d48ed429de3708493dc7d5d248ab73ef355b2676a5ecfb Feb 02 11:00:33 crc kubenswrapper[4901]: I0202 11:00:33.894916 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d75f767dc-6pmsr" event={"ID":"b0c42695-0f2a-43f2-925b-90f704255c79","Type":"ContainerStarted","Data":"51183f1a1386d068316a6e17ce143b06384b28eac712a49dab1039d368193a9e"} Feb 02 11:00:33 crc kubenswrapper[4901]: I0202 11:00:33.895016 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d75f767dc-6pmsr" event={"ID":"b0c42695-0f2a-43f2-925b-90f704255c79","Type":"ContainerStarted","Data":"148e5a75fec2675803d48ed429de3708493dc7d5d248ab73ef355b2676a5ecfb"} Feb 02 11:00:33 crc kubenswrapper[4901]: I0202 11:00:33.898026 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84f9ccf-8p6jl" event={"ID":"3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66","Type":"ContainerDied","Data":"e3e510a2b500976500995a9655a09783a538321ef031da3117ba8051ea39ea46"} Feb 02 11:00:33 crc kubenswrapper[4901]: I0202 11:00:33.898069 4901 scope.go:117] "RemoveContainer" containerID="a1d82a38a53c6876ac9b06cdddf10117c78cf4b093c7c0f3e06ab4e6613680a2" Feb 02 11:00:33 crc kubenswrapper[4901]: I0202 11:00:33.898205 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f84f9ccf-8p6jl" Feb 02 11:00:33 crc kubenswrapper[4901]: I0202 11:00:33.929498 4901 scope.go:117] "RemoveContainer" containerID="69fd824466b8f659e0de1a9dffceb3cceab63672a236a5fdbaede31271bc812a" Feb 02 11:00:33 crc kubenswrapper[4901]: I0202 11:00:33.961354 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-8p6jl"] Feb 02 11:00:33 crc kubenswrapper[4901]: I0202 11:00:33.977281 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f84f9ccf-8p6jl"] Feb 02 11:00:34 crc kubenswrapper[4901]: I0202 11:00:34.916084 4901 generic.go:334] "Generic (PLEG): container finished" podID="b0c42695-0f2a-43f2-925b-90f704255c79" containerID="51183f1a1386d068316a6e17ce143b06384b28eac712a49dab1039d368193a9e" exitCode=0 Feb 02 11:00:34 crc kubenswrapper[4901]: I0202 11:00:34.916142 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d75f767dc-6pmsr" event={"ID":"b0c42695-0f2a-43f2-925b-90f704255c79","Type":"ContainerDied","Data":"51183f1a1386d068316a6e17ce143b06384b28eac712a49dab1039d368193a9e"} Feb 02 11:00:35 crc kubenswrapper[4901]: I0202 11:00:35.685555 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66" path="/var/lib/kubelet/pods/3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66/volumes" Feb 02 11:00:35 crc kubenswrapper[4901]: I0202 11:00:35.928488 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d75f767dc-6pmsr" event={"ID":"b0c42695-0f2a-43f2-925b-90f704255c79","Type":"ContainerStarted","Data":"e1f2045809527288e6092752abc01f8493ace993a9a51c2b44e86ddf65aad110"} Feb 02 11:00:35 crc kubenswrapper[4901]: I0202 11:00:35.928775 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d75f767dc-6pmsr" Feb 02 11:00:35 crc kubenswrapper[4901]: I0202 11:00:35.959637 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d75f767dc-6pmsr" podStartSLOduration=3.95957228 podStartE2EDuration="3.95957228s" podCreationTimestamp="2026-02-02 11:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 11:00:35.954389603 +0000 UTC m=+1322.972729699" watchObservedRunningTime="2026-02-02 11:00:35.95957228 +0000 UTC m=+1322.977912396" Feb 02 11:00:43 crc kubenswrapper[4901]: I0202 11:00:43.023796 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d75f767dc-6pmsr" Feb 02 11:00:43 crc kubenswrapper[4901]: I0202 11:00:43.113303 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-kfhrg"] Feb 02 11:00:43 crc kubenswrapper[4901]: I0202 11:00:43.114841 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b75489c6f-kfhrg" podUID="7feaaf6f-027e-46c0-8030-fe3c3e358b73" containerName="dnsmasq-dns" containerID="cri-o://ca379196ebb0520e751e51b4f7730ecab0b905f17801bfa874c4059f6081f5f5" gracePeriod=10 Feb 02 11:00:43 crc kubenswrapper[4901]: I0202 11:00:43.592229 4901 util.go:48] "No ready sandbox for pod can be found. 
Feb 02 11:00:43 crc kubenswrapper[4901]: I0202 11:00:43.592229 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b75489c6f-kfhrg"
Feb 02 11:00:43 crc kubenswrapper[4901]: I0202 11:00:43.670007 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7feaaf6f-027e-46c0-8030-fe3c3e358b73-ovsdbserver-nb\") pod \"7feaaf6f-027e-46c0-8030-fe3c3e358b73\" (UID: \"7feaaf6f-027e-46c0-8030-fe3c3e358b73\") "
Feb 02 11:00:43 crc kubenswrapper[4901]: I0202 11:00:43.670096 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7feaaf6f-027e-46c0-8030-fe3c3e358b73-openstack-edpm-ipam\") pod \"7feaaf6f-027e-46c0-8030-fe3c3e358b73\" (UID: \"7feaaf6f-027e-46c0-8030-fe3c3e358b73\") "
Feb 02 11:00:43 crc kubenswrapper[4901]: I0202 11:00:43.670169 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhgf4\" (UniqueName: \"kubernetes.io/projected/7feaaf6f-027e-46c0-8030-fe3c3e358b73-kube-api-access-hhgf4\") pod \"7feaaf6f-027e-46c0-8030-fe3c3e358b73\" (UID: \"7feaaf6f-027e-46c0-8030-fe3c3e358b73\") "
Feb 02 11:00:43 crc kubenswrapper[4901]: I0202 11:00:43.670267 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7feaaf6f-027e-46c0-8030-fe3c3e358b73-dns-swift-storage-0\") pod \"7feaaf6f-027e-46c0-8030-fe3c3e358b73\" (UID: \"7feaaf6f-027e-46c0-8030-fe3c3e358b73\") "
Feb 02 11:00:43 crc kubenswrapper[4901]: I0202 11:00:43.670322 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7feaaf6f-027e-46c0-8030-fe3c3e358b73-dns-svc\") pod \"7feaaf6f-027e-46c0-8030-fe3c3e358b73\" (UID: \"7feaaf6f-027e-46c0-8030-fe3c3e358b73\") "
Feb 02 11:00:43 crc kubenswrapper[4901]: I0202 11:00:43.670415 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7feaaf6f-027e-46c0-8030-fe3c3e358b73-ovsdbserver-sb\") pod \"7feaaf6f-027e-46c0-8030-fe3c3e358b73\" (UID: \"7feaaf6f-027e-46c0-8030-fe3c3e358b73\") "
Feb 02 11:00:43 crc kubenswrapper[4901]: I0202 11:00:43.670464 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7feaaf6f-027e-46c0-8030-fe3c3e358b73-config\") pod \"7feaaf6f-027e-46c0-8030-fe3c3e358b73\" (UID: \"7feaaf6f-027e-46c0-8030-fe3c3e358b73\") "
Feb 02 11:00:43 crc kubenswrapper[4901]: I0202 11:00:43.679991 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7feaaf6f-027e-46c0-8030-fe3c3e358b73-kube-api-access-hhgf4" (OuterVolumeSpecName: "kube-api-access-hhgf4") pod "7feaaf6f-027e-46c0-8030-fe3c3e358b73" (UID: "7feaaf6f-027e-46c0-8030-fe3c3e358b73"). InnerVolumeSpecName "kube-api-access-hhgf4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 11:00:43 crc kubenswrapper[4901]: I0202 11:00:43.736507 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7feaaf6f-027e-46c0-8030-fe3c3e358b73-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7feaaf6f-027e-46c0-8030-fe3c3e358b73" (UID: "7feaaf6f-027e-46c0-8030-fe3c3e358b73"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 11:00:43 crc kubenswrapper[4901]: I0202 11:00:43.736531 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7feaaf6f-027e-46c0-8030-fe3c3e358b73-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7feaaf6f-027e-46c0-8030-fe3c3e358b73" (UID: "7feaaf6f-027e-46c0-8030-fe3c3e358b73"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 11:00:43 crc kubenswrapper[4901]: I0202 11:00:43.740109 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7feaaf6f-027e-46c0-8030-fe3c3e358b73-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7feaaf6f-027e-46c0-8030-fe3c3e358b73" (UID: "7feaaf6f-027e-46c0-8030-fe3c3e358b73"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 11:00:43 crc kubenswrapper[4901]: I0202 11:00:43.742359 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7feaaf6f-027e-46c0-8030-fe3c3e358b73-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "7feaaf6f-027e-46c0-8030-fe3c3e358b73" (UID: "7feaaf6f-027e-46c0-8030-fe3c3e358b73"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 11:00:43 crc kubenswrapper[4901]: I0202 11:00:43.746719 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7feaaf6f-027e-46c0-8030-fe3c3e358b73-config" (OuterVolumeSpecName: "config") pod "7feaaf6f-027e-46c0-8030-fe3c3e358b73" (UID: "7feaaf6f-027e-46c0-8030-fe3c3e358b73"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 11:00:43 crc kubenswrapper[4901]: I0202 11:00:43.749150 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7feaaf6f-027e-46c0-8030-fe3c3e358b73-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7feaaf6f-027e-46c0-8030-fe3c3e358b73" (UID: "7feaaf6f-027e-46c0-8030-fe3c3e358b73"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 11:00:43 crc kubenswrapper[4901]: I0202 11:00:43.773444 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7feaaf6f-027e-46c0-8030-fe3c3e358b73-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 02 11:00:43 crc kubenswrapper[4901]: I0202 11:00:43.773486 4901 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7feaaf6f-027e-46c0-8030-fe3c3e358b73-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 02 11:00:43 crc kubenswrapper[4901]: I0202 11:00:43.773500 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhgf4\" (UniqueName: \"kubernetes.io/projected/7feaaf6f-027e-46c0-8030-fe3c3e358b73-kube-api-access-hhgf4\") on node \"crc\" DevicePath \"\""
Feb 02 11:00:43 crc kubenswrapper[4901]: I0202 11:00:43.773516 4901 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7feaaf6f-027e-46c0-8030-fe3c3e358b73-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 02 11:00:43 crc kubenswrapper[4901]: I0202 11:00:43.773532 4901 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7feaaf6f-027e-46c0-8030-fe3c3e358b73-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 02 11:00:43 crc kubenswrapper[4901]: I0202 11:00:43.773544 4901 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7feaaf6f-027e-46c0-8030-fe3c3e358b73-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 02 11:00:43 crc kubenswrapper[4901]: I0202 11:00:43.773556 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7feaaf6f-027e-46c0-8030-fe3c3e358b73-config\") on node \"crc\" DevicePath \"\""
Feb 02 11:00:44 crc kubenswrapper[4901]: I0202 11:00:44.016682 4901 generic.go:334] "Generic (PLEG): container finished" podID="7feaaf6f-027e-46c0-8030-fe3c3e358b73" containerID="ca379196ebb0520e751e51b4f7730ecab0b905f17801bfa874c4059f6081f5f5" exitCode=0
Feb 02 11:00:44 crc kubenswrapper[4901]: I0202 11:00:44.016755 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b75489c6f-kfhrg" event={"ID":"7feaaf6f-027e-46c0-8030-fe3c3e358b73","Type":"ContainerDied","Data":"ca379196ebb0520e751e51b4f7730ecab0b905f17801bfa874c4059f6081f5f5"}
Feb 02 11:00:44 crc kubenswrapper[4901]: I0202 11:00:44.016805 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b75489c6f-kfhrg" event={"ID":"7feaaf6f-027e-46c0-8030-fe3c3e358b73","Type":"ContainerDied","Data":"7a7fa245ef884a2f7ed50553271c0dff14f5ee1b03f15399169bf192e982ef04"}
Feb 02 11:00:44 crc kubenswrapper[4901]: I0202 11:00:44.016835 4901 scope.go:117] "RemoveContainer" containerID="ca379196ebb0520e751e51b4f7730ecab0b905f17801bfa874c4059f6081f5f5"
Feb 02 11:00:44 crc kubenswrapper[4901]: I0202 11:00:44.017111 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b75489c6f-kfhrg"
Feb 02 11:00:44 crc kubenswrapper[4901]: I0202 11:00:44.072203 4901 scope.go:117] "RemoveContainer" containerID="98ebc78c45fc3856405d7b829279ba0fa20a590315f68597dd890ccbb51d17a6"
Feb 02 11:00:44 crc kubenswrapper[4901]: I0202 11:00:44.097319 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-kfhrg"]
Feb 02 11:00:44 crc kubenswrapper[4901]: I0202 11:00:44.098116 4901 scope.go:117] "RemoveContainer" containerID="ca379196ebb0520e751e51b4f7730ecab0b905f17801bfa874c4059f6081f5f5"
Feb 02 11:00:44 crc kubenswrapper[4901]: E0202 11:00:44.098554 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca379196ebb0520e751e51b4f7730ecab0b905f17801bfa874c4059f6081f5f5\": container with ID starting with ca379196ebb0520e751e51b4f7730ecab0b905f17801bfa874c4059f6081f5f5 not found: ID does not exist" containerID="ca379196ebb0520e751e51b4f7730ecab0b905f17801bfa874c4059f6081f5f5"
Feb 02 11:00:44 crc kubenswrapper[4901]: I0202 11:00:44.098617 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca379196ebb0520e751e51b4f7730ecab0b905f17801bfa874c4059f6081f5f5"} err="failed to get container status \"ca379196ebb0520e751e51b4f7730ecab0b905f17801bfa874c4059f6081f5f5\": rpc error: code = NotFound desc = could not find container \"ca379196ebb0520e751e51b4f7730ecab0b905f17801bfa874c4059f6081f5f5\": container with ID starting with ca379196ebb0520e751e51b4f7730ecab0b905f17801bfa874c4059f6081f5f5 not found: ID does not exist"
Feb 02 11:00:44 crc kubenswrapper[4901]: I0202 11:00:44.098641 4901 scope.go:117] "RemoveContainer" containerID="98ebc78c45fc3856405d7b829279ba0fa20a590315f68597dd890ccbb51d17a6"
Feb 02 11:00:44 crc kubenswrapper[4901]: E0202 11:00:44.099201 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98ebc78c45fc3856405d7b829279ba0fa20a590315f68597dd890ccbb51d17a6\": container with ID starting with 98ebc78c45fc3856405d7b829279ba0fa20a590315f68597dd890ccbb51d17a6 not found: ID does not exist" containerID="98ebc78c45fc3856405d7b829279ba0fa20a590315f68597dd890ccbb51d17a6"
Feb 02 11:00:44 crc kubenswrapper[4901]: I0202 11:00:44.099253 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98ebc78c45fc3856405d7b829279ba0fa20a590315f68597dd890ccbb51d17a6"} err="failed to get container status \"98ebc78c45fc3856405d7b829279ba0fa20a590315f68597dd890ccbb51d17a6\": rpc error: code = NotFound desc = could not find container \"98ebc78c45fc3856405d7b829279ba0fa20a590315f68597dd890ccbb51d17a6\": container with ID starting with 98ebc78c45fc3856405d7b829279ba0fa20a590315f68597dd890ccbb51d17a6 not found: ID does not exist"
Feb 02 11:00:44 crc kubenswrapper[4901]: I0202 11:00:44.111662 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b75489c6f-kfhrg"]
Feb 02 11:00:45 crc kubenswrapper[4901]: I0202 11:00:45.689959 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7feaaf6f-027e-46c0-8030-fe3c3e358b73" path="/var/lib/kubelet/pods/7feaaf6f-027e-46c0-8030-fe3c3e358b73/volumes"
Feb 02 11:00:55 crc kubenswrapper[4901]: I0202 11:00:55.153208 4901 generic.go:334] "Generic (PLEG): container finished" podID="e36bedd2-6698-4981-b0cf-a278a9ce7258" containerID="78a1eec6c6ed07e47809e332a6ced1abf91df411b06868d1d2467ccb392b568c" exitCode=0
Feb 02 11:00:55 crc kubenswrapper[4901]: I0202 11:00:55.153292 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e36bedd2-6698-4981-b0cf-a278a9ce7258","Type":"ContainerDied","Data":"78a1eec6c6ed07e47809e332a6ced1abf91df411b06868d1d2467ccb392b568c"}
Feb 02 11:00:55 crc kubenswrapper[4901]: I0202 11:00:55.156368 4901 generic.go:334] "Generic (PLEG): container finished" podID="cacb5793-beb9-49f9-9438-9613ad472c15" containerID="0a34a502d00369275525a7a835f3c4ccfbf1c0119e9953c35e814877661c155c" exitCode=0
Feb 02 11:00:55 crc kubenswrapper[4901]: I0202 11:00:55.156400 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cacb5793-beb9-49f9-9438-9613ad472c15","Type":"ContainerDied","Data":"0a34a502d00369275525a7a835f3c4ccfbf1c0119e9953c35e814877661c155c"}
Feb 02 11:00:56 crc kubenswrapper[4901]: I0202 11:00:56.180985 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e36bedd2-6698-4981-b0cf-a278a9ce7258","Type":"ContainerStarted","Data":"6c5a4d7757ef78142d4ee502d094bc81d452fd16236bcb3d2ab2c15b12f091f9"}
Feb 02 11:00:56 crc kubenswrapper[4901]: I0202 11:00:56.183089 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Feb 02 11:00:56 crc kubenswrapper[4901]: I0202 11:00:56.193032 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cacb5793-beb9-49f9-9438-9613ad472c15","Type":"ContainerStarted","Data":"5f7b791320c51283e76f0f24d28866310079e4c3d114502b24ac955d9e72bf65"}
Feb 02 11:00:56 crc kubenswrapper[4901]: I0202 11:00:56.194007 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Feb 02 11:00:56 crc kubenswrapper[4901]: I0202 11:00:56.225483 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.225459167 podStartE2EDuration="37.225459167s" podCreationTimestamp="2026-02-02 11:00:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 11:00:56.216152258 +0000 UTC m=+1343.234492394" watchObservedRunningTime="2026-02-02 11:00:56.225459167 +0000 UTC m=+1343.243799283"
Feb 02 11:00:56 crc kubenswrapper[4901]: I0202 11:00:56.269659 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.269635971 podStartE2EDuration="38.269635971s" podCreationTimestamp="2026-02-02 11:00:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 11:00:56.255311139 +0000 UTC m=+1343.273651245" watchObservedRunningTime="2026-02-02 11:00:56.269635971 +0000 UTC m=+1343.287976067"
Feb 02 11:00:59 crc kubenswrapper[4901]: I0202 11:00:59.851241 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9z7t"]
Feb 02 11:00:59 crc kubenswrapper[4901]: E0202 11:00:59.852195 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66" containerName="init"
Feb 02 11:00:59 crc kubenswrapper[4901]: I0202 11:00:59.852209 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66" containerName="init"
Feb 02 11:00:59 crc kubenswrapper[4901]: E0202 11:00:59.852222 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7feaaf6f-027e-46c0-8030-fe3c3e358b73" containerName="init"
Feb 02 11:00:59 crc kubenswrapper[4901]: I0202 11:00:59.852228 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="7feaaf6f-027e-46c0-8030-fe3c3e358b73" containerName="init"
Feb 02 11:00:59 crc kubenswrapper[4901]: E0202 11:00:59.852239 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66" containerName="dnsmasq-dns"
Feb 02 11:00:59 crc kubenswrapper[4901]: I0202 11:00:59.852245 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66" containerName="dnsmasq-dns"
Feb 02 11:00:59 crc kubenswrapper[4901]: E0202 11:00:59.852261 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7feaaf6f-027e-46c0-8030-fe3c3e358b73" containerName="dnsmasq-dns"
Feb 02 11:00:59 crc kubenswrapper[4901]: I0202 11:00:59.852270 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="7feaaf6f-027e-46c0-8030-fe3c3e358b73" containerName="dnsmasq-dns"
Feb 02 11:00:59 crc kubenswrapper[4901]: I0202 11:00:59.852488 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="7feaaf6f-027e-46c0-8030-fe3c3e358b73" containerName="dnsmasq-dns"
Feb 02 11:00:59 crc kubenswrapper[4901]: I0202 11:00:59.852508 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b18aeb4-f6a2-4bd0-8fa0-1cba2f986c66" containerName="dnsmasq-dns"
Feb 02 11:00:59 crc kubenswrapper[4901]: I0202 11:00:59.853265 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9z7t"
Feb 02 11:00:59 crc kubenswrapper[4901]: I0202 11:00:59.860978 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 02 11:00:59 crc kubenswrapper[4901]: I0202 11:00:59.861047 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 02 11:00:59 crc kubenswrapper[4901]: I0202 11:00:59.861514 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 02 11:00:59 crc kubenswrapper[4901]: I0202 11:00:59.873626 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kblkj"
Feb 02 11:00:59 crc kubenswrapper[4901]: I0202 11:00:59.880626 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9z7t"]
Feb 02 11:00:59 crc kubenswrapper[4901]: I0202 11:00:59.938833 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec5fa81a-9fd8-4adf-886a-2874c6883bc0-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j9z7t\" (UID: \"ec5fa81a-9fd8-4adf-886a-2874c6883bc0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9z7t"
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9z7t" Feb 02 11:00:59 crc kubenswrapper[4901]: I0202 11:00:59.939081 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec5fa81a-9fd8-4adf-886a-2874c6883bc0-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j9z7t\" (UID: \"ec5fa81a-9fd8-4adf-886a-2874c6883bc0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9z7t" Feb 02 11:00:59 crc kubenswrapper[4901]: I0202 11:00:59.939105 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-572cs\" (UniqueName: \"kubernetes.io/projected/ec5fa81a-9fd8-4adf-886a-2874c6883bc0-kube-api-access-572cs\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j9z7t\" (UID: \"ec5fa81a-9fd8-4adf-886a-2874c6883bc0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9z7t" Feb 02 11:01:00 crc kubenswrapper[4901]: I0202 11:01:00.041472 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec5fa81a-9fd8-4adf-886a-2874c6883bc0-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j9z7t\" (UID: \"ec5fa81a-9fd8-4adf-886a-2874c6883bc0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9z7t" Feb 02 11:01:00 crc kubenswrapper[4901]: I0202 11:01:00.041581 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ec5fa81a-9fd8-4adf-886a-2874c6883bc0-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j9z7t\" (UID: \"ec5fa81a-9fd8-4adf-886a-2874c6883bc0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9z7t" Feb 02 11:01:00 crc kubenswrapper[4901]: I0202 11:01:00.041632 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec5fa81a-9fd8-4adf-886a-2874c6883bc0-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j9z7t\" (UID: \"ec5fa81a-9fd8-4adf-886a-2874c6883bc0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9z7t" Feb 02 11:01:00 crc kubenswrapper[4901]: I0202 11:01:00.041668 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-572cs\" (UniqueName: \"kubernetes.io/projected/ec5fa81a-9fd8-4adf-886a-2874c6883bc0-kube-api-access-572cs\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j9z7t\" (UID: \"ec5fa81a-9fd8-4adf-886a-2874c6883bc0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9z7t" Feb 02 11:01:00 crc kubenswrapper[4901]: I0202 11:01:00.048101 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec5fa81a-9fd8-4adf-886a-2874c6883bc0-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j9z7t\" (UID: \"ec5fa81a-9fd8-4adf-886a-2874c6883bc0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9z7t" Feb 02 11:01:00 crc kubenswrapper[4901]: I0202 11:01:00.051127 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec5fa81a-9fd8-4adf-886a-2874c6883bc0-inventory\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-j9z7t\" (UID: \"ec5fa81a-9fd8-4adf-886a-2874c6883bc0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9z7t" Feb 02 11:01:00 crc kubenswrapper[4901]: I0202 11:01:00.051344 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ec5fa81a-9fd8-4adf-886a-2874c6883bc0-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j9z7t\" (UID: \"ec5fa81a-9fd8-4adf-886a-2874c6883bc0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9z7t" Feb 02 11:01:00 crc kubenswrapper[4901]: I0202 11:01:00.061245 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-572cs\" (UniqueName: \"kubernetes.io/projected/ec5fa81a-9fd8-4adf-886a-2874c6883bc0-kube-api-access-572cs\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j9z7t\" (UID: \"ec5fa81a-9fd8-4adf-886a-2874c6883bc0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9z7t" Feb 02 11:01:00 crc kubenswrapper[4901]: I0202 11:01:00.145834 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29500501-cwls7"] Feb 02 11:01:00 crc kubenswrapper[4901]: I0202 11:01:00.147185 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29500501-cwls7" Feb 02 11:01:00 crc kubenswrapper[4901]: I0202 11:01:00.165551 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29500501-cwls7"] Feb 02 11:01:00 crc kubenswrapper[4901]: I0202 11:01:00.181958 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9z7t" Feb 02 11:01:00 crc kubenswrapper[4901]: I0202 11:01:00.246256 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/294b9c02-1398-4377-bb01-27ff64ba9c08-combined-ca-bundle\") pod \"keystone-cron-29500501-cwls7\" (UID: \"294b9c02-1398-4377-bb01-27ff64ba9c08\") " pod="openstack/keystone-cron-29500501-cwls7" Feb 02 11:01:00 crc kubenswrapper[4901]: I0202 11:01:00.246721 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/294b9c02-1398-4377-bb01-27ff64ba9c08-fernet-keys\") pod \"keystone-cron-29500501-cwls7\" (UID: \"294b9c02-1398-4377-bb01-27ff64ba9c08\") " pod="openstack/keystone-cron-29500501-cwls7" Feb 02 11:01:00 crc kubenswrapper[4901]: I0202 11:01:00.246791 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l89mb\" (UniqueName: \"kubernetes.io/projected/294b9c02-1398-4377-bb01-27ff64ba9c08-kube-api-access-l89mb\") pod \"keystone-cron-29500501-cwls7\" (UID: \"294b9c02-1398-4377-bb01-27ff64ba9c08\") " pod="openstack/keystone-cron-29500501-cwls7" Feb 02 11:01:00 crc kubenswrapper[4901]: I0202 11:01:00.246872 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/294b9c02-1398-4377-bb01-27ff64ba9c08-config-data\") pod \"keystone-cron-29500501-cwls7\" (UID: \"294b9c02-1398-4377-bb01-27ff64ba9c08\") " pod="openstack/keystone-cron-29500501-cwls7" Feb 02 11:01:00 crc kubenswrapper[4901]: I0202 11:01:00.348694 4901 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/294b9c02-1398-4377-bb01-27ff64ba9c08-fernet-keys\") pod \"keystone-cron-29500501-cwls7\" (UID: \"294b9c02-1398-4377-bb01-27ff64ba9c08\") " pod="openstack/keystone-cron-29500501-cwls7" Feb 02 11:01:00 crc kubenswrapper[4901]: I0202 11:01:00.348763 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l89mb\" (UniqueName: \"kubernetes.io/projected/294b9c02-1398-4377-bb01-27ff64ba9c08-kube-api-access-l89mb\") pod \"keystone-cron-29500501-cwls7\" (UID: \"294b9c02-1398-4377-bb01-27ff64ba9c08\") " pod="openstack/keystone-cron-29500501-cwls7" Feb 02 11:01:00 crc kubenswrapper[4901]: I0202 11:01:00.348842 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/294b9c02-1398-4377-bb01-27ff64ba9c08-config-data\") pod \"keystone-cron-29500501-cwls7\" (UID: \"294b9c02-1398-4377-bb01-27ff64ba9c08\") " pod="openstack/keystone-cron-29500501-cwls7" Feb 02 11:01:00 crc kubenswrapper[4901]: I0202 11:01:00.348993 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/294b9c02-1398-4377-bb01-27ff64ba9c08-combined-ca-bundle\") pod \"keystone-cron-29500501-cwls7\" (UID: \"294b9c02-1398-4377-bb01-27ff64ba9c08\") " pod="openstack/keystone-cron-29500501-cwls7" Feb 02 11:01:00 crc kubenswrapper[4901]: I0202 11:01:00.355730 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/294b9c02-1398-4377-bb01-27ff64ba9c08-combined-ca-bundle\") pod \"keystone-cron-29500501-cwls7\" (UID: \"294b9c02-1398-4377-bb01-27ff64ba9c08\") " pod="openstack/keystone-cron-29500501-cwls7" Feb 02 11:01:00 crc kubenswrapper[4901]: I0202 11:01:00.359049 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/294b9c02-1398-4377-bb01-27ff64ba9c08-fernet-keys\") pod \"keystone-cron-29500501-cwls7\" (UID: \"294b9c02-1398-4377-bb01-27ff64ba9c08\") " pod="openstack/keystone-cron-29500501-cwls7" Feb 02 11:01:00 crc kubenswrapper[4901]: I0202 11:01:00.359587 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/294b9c02-1398-4377-bb01-27ff64ba9c08-config-data\") pod \"keystone-cron-29500501-cwls7\" (UID: \"294b9c02-1398-4377-bb01-27ff64ba9c08\") " pod="openstack/keystone-cron-29500501-cwls7" Feb 02 11:01:00 crc kubenswrapper[4901]: I0202 11:01:00.367601 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l89mb\" (UniqueName: \"kubernetes.io/projected/294b9c02-1398-4377-bb01-27ff64ba9c08-kube-api-access-l89mb\") pod \"keystone-cron-29500501-cwls7\" (UID: \"294b9c02-1398-4377-bb01-27ff64ba9c08\") " pod="openstack/keystone-cron-29500501-cwls7" Feb 02 11:01:00 crc kubenswrapper[4901]: I0202 11:01:00.485899 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29500501-cwls7" Feb 02 11:01:01 crc kubenswrapper[4901]: I0202 11:01:01.231585 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9z7t"] Feb 02 11:01:01 crc kubenswrapper[4901]: I0202 11:01:01.254813 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9z7t" event={"ID":"ec5fa81a-9fd8-4adf-886a-2874c6883bc0","Type":"ContainerStarted","Data":"876722f9d048f912b6dc6b945008d4a4b338385ec217e306071984027b520968"} Feb 02 11:01:01 crc kubenswrapper[4901]: I0202 11:01:01.265911 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29500501-cwls7"] Feb 02 11:01:01 crc kubenswrapper[4901]: W0202 11:01:01.271223 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod294b9c02_1398_4377_bb01_27ff64ba9c08.slice/crio-4ee2c43d4cebf9eef1ee149741e818bc2ec6b60d4291e7e744189fc47d913f81 WatchSource:0}: Error finding container 4ee2c43d4cebf9eef1ee149741e818bc2ec6b60d4291e7e744189fc47d913f81: Status 404 returned error can't find the container with id 4ee2c43d4cebf9eef1ee149741e818bc2ec6b60d4291e7e744189fc47d913f81 Feb 02 11:01:02 crc kubenswrapper[4901]: I0202 11:01:02.266430 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29500501-cwls7" event={"ID":"294b9c02-1398-4377-bb01-27ff64ba9c08","Type":"ContainerStarted","Data":"773fd82989b043d29ea7fc3f30d05891934cfa63e12d10ecd40f30c9f840325f"} Feb 02 11:01:02 crc kubenswrapper[4901]: I0202 11:01:02.266740 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29500501-cwls7" event={"ID":"294b9c02-1398-4377-bb01-27ff64ba9c08","Type":"ContainerStarted","Data":"4ee2c43d4cebf9eef1ee149741e818bc2ec6b60d4291e7e744189fc47d913f81"} Feb 02 11:01:02 crc kubenswrapper[4901]: I0202 11:01:02.289916 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29500501-cwls7" podStartSLOduration=2.2898937569999998 podStartE2EDuration="2.289893757s" podCreationTimestamp="2026-02-02 11:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 11:01:02.280749003 +0000 UTC m=+1349.299089099" watchObservedRunningTime="2026-02-02 11:01:02.289893757 +0000 UTC m=+1349.308233853" Feb 02 11:01:04 crc kubenswrapper[4901]: I0202 11:01:04.285770 4901 generic.go:334] "Generic (PLEG): container finished" podID="294b9c02-1398-4377-bb01-27ff64ba9c08" containerID="773fd82989b043d29ea7fc3f30d05891934cfa63e12d10ecd40f30c9f840325f" exitCode=0 Feb 02 11:01:04 crc kubenswrapper[4901]: I0202 11:01:04.285839 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29500501-cwls7" event={"ID":"294b9c02-1398-4377-bb01-27ff64ba9c08","Type":"ContainerDied","Data":"773fd82989b043d29ea7fc3f30d05891934cfa63e12d10ecd40f30c9f840325f"} Feb 02 11:01:09 crc kubenswrapper[4901]: I0202 11:01:09.439848 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 02 11:01:10 crc kubenswrapper[4901]: I0202 11:01:10.074930 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:01:12 crc kubenswrapper[4901]: I0202 11:01:12.786892 4901 util.go:48] "No ready sandbox for pod can be found. 
Feb 02 11:01:12 crc kubenswrapper[4901]: I0202 11:01:12.786892 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29500501-cwls7"
Feb 02 11:01:12 crc kubenswrapper[4901]: I0202 11:01:12.964669 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/294b9c02-1398-4377-bb01-27ff64ba9c08-fernet-keys\") pod \"294b9c02-1398-4377-bb01-27ff64ba9c08\" (UID: \"294b9c02-1398-4377-bb01-27ff64ba9c08\") "
Feb 02 11:01:12 crc kubenswrapper[4901]: I0202 11:01:12.965052 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/294b9c02-1398-4377-bb01-27ff64ba9c08-combined-ca-bundle\") pod \"294b9c02-1398-4377-bb01-27ff64ba9c08\" (UID: \"294b9c02-1398-4377-bb01-27ff64ba9c08\") "
Feb 02 11:01:12 crc kubenswrapper[4901]: I0202 11:01:12.965190 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l89mb\" (UniqueName: \"kubernetes.io/projected/294b9c02-1398-4377-bb01-27ff64ba9c08-kube-api-access-l89mb\") pod \"294b9c02-1398-4377-bb01-27ff64ba9c08\" (UID: \"294b9c02-1398-4377-bb01-27ff64ba9c08\") "
Feb 02 11:01:12 crc kubenswrapper[4901]: I0202 11:01:12.965244 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/294b9c02-1398-4377-bb01-27ff64ba9c08-config-data\") pod \"294b9c02-1398-4377-bb01-27ff64ba9c08\" (UID: \"294b9c02-1398-4377-bb01-27ff64ba9c08\") "
Feb 02 11:01:12 crc kubenswrapper[4901]: I0202 11:01:12.969227 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/294b9c02-1398-4377-bb01-27ff64ba9c08-kube-api-access-l89mb" (OuterVolumeSpecName: "kube-api-access-l89mb") pod "294b9c02-1398-4377-bb01-27ff64ba9c08" (UID: "294b9c02-1398-4377-bb01-27ff64ba9c08"). InnerVolumeSpecName "kube-api-access-l89mb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 11:01:12 crc kubenswrapper[4901]: I0202 11:01:12.973188 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/294b9c02-1398-4377-bb01-27ff64ba9c08-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "294b9c02-1398-4377-bb01-27ff64ba9c08" (UID: "294b9c02-1398-4377-bb01-27ff64ba9c08"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 11:01:12 crc kubenswrapper[4901]: I0202 11:01:12.992067 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/294b9c02-1398-4377-bb01-27ff64ba9c08-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "294b9c02-1398-4377-bb01-27ff64ba9c08" (UID: "294b9c02-1398-4377-bb01-27ff64ba9c08"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 11:01:13 crc kubenswrapper[4901]: I0202 11:01:13.018608 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/294b9c02-1398-4377-bb01-27ff64ba9c08-config-data" (OuterVolumeSpecName: "config-data") pod "294b9c02-1398-4377-bb01-27ff64ba9c08" (UID: "294b9c02-1398-4377-bb01-27ff64ba9c08"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 11:01:13 crc kubenswrapper[4901]: I0202 11:01:13.067889 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/294b9c02-1398-4377-bb01-27ff64ba9c08-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 11:01:13 crc kubenswrapper[4901]: I0202 11:01:13.067920 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l89mb\" (UniqueName: \"kubernetes.io/projected/294b9c02-1398-4377-bb01-27ff64ba9c08-kube-api-access-l89mb\") on node \"crc\" DevicePath \"\""
Feb 02 11:01:13 crc kubenswrapper[4901]: I0202 11:01:13.067930 4901 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/294b9c02-1398-4377-bb01-27ff64ba9c08-config-data\") on node \"crc\" DevicePath \"\""
Feb 02 11:01:13 crc kubenswrapper[4901]: I0202 11:01:13.067938 4901 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/294b9c02-1398-4377-bb01-27ff64ba9c08-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 02 11:01:13 crc kubenswrapper[4901]: I0202 11:01:13.407452 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9z7t" event={"ID":"ec5fa81a-9fd8-4adf-886a-2874c6883bc0","Type":"ContainerStarted","Data":"3f43ef740cb98da6c199f44bedf519d7ba667cb6f4902a2236964e2d1e6e9fbf"}
Feb 02 11:01:13 crc kubenswrapper[4901]: I0202 11:01:13.412030 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29500501-cwls7" event={"ID":"294b9c02-1398-4377-bb01-27ff64ba9c08","Type":"ContainerDied","Data":"4ee2c43d4cebf9eef1ee149741e818bc2ec6b60d4291e7e744189fc47d913f81"}
Feb 02 11:01:13 crc kubenswrapper[4901]: I0202 11:01:13.412183 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ee2c43d4cebf9eef1ee149741e818bc2ec6b60d4291e7e744189fc47d913f81"
Feb 02 11:01:13 crc kubenswrapper[4901]: I0202 11:01:13.412098 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29500501-cwls7"
Feb 02 11:01:13 crc kubenswrapper[4901]: I0202 11:01:13.424779 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9z7t" podStartSLOduration=3.077562522 podStartE2EDuration="14.424755779s" podCreationTimestamp="2026-02-02 11:00:59 +0000 UTC" firstStartedPulling="2026-02-02 11:01:01.23044762 +0000 UTC m=+1348.248787716" lastFinishedPulling="2026-02-02 11:01:12.577640877 +0000 UTC m=+1359.595980973" observedRunningTime="2026-02-02 11:01:13.421315384 +0000 UTC m=+1360.439655480" watchObservedRunningTime="2026-02-02 11:01:13.424755779 +0000 UTC m=+1360.443095865"
Feb 02 11:01:24 crc kubenswrapper[4901]: I0202 11:01:24.530943 4901 generic.go:334] "Generic (PLEG): container finished" podID="ec5fa81a-9fd8-4adf-886a-2874c6883bc0" containerID="3f43ef740cb98da6c199f44bedf519d7ba667cb6f4902a2236964e2d1e6e9fbf" exitCode=0
Feb 02 11:01:24 crc kubenswrapper[4901]: I0202 11:01:24.531038 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9z7t" event={"ID":"ec5fa81a-9fd8-4adf-886a-2874c6883bc0","Type":"ContainerDied","Data":"3f43ef740cb98da6c199f44bedf519d7ba667cb6f4902a2236964e2d1e6e9fbf"}
Feb 02 11:01:26 crc kubenswrapper[4901]: I0202 11:01:26.072308 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9z7t"
Feb 02 11:01:26 crc kubenswrapper[4901]: I0202 11:01:26.175787 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ec5fa81a-9fd8-4adf-886a-2874c6883bc0-ssh-key-openstack-edpm-ipam\") pod \"ec5fa81a-9fd8-4adf-886a-2874c6883bc0\" (UID: \"ec5fa81a-9fd8-4adf-886a-2874c6883bc0\") "
Feb 02 11:01:26 crc kubenswrapper[4901]: I0202 11:01:26.176104 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec5fa81a-9fd8-4adf-886a-2874c6883bc0-inventory\") pod \"ec5fa81a-9fd8-4adf-886a-2874c6883bc0\" (UID: \"ec5fa81a-9fd8-4adf-886a-2874c6883bc0\") "
Feb 02 11:01:26 crc kubenswrapper[4901]: I0202 11:01:26.176186 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec5fa81a-9fd8-4adf-886a-2874c6883bc0-repo-setup-combined-ca-bundle\") pod \"ec5fa81a-9fd8-4adf-886a-2874c6883bc0\" (UID: \"ec5fa81a-9fd8-4adf-886a-2874c6883bc0\") "
Feb 02 11:01:26 crc kubenswrapper[4901]: I0202 11:01:26.176395 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-572cs\" (UniqueName: \"kubernetes.io/projected/ec5fa81a-9fd8-4adf-886a-2874c6883bc0-kube-api-access-572cs\") pod \"ec5fa81a-9fd8-4adf-886a-2874c6883bc0\" (UID: \"ec5fa81a-9fd8-4adf-886a-2874c6883bc0\") "
Feb 02 11:01:26 crc kubenswrapper[4901]: I0202 11:01:26.182119 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec5fa81a-9fd8-4adf-886a-2874c6883bc0-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "ec5fa81a-9fd8-4adf-886a-2874c6883bc0" (UID: "ec5fa81a-9fd8-4adf-886a-2874c6883bc0"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 11:01:26 crc kubenswrapper[4901]: I0202 11:01:26.182555 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec5fa81a-9fd8-4adf-886a-2874c6883bc0-kube-api-access-572cs" (OuterVolumeSpecName: "kube-api-access-572cs") pod "ec5fa81a-9fd8-4adf-886a-2874c6883bc0" (UID: "ec5fa81a-9fd8-4adf-886a-2874c6883bc0"). InnerVolumeSpecName "kube-api-access-572cs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 11:01:26 crc kubenswrapper[4901]: I0202 11:01:26.209940 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec5fa81a-9fd8-4adf-886a-2874c6883bc0-inventory" (OuterVolumeSpecName: "inventory") pod "ec5fa81a-9fd8-4adf-886a-2874c6883bc0" (UID: "ec5fa81a-9fd8-4adf-886a-2874c6883bc0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 11:01:26 crc kubenswrapper[4901]: I0202 11:01:26.229007 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec5fa81a-9fd8-4adf-886a-2874c6883bc0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ec5fa81a-9fd8-4adf-886a-2874c6883bc0" (UID: "ec5fa81a-9fd8-4adf-886a-2874c6883bc0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam".
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:01:26 crc kubenswrapper[4901]: I0202 11:01:26.278851 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-572cs\" (UniqueName: \"kubernetes.io/projected/ec5fa81a-9fd8-4adf-886a-2874c6883bc0-kube-api-access-572cs\") on node \"crc\" DevicePath \"\"" Feb 02 11:01:26 crc kubenswrapper[4901]: I0202 11:01:26.278905 4901 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ec5fa81a-9fd8-4adf-886a-2874c6883bc0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 02 11:01:26 crc kubenswrapper[4901]: I0202 11:01:26.278923 4901 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec5fa81a-9fd8-4adf-886a-2874c6883bc0-inventory\") on node \"crc\" DevicePath \"\"" Feb 02 11:01:26 crc kubenswrapper[4901]: I0202 11:01:26.278935 4901 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec5fa81a-9fd8-4adf-886a-2874c6883bc0-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 11:01:26 crc kubenswrapper[4901]: I0202 11:01:26.559656 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9z7t" event={"ID":"ec5fa81a-9fd8-4adf-886a-2874c6883bc0","Type":"ContainerDied","Data":"876722f9d048f912b6dc6b945008d4a4b338385ec217e306071984027b520968"} Feb 02 11:01:26 crc kubenswrapper[4901]: I0202 11:01:26.559712 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="876722f9d048f912b6dc6b945008d4a4b338385ec217e306071984027b520968" Feb 02 11:01:26 crc kubenswrapper[4901]: I0202 11:01:26.559726 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9z7t" Feb 02 11:01:26 crc kubenswrapper[4901]: I0202 11:01:26.743502 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-lpjsg"] Feb 02 11:01:26 crc kubenswrapper[4901]: E0202 11:01:26.744112 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec5fa81a-9fd8-4adf-886a-2874c6883bc0" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 02 11:01:26 crc kubenswrapper[4901]: I0202 11:01:26.744140 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec5fa81a-9fd8-4adf-886a-2874c6883bc0" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 02 11:01:26 crc kubenswrapper[4901]: E0202 11:01:26.744170 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="294b9c02-1398-4377-bb01-27ff64ba9c08" containerName="keystone-cron" Feb 02 11:01:26 crc kubenswrapper[4901]: I0202 11:01:26.744183 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="294b9c02-1398-4377-bb01-27ff64ba9c08" containerName="keystone-cron" Feb 02 11:01:26 crc kubenswrapper[4901]: I0202 11:01:26.744452 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec5fa81a-9fd8-4adf-886a-2874c6883bc0" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 02 11:01:26 crc kubenswrapper[4901]: I0202 11:01:26.744497 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="294b9c02-1398-4377-bb01-27ff64ba9c08" containerName="keystone-cron" Feb 02 11:01:26 crc kubenswrapper[4901]: I0202 11:01:26.745333 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lpjsg" Feb 02 11:01:26 crc kubenswrapper[4901]: I0202 11:01:26.748704 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 02 11:01:26 crc kubenswrapper[4901]: I0202 11:01:26.748977 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 02 11:01:26 crc kubenswrapper[4901]: I0202 11:01:26.749130 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kblkj" Feb 02 11:01:26 crc kubenswrapper[4901]: I0202 11:01:26.749254 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 02 11:01:26 crc kubenswrapper[4901]: I0202 11:01:26.760467 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-lpjsg"] Feb 02 11:01:26 crc kubenswrapper[4901]: I0202 11:01:26.898485 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab519d3b-c449-4008-b473-ad5e4e5d433e-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-lpjsg\" (UID: \"ab519d3b-c449-4008-b473-ad5e4e5d433e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lpjsg" Feb 02 11:01:26 crc kubenswrapper[4901]: I0202 11:01:26.898905 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ab519d3b-c449-4008-b473-ad5e4e5d433e-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-lpjsg\" (UID: \"ab519d3b-c449-4008-b473-ad5e4e5d433e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lpjsg" Feb 02 11:01:26 crc kubenswrapper[4901]: I0202 11:01:26.898966 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dqk7\" (UniqueName: \"kubernetes.io/projected/ab519d3b-c449-4008-b473-ad5e4e5d433e-kube-api-access-7dqk7\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-lpjsg\" (UID: \"ab519d3b-c449-4008-b473-ad5e4e5d433e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lpjsg" Feb 02 11:01:27 crc kubenswrapper[4901]: I0202 11:01:27.001467 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab519d3b-c449-4008-b473-ad5e4e5d433e-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-lpjsg\" (UID: \"ab519d3b-c449-4008-b473-ad5e4e5d433e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lpjsg" Feb 02 11:01:27 crc kubenswrapper[4901]: I0202 11:01:27.001610 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ab519d3b-c449-4008-b473-ad5e4e5d433e-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-lpjsg\" (UID: \"ab519d3b-c449-4008-b473-ad5e4e5d433e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lpjsg" Feb 02 11:01:27 crc kubenswrapper[4901]: I0202 11:01:27.001636 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dqk7\" (UniqueName: \"kubernetes.io/projected/ab519d3b-c449-4008-b473-ad5e4e5d433e-kube-api-access-7dqk7\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-lpjsg\" (UID: \"ab519d3b-c449-4008-b473-ad5e4e5d433e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lpjsg" Feb 02 11:01:27 crc kubenswrapper[4901]: I0202 11:01:27.005932 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab519d3b-c449-4008-b473-ad5e4e5d433e-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-lpjsg\" (UID: \"ab519d3b-c449-4008-b473-ad5e4e5d433e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lpjsg" Feb 02 11:01:27 crc kubenswrapper[4901]: I0202 11:01:27.006183 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ab519d3b-c449-4008-b473-ad5e4e5d433e-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-lpjsg\" (UID: \"ab519d3b-c449-4008-b473-ad5e4e5d433e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lpjsg" Feb 02 11:01:27 crc kubenswrapper[4901]: I0202 11:01:27.025529 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dqk7\" (UniqueName: \"kubernetes.io/projected/ab519d3b-c449-4008-b473-ad5e4e5d433e-kube-api-access-7dqk7\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-lpjsg\" (UID: \"ab519d3b-c449-4008-b473-ad5e4e5d433e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lpjsg" Feb 02 11:01:27 crc kubenswrapper[4901]: I0202 11:01:27.104492 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lpjsg" Feb 02 11:01:27 crc kubenswrapper[4901]: I0202 11:01:27.546866 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-lpjsg"] Feb 02 11:01:27 crc kubenswrapper[4901]: I0202 11:01:27.574116 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lpjsg" event={"ID":"ab519d3b-c449-4008-b473-ad5e4e5d433e","Type":"ContainerStarted","Data":"0df37690a2a816aefc8f8fe14edb07266a7a547d3ea834a77e764fde21b9ed0f"} Feb 02 11:01:28 crc kubenswrapper[4901]: I0202 11:01:28.587672 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lpjsg" event={"ID":"ab519d3b-c449-4008-b473-ad5e4e5d433e","Type":"ContainerStarted","Data":"c68627497e3b4088e48b6778745c7a8bcf0cd9f9fa7ae4b43bdc5ed96aeb3e8a"} Feb 02 11:01:28 crc kubenswrapper[4901]: I0202 11:01:28.615597 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lpjsg" podStartSLOduration=2.156324311 podStartE2EDuration="2.615440446s" podCreationTimestamp="2026-02-02 11:01:26 +0000 UTC" firstStartedPulling="2026-02-02 11:01:27.544489627 +0000 UTC m=+1374.562829733" lastFinishedPulling="2026-02-02 11:01:28.003605742 +0000 UTC m=+1375.021945868" observedRunningTime="2026-02-02 11:01:28.609381478 +0000 UTC m=+1375.627721614" watchObservedRunningTime="2026-02-02 11:01:28.615440446 +0000 UTC m=+1375.633780542" Feb 02 11:01:31 crc kubenswrapper[4901]: I0202 11:01:31.630644 4901 generic.go:334] "Generic (PLEG): container finished" podID="ab519d3b-c449-4008-b473-ad5e4e5d433e" containerID="c68627497e3b4088e48b6778745c7a8bcf0cd9f9fa7ae4b43bdc5ed96aeb3e8a" exitCode=0 Feb 02 11:01:31 crc kubenswrapper[4901]: I0202 11:01:31.630765 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lpjsg" event={"ID":"ab519d3b-c449-4008-b473-ad5e4e5d433e","Type":"ContainerDied","Data":"c68627497e3b4088e48b6778745c7a8bcf0cd9f9fa7ae4b43bdc5ed96aeb3e8a"} Feb 02 11:01:33 crc kubenswrapper[4901]: I0202 11:01:33.248653 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lpjsg" Feb 02 11:01:33 crc kubenswrapper[4901]: I0202 11:01:33.376121 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dqk7\" (UniqueName: \"kubernetes.io/projected/ab519d3b-c449-4008-b473-ad5e4e5d433e-kube-api-access-7dqk7\") pod \"ab519d3b-c449-4008-b473-ad5e4e5d433e\" (UID: \"ab519d3b-c449-4008-b473-ad5e4e5d433e\") " Feb 02 11:01:33 crc kubenswrapper[4901]: I0202 11:01:33.376231 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ab519d3b-c449-4008-b473-ad5e4e5d433e-ssh-key-openstack-edpm-ipam\") pod \"ab519d3b-c449-4008-b473-ad5e4e5d433e\" (UID: \"ab519d3b-c449-4008-b473-ad5e4e5d433e\") " Feb 02 11:01:33 crc kubenswrapper[4901]: I0202 11:01:33.376271 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab519d3b-c449-4008-b473-ad5e4e5d433e-inventory\") pod \"ab519d3b-c449-4008-b473-ad5e4e5d433e\" (UID: \"ab519d3b-c449-4008-b473-ad5e4e5d433e\") " Feb 02 11:01:33 crc kubenswrapper[4901]: I0202 11:01:33.383521 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab519d3b-c449-4008-b473-ad5e4e5d433e-kube-api-access-7dqk7" (OuterVolumeSpecName: "kube-api-access-7dqk7") pod "ab519d3b-c449-4008-b473-ad5e4e5d433e" (UID: "ab519d3b-c449-4008-b473-ad5e4e5d433e"). InnerVolumeSpecName "kube-api-access-7dqk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:01:33 crc kubenswrapper[4901]: I0202 11:01:33.406251 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab519d3b-c449-4008-b473-ad5e4e5d433e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ab519d3b-c449-4008-b473-ad5e4e5d433e" (UID: "ab519d3b-c449-4008-b473-ad5e4e5d433e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:01:33 crc kubenswrapper[4901]: I0202 11:01:33.407935 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab519d3b-c449-4008-b473-ad5e4e5d433e-inventory" (OuterVolumeSpecName: "inventory") pod "ab519d3b-c449-4008-b473-ad5e4e5d433e" (UID: "ab519d3b-c449-4008-b473-ad5e4e5d433e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:01:33 crc kubenswrapper[4901]: I0202 11:01:33.478311 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dqk7\" (UniqueName: \"kubernetes.io/projected/ab519d3b-c449-4008-b473-ad5e4e5d433e-kube-api-access-7dqk7\") on node \"crc\" DevicePath \"\"" Feb 02 11:01:33 crc kubenswrapper[4901]: I0202 11:01:33.478355 4901 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ab519d3b-c449-4008-b473-ad5e4e5d433e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 02 11:01:33 crc kubenswrapper[4901]: I0202 11:01:33.478369 4901 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab519d3b-c449-4008-b473-ad5e4e5d433e-inventory\") on node \"crc\" DevicePath \"\"" Feb 02 11:01:33 crc kubenswrapper[4901]: I0202 11:01:33.661044 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lpjsg" event={"ID":"ab519d3b-c449-4008-b473-ad5e4e5d433e","Type":"ContainerDied","Data":"0df37690a2a816aefc8f8fe14edb07266a7a547d3ea834a77e764fde21b9ed0f"} Feb 02 11:01:33 crc kubenswrapper[4901]: I0202 11:01:33.661435 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0df37690a2a816aefc8f8fe14edb07266a7a547d3ea834a77e764fde21b9ed0f" Feb 02 11:01:33 crc kubenswrapper[4901]: I0202 11:01:33.661326 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-lpjsg" Feb 02 11:01:33 crc kubenswrapper[4901]: I0202 11:01:33.753956 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pjsjw"] Feb 02 11:01:33 crc kubenswrapper[4901]: E0202 11:01:33.754481 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab519d3b-c449-4008-b473-ad5e4e5d433e" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 02 11:01:33 crc kubenswrapper[4901]: I0202 11:01:33.754506 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab519d3b-c449-4008-b473-ad5e4e5d433e" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 02 11:01:33 crc kubenswrapper[4901]: I0202 11:01:33.754744 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab519d3b-c449-4008-b473-ad5e4e5d433e" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 02 11:01:33 crc kubenswrapper[4901]: I0202 11:01:33.755547 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pjsjw" Feb 02 11:01:33 crc kubenswrapper[4901]: I0202 11:01:33.760313 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kblkj" Feb 02 11:01:33 crc kubenswrapper[4901]: I0202 11:01:33.760395 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 02 11:01:33 crc kubenswrapper[4901]: I0202 11:01:33.760319 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 02 11:01:33 crc kubenswrapper[4901]: I0202 11:01:33.761126 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 02 11:01:33 crc kubenswrapper[4901]: I0202 11:01:33.769653 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pjsjw"] Feb 02 11:01:33 crc kubenswrapper[4901]: I0202 11:01:33.888954 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/665b7c24-97eb-482c-9a0b-1492cfa2d84d-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pjsjw\" (UID: \"665b7c24-97eb-482c-9a0b-1492cfa2d84d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pjsjw" Feb 02 11:01:33 crc kubenswrapper[4901]: I0202 11:01:33.889354 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8qqz\" (UniqueName: \"kubernetes.io/projected/665b7c24-97eb-482c-9a0b-1492cfa2d84d-kube-api-access-x8qqz\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pjsjw\" (UID: \"665b7c24-97eb-482c-9a0b-1492cfa2d84d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pjsjw" Feb 02 11:01:33 crc kubenswrapper[4901]: I0202 11:01:33.889899 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/665b7c24-97eb-482c-9a0b-1492cfa2d84d-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pjsjw\" (UID: \"665b7c24-97eb-482c-9a0b-1492cfa2d84d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pjsjw" Feb 02 11:01:33 crc kubenswrapper[4901]: I0202 11:01:33.890116 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/665b7c24-97eb-482c-9a0b-1492cfa2d84d-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pjsjw\" (UID: \"665b7c24-97eb-482c-9a0b-1492cfa2d84d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pjsjw" Feb 02 11:01:33 crc kubenswrapper[4901]: I0202 11:01:33.992821 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/665b7c24-97eb-482c-9a0b-1492cfa2d84d-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pjsjw\" (UID: \"665b7c24-97eb-482c-9a0b-1492cfa2d84d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pjsjw" Feb 02 11:01:33 crc kubenswrapper[4901]: I0202 11:01:33.992945 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8qqz\" (UniqueName: 
\"kubernetes.io/projected/665b7c24-97eb-482c-9a0b-1492cfa2d84d-kube-api-access-x8qqz\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pjsjw\" (UID: \"665b7c24-97eb-482c-9a0b-1492cfa2d84d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pjsjw" Feb 02 11:01:33 crc kubenswrapper[4901]: I0202 11:01:33.993054 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/665b7c24-97eb-482c-9a0b-1492cfa2d84d-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pjsjw\" (UID: \"665b7c24-97eb-482c-9a0b-1492cfa2d84d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pjsjw" Feb 02 11:01:33 crc kubenswrapper[4901]: I0202 11:01:33.993109 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/665b7c24-97eb-482c-9a0b-1492cfa2d84d-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pjsjw\" (UID: \"665b7c24-97eb-482c-9a0b-1492cfa2d84d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pjsjw" Feb 02 11:01:34 crc kubenswrapper[4901]: I0202 11:01:34.005944 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/665b7c24-97eb-482c-9a0b-1492cfa2d84d-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pjsjw\" (UID: \"665b7c24-97eb-482c-9a0b-1492cfa2d84d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pjsjw" Feb 02 11:01:34 crc kubenswrapper[4901]: I0202 11:01:34.010208 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/665b7c24-97eb-482c-9a0b-1492cfa2d84d-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pjsjw\" (UID: \"665b7c24-97eb-482c-9a0b-1492cfa2d84d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pjsjw" Feb 02 11:01:34 crc kubenswrapper[4901]: I0202 11:01:34.012270 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/665b7c24-97eb-482c-9a0b-1492cfa2d84d-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pjsjw\" (UID: \"665b7c24-97eb-482c-9a0b-1492cfa2d84d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pjsjw" Feb 02 11:01:34 crc kubenswrapper[4901]: I0202 11:01:34.022434 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8qqz\" (UniqueName: \"kubernetes.io/projected/665b7c24-97eb-482c-9a0b-1492cfa2d84d-kube-api-access-x8qqz\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pjsjw\" (UID: \"665b7c24-97eb-482c-9a0b-1492cfa2d84d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pjsjw" Feb 02 11:01:34 crc kubenswrapper[4901]: I0202 11:01:34.074876 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pjsjw" Feb 02 11:01:34 crc kubenswrapper[4901]: I0202 11:01:34.666011 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pjsjw"] Feb 02 11:01:35 crc kubenswrapper[4901]: I0202 11:01:35.690887 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pjsjw" event={"ID":"665b7c24-97eb-482c-9a0b-1492cfa2d84d","Type":"ContainerStarted","Data":"b65b72aacb2d6ba4e66701a50e10ff33cbc5a7e13eca0fbd2f40380b00011fdf"} Feb 02 11:01:35 crc kubenswrapper[4901]: I0202 11:01:35.691429 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pjsjw" event={"ID":"665b7c24-97eb-482c-9a0b-1492cfa2d84d","Type":"ContainerStarted","Data":"a9d0622f8555563f8cf8533ada61bd26a5d78d21b203f6a184243dd23411d1eb"} Feb 02 11:01:35 crc kubenswrapper[4901]: I0202 11:01:35.719220 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pjsjw" podStartSLOduration=2.21140898 podStartE2EDuration="2.719192349s" podCreationTimestamp="2026-02-02 11:01:33 +0000 UTC" firstStartedPulling="2026-02-02 11:01:34.66982351 +0000 UTC m=+1381.688163606" lastFinishedPulling="2026-02-02 11:01:35.177606879 +0000 UTC m=+1382.195946975" observedRunningTime="2026-02-02 11:01:35.715099848 +0000 UTC m=+1382.733439944" watchObservedRunningTime="2026-02-02 11:01:35.719192349 +0000 UTC m=+1382.737532465" Feb 02 11:01:40 crc kubenswrapper[4901]: I0202 11:01:40.028427 4901 scope.go:117] "RemoveContainer" containerID="eed61737ceb58cd57445bc8e10c8269d9bfc0454083e2f168c847199f7ed3f04" Feb 02 11:01:40 crc kubenswrapper[4901]: I0202 11:01:40.082214 4901 scope.go:117] "RemoveContainer" containerID="138826e28de0fe50677b0c7ee04b318274f4a421844ac049bd374b80c829a1ae" Feb 02 11:01:40 crc kubenswrapper[4901]: I0202 11:01:40.121877 4901 scope.go:117] "RemoveContainer" containerID="28a838a1a7c1be9d154c5fe909e3e315eec15bada39c52b38f07fc6bf2145944" Feb 02 11:02:07 crc kubenswrapper[4901]: I0202 11:02:07.838031 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:02:07 crc kubenswrapper[4901]: I0202 11:02:07.838707 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:02:37 crc kubenswrapper[4901]: I0202 11:02:37.838105 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:02:37 crc kubenswrapper[4901]: I0202 11:02:37.838852 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:02:40 crc kubenswrapper[4901]: I0202 11:02:40.305743 4901 scope.go:117] "RemoveContainer" containerID="c370f81bb014213fa8f0ca4747e7a27481b3e911d5436628a3ac6d65de5cab15" Feb 02 11:02:40 crc kubenswrapper[4901]: I0202 11:02:40.331953 4901 scope.go:117] "RemoveContainer" containerID="c719e4f31aa449f35965a9891a647d8f02ad348c38bfb40c74f4ae6985502644" Feb 02 11:02:40 crc kubenswrapper[4901]: I0202 11:02:40.372740 4901 scope.go:117] "RemoveContainer" containerID="a49c3f0e054068f2a9fb809af959494d324bc38d409d0642f47ec06aafdac2ad" Feb 02 11:03:07 crc kubenswrapper[4901]: I0202 11:03:07.837007 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:03:07 crc kubenswrapper[4901]: I0202 11:03:07.838688 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:03:07 crc kubenswrapper[4901]: I0202 11:03:07.838791 4901 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" Feb 02 11:03:07 crc kubenswrapper[4901]: I0202 11:03:07.839778 4901 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d12b3f34b51be08e2afea2c530360f371fc81fc3d568189c03f49b623affb0ff"} pod="openshift-machine-config-operator/machine-config-daemon-f29d8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 11:03:07 crc kubenswrapper[4901]: I0202 11:03:07.839870 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" containerID="cri-o://d12b3f34b51be08e2afea2c530360f371fc81fc3d568189c03f49b623affb0ff" gracePeriod=600 Feb 02 11:03:08 crc kubenswrapper[4901]: I0202 11:03:08.745964 4901 generic.go:334] "Generic (PLEG): container finished" podID="756c113d-5d5e-424e-bdf5-494b7774def6" containerID="d12b3f34b51be08e2afea2c530360f371fc81fc3d568189c03f49b623affb0ff" exitCode=0 Feb 02 11:03:08 crc kubenswrapper[4901]: I0202 11:03:08.746052 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" event={"ID":"756c113d-5d5e-424e-bdf5-494b7774def6","Type":"ContainerDied","Data":"d12b3f34b51be08e2afea2c530360f371fc81fc3d568189c03f49b623affb0ff"} Feb 02 11:03:08 crc kubenswrapper[4901]: I0202 11:03:08.746642 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" event={"ID":"756c113d-5d5e-424e-bdf5-494b7774def6","Type":"ContainerStarted","Data":"ffa74db446f03ef95c563afefd62346074442a8b53dd231baa8c9ad890a3defd"} Feb 02 11:03:08 crc kubenswrapper[4901]: I0202 11:03:08.746691 4901 scope.go:117] "RemoveContainer" containerID="f91b0b29e1b9fe682b552667293a0ed2a00f9d11128e3bc32e71c40f28a9f231" Feb 02 11:03:40 crc kubenswrapper[4901]: 
I0202 11:03:40.456307 4901 scope.go:117] "RemoveContainer" containerID="31145ddf5c2b8a55b22db930db9ae275fd40ebe87f6da75bfe738943ad0660a7" Feb 02 11:03:40 crc kubenswrapper[4901]: I0202 11:03:40.489066 4901 scope.go:117] "RemoveContainer" containerID="007aa571a93db60a7baf10afed2a0b4aba84468020f47a78b334dba3f481bcaa" Feb 02 11:03:40 crc kubenswrapper[4901]: I0202 11:03:40.517104 4901 scope.go:117] "RemoveContainer" containerID="137cd9c17ee4ac57c1c302e50c86191b1f409a6540089c72fce7948326623312" Feb 02 11:03:40 crc kubenswrapper[4901]: I0202 11:03:40.535458 4901 scope.go:117] "RemoveContainer" containerID="755476eee288b322d93ba27f8c554fff2456a8dc59e911a29943ca0ee4d6f28c" Feb 02 11:03:57 crc kubenswrapper[4901]: I0202 11:03:57.689432 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bt95d"] Feb 02 11:03:57 crc kubenswrapper[4901]: I0202 11:03:57.693209 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bt95d" Feb 02 11:03:57 crc kubenswrapper[4901]: I0202 11:03:57.695180 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bt95d"] Feb 02 11:03:57 crc kubenswrapper[4901]: I0202 11:03:57.843821 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwq2c\" (UniqueName: \"kubernetes.io/projected/7ad9b4a8-c941-474a-8ae1-66d8095d052e-kube-api-access-wwq2c\") pod \"community-operators-bt95d\" (UID: \"7ad9b4a8-c941-474a-8ae1-66d8095d052e\") " pod="openshift-marketplace/community-operators-bt95d" Feb 02 11:03:57 crc kubenswrapper[4901]: I0202 11:03:57.844283 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ad9b4a8-c941-474a-8ae1-66d8095d052e-catalog-content\") pod \"community-operators-bt95d\" (UID: \"7ad9b4a8-c941-474a-8ae1-66d8095d052e\") " pod="openshift-marketplace/community-operators-bt95d" Feb 02 11:03:57 crc kubenswrapper[4901]: I0202 11:03:57.844332 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ad9b4a8-c941-474a-8ae1-66d8095d052e-utilities\") pod \"community-operators-bt95d\" (UID: \"7ad9b4a8-c941-474a-8ae1-66d8095d052e\") " pod="openshift-marketplace/community-operators-bt95d" Feb 02 11:03:57 crc kubenswrapper[4901]: I0202 11:03:57.946765 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ad9b4a8-c941-474a-8ae1-66d8095d052e-utilities\") pod \"community-operators-bt95d\" (UID: \"7ad9b4a8-c941-474a-8ae1-66d8095d052e\") " pod="openshift-marketplace/community-operators-bt95d" Feb 02 11:03:57 crc kubenswrapper[4901]: I0202 11:03:57.946861 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwq2c\" (UniqueName: \"kubernetes.io/projected/7ad9b4a8-c941-474a-8ae1-66d8095d052e-kube-api-access-wwq2c\") pod \"community-operators-bt95d\" (UID: \"7ad9b4a8-c941-474a-8ae1-66d8095d052e\") " pod="openshift-marketplace/community-operators-bt95d" Feb 02 11:03:57 crc kubenswrapper[4901]: I0202 11:03:57.946985 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ad9b4a8-c941-474a-8ae1-66d8095d052e-catalog-content\") pod \"community-operators-bt95d\" (UID: 
\"7ad9b4a8-c941-474a-8ae1-66d8095d052e\") " pod="openshift-marketplace/community-operators-bt95d" Feb 02 11:03:57 crc kubenswrapper[4901]: I0202 11:03:57.947317 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ad9b4a8-c941-474a-8ae1-66d8095d052e-utilities\") pod \"community-operators-bt95d\" (UID: \"7ad9b4a8-c941-474a-8ae1-66d8095d052e\") " pod="openshift-marketplace/community-operators-bt95d" Feb 02 11:03:57 crc kubenswrapper[4901]: I0202 11:03:57.947414 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ad9b4a8-c941-474a-8ae1-66d8095d052e-catalog-content\") pod \"community-operators-bt95d\" (UID: \"7ad9b4a8-c941-474a-8ae1-66d8095d052e\") " pod="openshift-marketplace/community-operators-bt95d" Feb 02 11:03:57 crc kubenswrapper[4901]: I0202 11:03:57.969946 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwq2c\" (UniqueName: \"kubernetes.io/projected/7ad9b4a8-c941-474a-8ae1-66d8095d052e-kube-api-access-wwq2c\") pod \"community-operators-bt95d\" (UID: \"7ad9b4a8-c941-474a-8ae1-66d8095d052e\") " pod="openshift-marketplace/community-operators-bt95d" Feb 02 11:03:58 crc kubenswrapper[4901]: I0202 11:03:58.021797 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bt95d" Feb 02 11:03:58 crc kubenswrapper[4901]: I0202 11:03:58.582623 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bt95d"] Feb 02 11:03:59 crc kubenswrapper[4901]: I0202 11:03:59.355439 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bt95d" event={"ID":"7ad9b4a8-c941-474a-8ae1-66d8095d052e","Type":"ContainerDied","Data":"2129fb44d78dcf889589404bb9623eabac73e381913770d68adbb0305faa47a6"} Feb 02 11:03:59 crc kubenswrapper[4901]: I0202 11:03:59.355321 4901 generic.go:334] "Generic (PLEG): container finished" podID="7ad9b4a8-c941-474a-8ae1-66d8095d052e" containerID="2129fb44d78dcf889589404bb9623eabac73e381913770d68adbb0305faa47a6" exitCode=0 Feb 02 11:03:59 crc kubenswrapper[4901]: I0202 11:03:59.356043 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bt95d" event={"ID":"7ad9b4a8-c941-474a-8ae1-66d8095d052e","Type":"ContainerStarted","Data":"ceb4647867da0a7c3952c5fecc915199061547f701d63bb4813206bdf907def3"} Feb 02 11:04:00 crc kubenswrapper[4901]: I0202 11:04:00.368596 4901 generic.go:334] "Generic (PLEG): container finished" podID="7ad9b4a8-c941-474a-8ae1-66d8095d052e" containerID="1486c09c5429afbf22519334993cdfdd2f3eda3229bdbdb606e1e1ca63a53738" exitCode=0 Feb 02 11:04:00 crc kubenswrapper[4901]: I0202 11:04:00.368697 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bt95d" event={"ID":"7ad9b4a8-c941-474a-8ae1-66d8095d052e","Type":"ContainerDied","Data":"1486c09c5429afbf22519334993cdfdd2f3eda3229bdbdb606e1e1ca63a53738"} Feb 02 11:04:01 crc kubenswrapper[4901]: I0202 11:04:01.381394 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bt95d" event={"ID":"7ad9b4a8-c941-474a-8ae1-66d8095d052e","Type":"ContainerStarted","Data":"bfcaa78a70ac105bc3293fd83bb87a95f3bc84e02b724ece455658cd04ac2b4e"} Feb 02 11:04:01 crc kubenswrapper[4901]: I0202 11:04:01.402333 4901 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-marketplace/community-operators-bt95d" podStartSLOduration=2.96079604 podStartE2EDuration="4.402311117s" podCreationTimestamp="2026-02-02 11:03:57 +0000 UTC" firstStartedPulling="2026-02-02 11:03:59.357896626 +0000 UTC m=+1526.376236722" lastFinishedPulling="2026-02-02 11:04:00.799411693 +0000 UTC m=+1527.817751799" observedRunningTime="2026-02-02 11:04:01.40041369 +0000 UTC m=+1528.418753796" watchObservedRunningTime="2026-02-02 11:04:01.402311117 +0000 UTC m=+1528.420651213" Feb 02 11:04:08 crc kubenswrapper[4901]: I0202 11:04:08.022110 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bt95d" Feb 02 11:04:08 crc kubenswrapper[4901]: I0202 11:04:08.023069 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bt95d" Feb 02 11:04:08 crc kubenswrapper[4901]: I0202 11:04:08.104051 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bt95d" Feb 02 11:04:08 crc kubenswrapper[4901]: I0202 11:04:08.786628 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bt95d" Feb 02 11:04:08 crc kubenswrapper[4901]: I0202 11:04:08.872118 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bt95d"] Feb 02 11:04:10 crc kubenswrapper[4901]: I0202 11:04:10.735311 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bt95d" podUID="7ad9b4a8-c941-474a-8ae1-66d8095d052e" containerName="registry-server" containerID="cri-o://bfcaa78a70ac105bc3293fd83bb87a95f3bc84e02b724ece455658cd04ac2b4e" gracePeriod=2 Feb 02 11:04:11 crc kubenswrapper[4901]: I0202 11:04:11.273886 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bt95d" Feb 02 11:04:11 crc kubenswrapper[4901]: I0202 11:04:11.365558 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ad9b4a8-c941-474a-8ae1-66d8095d052e-utilities\") pod \"7ad9b4a8-c941-474a-8ae1-66d8095d052e\" (UID: \"7ad9b4a8-c941-474a-8ae1-66d8095d052e\") " Feb 02 11:04:11 crc kubenswrapper[4901]: I0202 11:04:11.365840 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ad9b4a8-c941-474a-8ae1-66d8095d052e-catalog-content\") pod \"7ad9b4a8-c941-474a-8ae1-66d8095d052e\" (UID: \"7ad9b4a8-c941-474a-8ae1-66d8095d052e\") " Feb 02 11:04:11 crc kubenswrapper[4901]: I0202 11:04:11.366097 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwq2c\" (UniqueName: \"kubernetes.io/projected/7ad9b4a8-c941-474a-8ae1-66d8095d052e-kube-api-access-wwq2c\") pod \"7ad9b4a8-c941-474a-8ae1-66d8095d052e\" (UID: \"7ad9b4a8-c941-474a-8ae1-66d8095d052e\") " Feb 02 11:04:11 crc kubenswrapper[4901]: I0202 11:04:11.366914 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ad9b4a8-c941-474a-8ae1-66d8095d052e-utilities" (OuterVolumeSpecName: "utilities") pod "7ad9b4a8-c941-474a-8ae1-66d8095d052e" (UID: "7ad9b4a8-c941-474a-8ae1-66d8095d052e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:04:11 crc kubenswrapper[4901]: I0202 11:04:11.367877 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ad9b4a8-c941-474a-8ae1-66d8095d052e-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:04:11 crc kubenswrapper[4901]: I0202 11:04:11.375808 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ad9b4a8-c941-474a-8ae1-66d8095d052e-kube-api-access-wwq2c" (OuterVolumeSpecName: "kube-api-access-wwq2c") pod "7ad9b4a8-c941-474a-8ae1-66d8095d052e" (UID: "7ad9b4a8-c941-474a-8ae1-66d8095d052e"). InnerVolumeSpecName "kube-api-access-wwq2c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:04:11 crc kubenswrapper[4901]: I0202 11:04:11.421795 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ad9b4a8-c941-474a-8ae1-66d8095d052e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7ad9b4a8-c941-474a-8ae1-66d8095d052e" (UID: "7ad9b4a8-c941-474a-8ae1-66d8095d052e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:04:11 crc kubenswrapper[4901]: I0202 11:04:11.470608 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ad9b4a8-c941-474a-8ae1-66d8095d052e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:04:11 crc kubenswrapper[4901]: I0202 11:04:11.470659 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwq2c\" (UniqueName: \"kubernetes.io/projected/7ad9b4a8-c941-474a-8ae1-66d8095d052e-kube-api-access-wwq2c\") on node \"crc\" DevicePath \"\"" Feb 02 11:04:11 crc kubenswrapper[4901]: I0202 11:04:11.754326 4901 generic.go:334] "Generic (PLEG): container finished" podID="7ad9b4a8-c941-474a-8ae1-66d8095d052e" containerID="bfcaa78a70ac105bc3293fd83bb87a95f3bc84e02b724ece455658cd04ac2b4e" exitCode=0 Feb 02 11:04:11 crc kubenswrapper[4901]: I0202 11:04:11.754406 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bt95d" event={"ID":"7ad9b4a8-c941-474a-8ae1-66d8095d052e","Type":"ContainerDied","Data":"bfcaa78a70ac105bc3293fd83bb87a95f3bc84e02b724ece455658cd04ac2b4e"} Feb 02 11:04:11 crc kubenswrapper[4901]: I0202 11:04:11.754449 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bt95d" event={"ID":"7ad9b4a8-c941-474a-8ae1-66d8095d052e","Type":"ContainerDied","Data":"ceb4647867da0a7c3952c5fecc915199061547f701d63bb4813206bdf907def3"} Feb 02 11:04:11 crc kubenswrapper[4901]: I0202 11:04:11.754482 4901 scope.go:117] "RemoveContainer" containerID="bfcaa78a70ac105bc3293fd83bb87a95f3bc84e02b724ece455658cd04ac2b4e" Feb 02 11:04:11 crc kubenswrapper[4901]: I0202 11:04:11.754737 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bt95d" Feb 02 11:04:11 crc kubenswrapper[4901]: I0202 11:04:11.797552 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bt95d"] Feb 02 11:04:11 crc kubenswrapper[4901]: I0202 11:04:11.803195 4901 scope.go:117] "RemoveContainer" containerID="1486c09c5429afbf22519334993cdfdd2f3eda3229bdbdb606e1e1ca63a53738" Feb 02 11:04:11 crc kubenswrapper[4901]: I0202 11:04:11.809345 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bt95d"] Feb 02 11:04:11 crc kubenswrapper[4901]: I0202 11:04:11.854727 4901 scope.go:117] "RemoveContainer" containerID="2129fb44d78dcf889589404bb9623eabac73e381913770d68adbb0305faa47a6" Feb 02 11:04:11 crc kubenswrapper[4901]: I0202 11:04:11.892602 4901 scope.go:117] "RemoveContainer" containerID="bfcaa78a70ac105bc3293fd83bb87a95f3bc84e02b724ece455658cd04ac2b4e" Feb 02 11:04:11 crc kubenswrapper[4901]: E0202 11:04:11.893424 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfcaa78a70ac105bc3293fd83bb87a95f3bc84e02b724ece455658cd04ac2b4e\": container with ID starting with bfcaa78a70ac105bc3293fd83bb87a95f3bc84e02b724ece455658cd04ac2b4e not found: ID does not exist" containerID="bfcaa78a70ac105bc3293fd83bb87a95f3bc84e02b724ece455658cd04ac2b4e" Feb 02 11:04:11 crc kubenswrapper[4901]: I0202 11:04:11.893509 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfcaa78a70ac105bc3293fd83bb87a95f3bc84e02b724ece455658cd04ac2b4e"} err="failed to get container status \"bfcaa78a70ac105bc3293fd83bb87a95f3bc84e02b724ece455658cd04ac2b4e\": rpc error: code = NotFound desc = could not find container \"bfcaa78a70ac105bc3293fd83bb87a95f3bc84e02b724ece455658cd04ac2b4e\": container with ID starting with bfcaa78a70ac105bc3293fd83bb87a95f3bc84e02b724ece455658cd04ac2b4e not found: ID does not exist" Feb 02 11:04:11 crc kubenswrapper[4901]: I0202 11:04:11.893555 4901 scope.go:117] "RemoveContainer" containerID="1486c09c5429afbf22519334993cdfdd2f3eda3229bdbdb606e1e1ca63a53738" Feb 02 11:04:11 crc kubenswrapper[4901]: E0202 11:04:11.894242 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1486c09c5429afbf22519334993cdfdd2f3eda3229bdbdb606e1e1ca63a53738\": container with ID starting with 1486c09c5429afbf22519334993cdfdd2f3eda3229bdbdb606e1e1ca63a53738 not found: ID does not exist" containerID="1486c09c5429afbf22519334993cdfdd2f3eda3229bdbdb606e1e1ca63a53738" Feb 02 11:04:11 crc kubenswrapper[4901]: I0202 11:04:11.894308 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1486c09c5429afbf22519334993cdfdd2f3eda3229bdbdb606e1e1ca63a53738"} err="failed to get container status \"1486c09c5429afbf22519334993cdfdd2f3eda3229bdbdb606e1e1ca63a53738\": rpc error: code = NotFound desc = could not find container \"1486c09c5429afbf22519334993cdfdd2f3eda3229bdbdb606e1e1ca63a53738\": container with ID starting with 1486c09c5429afbf22519334993cdfdd2f3eda3229bdbdb606e1e1ca63a53738 not found: ID does not exist" Feb 02 11:04:11 crc kubenswrapper[4901]: I0202 11:04:11.894360 4901 scope.go:117] "RemoveContainer" containerID="2129fb44d78dcf889589404bb9623eabac73e381913770d68adbb0305faa47a6" Feb 02 11:04:11 crc kubenswrapper[4901]: E0202 11:04:11.894779 4901 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"2129fb44d78dcf889589404bb9623eabac73e381913770d68adbb0305faa47a6\": container with ID starting with 2129fb44d78dcf889589404bb9623eabac73e381913770d68adbb0305faa47a6 not found: ID does not exist" containerID="2129fb44d78dcf889589404bb9623eabac73e381913770d68adbb0305faa47a6" Feb 02 11:04:11 crc kubenswrapper[4901]: I0202 11:04:11.894813 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2129fb44d78dcf889589404bb9623eabac73e381913770d68adbb0305faa47a6"} err="failed to get container status \"2129fb44d78dcf889589404bb9623eabac73e381913770d68adbb0305faa47a6\": rpc error: code = NotFound desc = could not find container \"2129fb44d78dcf889589404bb9623eabac73e381913770d68adbb0305faa47a6\": container with ID starting with 2129fb44d78dcf889589404bb9623eabac73e381913770d68adbb0305faa47a6 not found: ID does not exist" Feb 02 11:04:13 crc kubenswrapper[4901]: I0202 11:04:13.693241 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ad9b4a8-c941-474a-8ae1-66d8095d052e" path="/var/lib/kubelet/pods/7ad9b4a8-c941-474a-8ae1-66d8095d052e/volumes" Feb 02 11:04:20 crc kubenswrapper[4901]: I0202 11:04:20.443988 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-j27tp"] Feb 02 11:04:20 crc kubenswrapper[4901]: E0202 11:04:20.444989 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ad9b4a8-c941-474a-8ae1-66d8095d052e" containerName="registry-server" Feb 02 11:04:20 crc kubenswrapper[4901]: I0202 11:04:20.445004 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ad9b4a8-c941-474a-8ae1-66d8095d052e" containerName="registry-server" Feb 02 11:04:20 crc kubenswrapper[4901]: E0202 11:04:20.445024 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ad9b4a8-c941-474a-8ae1-66d8095d052e" containerName="extract-utilities" Feb 02 11:04:20 crc kubenswrapper[4901]: I0202 11:04:20.445032 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ad9b4a8-c941-474a-8ae1-66d8095d052e" containerName="extract-utilities" Feb 02 11:04:20 crc kubenswrapper[4901]: E0202 11:04:20.445061 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ad9b4a8-c941-474a-8ae1-66d8095d052e" containerName="extract-content" Feb 02 11:04:20 crc kubenswrapper[4901]: I0202 11:04:20.445069 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ad9b4a8-c941-474a-8ae1-66d8095d052e" containerName="extract-content" Feb 02 11:04:20 crc kubenswrapper[4901]: I0202 11:04:20.445282 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ad9b4a8-c941-474a-8ae1-66d8095d052e" containerName="registry-server" Feb 02 11:04:20 crc kubenswrapper[4901]: I0202 11:04:20.446835 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j27tp" Feb 02 11:04:20 crc kubenswrapper[4901]: I0202 11:04:20.478411 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j27tp"] Feb 02 11:04:20 crc kubenswrapper[4901]: I0202 11:04:20.517675 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4239df5-fba8-49f9-901b-3e746af65e38-catalog-content\") pod \"certified-operators-j27tp\" (UID: \"a4239df5-fba8-49f9-901b-3e746af65e38\") " pod="openshift-marketplace/certified-operators-j27tp" Feb 02 11:04:20 crc kubenswrapper[4901]: I0202 11:04:20.517759 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4239df5-fba8-49f9-901b-3e746af65e38-utilities\") pod \"certified-operators-j27tp\" (UID: \"a4239df5-fba8-49f9-901b-3e746af65e38\") " pod="openshift-marketplace/certified-operators-j27tp" Feb 02 11:04:20 crc kubenswrapper[4901]: I0202 11:04:20.517792 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfh9f\" (UniqueName: \"kubernetes.io/projected/a4239df5-fba8-49f9-901b-3e746af65e38-kube-api-access-xfh9f\") pod \"certified-operators-j27tp\" (UID: \"a4239df5-fba8-49f9-901b-3e746af65e38\") " pod="openshift-marketplace/certified-operators-j27tp" Feb 02 11:04:20 crc kubenswrapper[4901]: I0202 11:04:20.620326 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4239df5-fba8-49f9-901b-3e746af65e38-catalog-content\") pod \"certified-operators-j27tp\" (UID: \"a4239df5-fba8-49f9-901b-3e746af65e38\") " pod="openshift-marketplace/certified-operators-j27tp" Feb 02 11:04:20 crc kubenswrapper[4901]: I0202 11:04:20.620446 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4239df5-fba8-49f9-901b-3e746af65e38-utilities\") pod \"certified-operators-j27tp\" (UID: \"a4239df5-fba8-49f9-901b-3e746af65e38\") " pod="openshift-marketplace/certified-operators-j27tp" Feb 02 11:04:20 crc kubenswrapper[4901]: I0202 11:04:20.620495 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfh9f\" (UniqueName: \"kubernetes.io/projected/a4239df5-fba8-49f9-901b-3e746af65e38-kube-api-access-xfh9f\") pod \"certified-operators-j27tp\" (UID: \"a4239df5-fba8-49f9-901b-3e746af65e38\") " pod="openshift-marketplace/certified-operators-j27tp" Feb 02 11:04:20 crc kubenswrapper[4901]: I0202 11:04:20.621001 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4239df5-fba8-49f9-901b-3e746af65e38-catalog-content\") pod \"certified-operators-j27tp\" (UID: \"a4239df5-fba8-49f9-901b-3e746af65e38\") " pod="openshift-marketplace/certified-operators-j27tp" Feb 02 11:04:20 crc kubenswrapper[4901]: I0202 11:04:20.621023 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4239df5-fba8-49f9-901b-3e746af65e38-utilities\") pod \"certified-operators-j27tp\" (UID: \"a4239df5-fba8-49f9-901b-3e746af65e38\") " pod="openshift-marketplace/certified-operators-j27tp" Feb 02 11:04:20 crc kubenswrapper[4901]: I0202 11:04:20.665855 4901 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xfh9f\" (UniqueName: \"kubernetes.io/projected/a4239df5-fba8-49f9-901b-3e746af65e38-kube-api-access-xfh9f\") pod \"certified-operators-j27tp\" (UID: \"a4239df5-fba8-49f9-901b-3e746af65e38\") " pod="openshift-marketplace/certified-operators-j27tp" Feb 02 11:04:20 crc kubenswrapper[4901]: I0202 11:04:20.797414 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j27tp" Feb 02 11:04:21 crc kubenswrapper[4901]: I0202 11:04:21.138978 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j27tp"] Feb 02 11:04:21 crc kubenswrapper[4901]: I0202 11:04:21.894029 4901 generic.go:334] "Generic (PLEG): container finished" podID="a4239df5-fba8-49f9-901b-3e746af65e38" containerID="c314af489a6e780fa0e5bee3c3505f25cab22807c4372949af74c7b78bf2a708" exitCode=0 Feb 02 11:04:21 crc kubenswrapper[4901]: I0202 11:04:21.894089 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j27tp" event={"ID":"a4239df5-fba8-49f9-901b-3e746af65e38","Type":"ContainerDied","Data":"c314af489a6e780fa0e5bee3c3505f25cab22807c4372949af74c7b78bf2a708"} Feb 02 11:04:21 crc kubenswrapper[4901]: I0202 11:04:21.894124 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j27tp" event={"ID":"a4239df5-fba8-49f9-901b-3e746af65e38","Type":"ContainerStarted","Data":"ca8f0231f8b3c7ba47e919faf6bfc4e529c560c580d47cf8ec498e3b5132f7cd"} Feb 02 11:04:21 crc kubenswrapper[4901]: I0202 11:04:21.897645 4901 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 11:04:22 crc kubenswrapper[4901]: I0202 11:04:22.904879 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j27tp" event={"ID":"a4239df5-fba8-49f9-901b-3e746af65e38","Type":"ContainerStarted","Data":"946eb488e9fd005153cd47c1e4b1bf85106e4fec46c2a909515bce9ebab6f481"} Feb 02 11:04:23 crc kubenswrapper[4901]: I0202 11:04:23.917734 4901 generic.go:334] "Generic (PLEG): container finished" podID="a4239df5-fba8-49f9-901b-3e746af65e38" containerID="946eb488e9fd005153cd47c1e4b1bf85106e4fec46c2a909515bce9ebab6f481" exitCode=0 Feb 02 11:04:23 crc kubenswrapper[4901]: I0202 11:04:23.917969 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j27tp" event={"ID":"a4239df5-fba8-49f9-901b-3e746af65e38","Type":"ContainerDied","Data":"946eb488e9fd005153cd47c1e4b1bf85106e4fec46c2a909515bce9ebab6f481"} Feb 02 11:04:24 crc kubenswrapper[4901]: I0202 11:04:24.931197 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j27tp" event={"ID":"a4239df5-fba8-49f9-901b-3e746af65e38","Type":"ContainerStarted","Data":"e760597e70d1731760663d4477ea519e79a5dd92a97cb99223c53a8c40276312"} Feb 02 11:04:24 crc kubenswrapper[4901]: I0202 11:04:24.964175 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-j27tp" podStartSLOduration=2.503492196 podStartE2EDuration="4.964150643s" podCreationTimestamp="2026-02-02 11:04:20 +0000 UTC" firstStartedPulling="2026-02-02 11:04:21.89722462 +0000 UTC m=+1548.915564716" lastFinishedPulling="2026-02-02 11:04:24.357883047 +0000 UTC m=+1551.376223163" observedRunningTime="2026-02-02 11:04:24.955293215 +0000 UTC m=+1551.973633351" watchObservedRunningTime="2026-02-02 
11:04:24.964150643 +0000 UTC m=+1551.982490749" Feb 02 11:04:30 crc kubenswrapper[4901]: I0202 11:04:30.798341 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-j27tp" Feb 02 11:04:30 crc kubenswrapper[4901]: I0202 11:04:30.799424 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-j27tp" Feb 02 11:04:30 crc kubenswrapper[4901]: I0202 11:04:30.873243 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-j27tp" Feb 02 11:04:31 crc kubenswrapper[4901]: I0202 11:04:31.042496 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-j27tp" Feb 02 11:04:31 crc kubenswrapper[4901]: I0202 11:04:31.163080 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j27tp"] Feb 02 11:04:33 crc kubenswrapper[4901]: I0202 11:04:33.014269 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-j27tp" podUID="a4239df5-fba8-49f9-901b-3e746af65e38" containerName="registry-server" containerID="cri-o://e760597e70d1731760663d4477ea519e79a5dd92a97cb99223c53a8c40276312" gracePeriod=2 Feb 02 11:04:33 crc kubenswrapper[4901]: I0202 11:04:33.500892 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j27tp" Feb 02 11:04:33 crc kubenswrapper[4901]: I0202 11:04:33.654683 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4239df5-fba8-49f9-901b-3e746af65e38-catalog-content\") pod \"a4239df5-fba8-49f9-901b-3e746af65e38\" (UID: \"a4239df5-fba8-49f9-901b-3e746af65e38\") " Feb 02 11:04:33 crc kubenswrapper[4901]: I0202 11:04:33.654745 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfh9f\" (UniqueName: \"kubernetes.io/projected/a4239df5-fba8-49f9-901b-3e746af65e38-kube-api-access-xfh9f\") pod \"a4239df5-fba8-49f9-901b-3e746af65e38\" (UID: \"a4239df5-fba8-49f9-901b-3e746af65e38\") " Feb 02 11:04:33 crc kubenswrapper[4901]: I0202 11:04:33.655023 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4239df5-fba8-49f9-901b-3e746af65e38-utilities\") pod \"a4239df5-fba8-49f9-901b-3e746af65e38\" (UID: \"a4239df5-fba8-49f9-901b-3e746af65e38\") " Feb 02 11:04:33 crc kubenswrapper[4901]: I0202 11:04:33.656624 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4239df5-fba8-49f9-901b-3e746af65e38-utilities" (OuterVolumeSpecName: "utilities") pod "a4239df5-fba8-49f9-901b-3e746af65e38" (UID: "a4239df5-fba8-49f9-901b-3e746af65e38"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:04:33 crc kubenswrapper[4901]: I0202 11:04:33.660786 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4239df5-fba8-49f9-901b-3e746af65e38-kube-api-access-xfh9f" (OuterVolumeSpecName: "kube-api-access-xfh9f") pod "a4239df5-fba8-49f9-901b-3e746af65e38" (UID: "a4239df5-fba8-49f9-901b-3e746af65e38"). InnerVolumeSpecName "kube-api-access-xfh9f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:04:33 crc kubenswrapper[4901]: I0202 11:04:33.706120 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4239df5-fba8-49f9-901b-3e746af65e38-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a4239df5-fba8-49f9-901b-3e746af65e38" (UID: "a4239df5-fba8-49f9-901b-3e746af65e38"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:04:33 crc kubenswrapper[4901]: I0202 11:04:33.757301 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4239df5-fba8-49f9-901b-3e746af65e38-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:04:33 crc kubenswrapper[4901]: I0202 11:04:33.757336 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfh9f\" (UniqueName: \"kubernetes.io/projected/a4239df5-fba8-49f9-901b-3e746af65e38-kube-api-access-xfh9f\") on node \"crc\" DevicePath \"\"" Feb 02 11:04:33 crc kubenswrapper[4901]: I0202 11:04:33.757347 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4239df5-fba8-49f9-901b-3e746af65e38-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:04:34 crc kubenswrapper[4901]: I0202 11:04:34.026204 4901 generic.go:334] "Generic (PLEG): container finished" podID="a4239df5-fba8-49f9-901b-3e746af65e38" containerID="e760597e70d1731760663d4477ea519e79a5dd92a97cb99223c53a8c40276312" exitCode=0 Feb 02 11:04:34 crc kubenswrapper[4901]: I0202 11:04:34.026272 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j27tp" Feb 02 11:04:34 crc kubenswrapper[4901]: I0202 11:04:34.026295 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j27tp" event={"ID":"a4239df5-fba8-49f9-901b-3e746af65e38","Type":"ContainerDied","Data":"e760597e70d1731760663d4477ea519e79a5dd92a97cb99223c53a8c40276312"} Feb 02 11:04:34 crc kubenswrapper[4901]: I0202 11:04:34.028188 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j27tp" event={"ID":"a4239df5-fba8-49f9-901b-3e746af65e38","Type":"ContainerDied","Data":"ca8f0231f8b3c7ba47e919faf6bfc4e529c560c580d47cf8ec498e3b5132f7cd"} Feb 02 11:04:34 crc kubenswrapper[4901]: I0202 11:04:34.028210 4901 scope.go:117] "RemoveContainer" containerID="e760597e70d1731760663d4477ea519e79a5dd92a97cb99223c53a8c40276312" Feb 02 11:04:34 crc kubenswrapper[4901]: I0202 11:04:34.072053 4901 scope.go:117] "RemoveContainer" containerID="946eb488e9fd005153cd47c1e4b1bf85106e4fec46c2a909515bce9ebab6f481" Feb 02 11:04:34 crc kubenswrapper[4901]: I0202 11:04:34.081341 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j27tp"] Feb 02 11:04:34 crc kubenswrapper[4901]: I0202 11:04:34.092891 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-j27tp"] Feb 02 11:04:34 crc kubenswrapper[4901]: I0202 11:04:34.104497 4901 scope.go:117] "RemoveContainer" containerID="c314af489a6e780fa0e5bee3c3505f25cab22807c4372949af74c7b78bf2a708" Feb 02 11:04:34 crc kubenswrapper[4901]: I0202 11:04:34.162030 4901 scope.go:117] "RemoveContainer" containerID="e760597e70d1731760663d4477ea519e79a5dd92a97cb99223c53a8c40276312" Feb 02 11:04:34 crc kubenswrapper[4901]: E0202 11:04:34.162398 4901 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e760597e70d1731760663d4477ea519e79a5dd92a97cb99223c53a8c40276312\": container with ID starting with e760597e70d1731760663d4477ea519e79a5dd92a97cb99223c53a8c40276312 not found: ID does not exist" containerID="e760597e70d1731760663d4477ea519e79a5dd92a97cb99223c53a8c40276312" Feb 02 11:04:34 crc kubenswrapper[4901]: I0202 11:04:34.162441 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e760597e70d1731760663d4477ea519e79a5dd92a97cb99223c53a8c40276312"} err="failed to get container status \"e760597e70d1731760663d4477ea519e79a5dd92a97cb99223c53a8c40276312\": rpc error: code = NotFound desc = could not find container \"e760597e70d1731760663d4477ea519e79a5dd92a97cb99223c53a8c40276312\": container with ID starting with e760597e70d1731760663d4477ea519e79a5dd92a97cb99223c53a8c40276312 not found: ID does not exist" Feb 02 11:04:34 crc kubenswrapper[4901]: I0202 11:04:34.162470 4901 scope.go:117] "RemoveContainer" containerID="946eb488e9fd005153cd47c1e4b1bf85106e4fec46c2a909515bce9ebab6f481" Feb 02 11:04:34 crc kubenswrapper[4901]: E0202 11:04:34.162870 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"946eb488e9fd005153cd47c1e4b1bf85106e4fec46c2a909515bce9ebab6f481\": container with ID starting with 946eb488e9fd005153cd47c1e4b1bf85106e4fec46c2a909515bce9ebab6f481 not found: ID does not exist" containerID="946eb488e9fd005153cd47c1e4b1bf85106e4fec46c2a909515bce9ebab6f481" Feb 02 11:04:34 crc kubenswrapper[4901]: I0202 11:04:34.162904 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"946eb488e9fd005153cd47c1e4b1bf85106e4fec46c2a909515bce9ebab6f481"} err="failed to get container status \"946eb488e9fd005153cd47c1e4b1bf85106e4fec46c2a909515bce9ebab6f481\": rpc error: code = NotFound desc = could not find container \"946eb488e9fd005153cd47c1e4b1bf85106e4fec46c2a909515bce9ebab6f481\": container with ID starting with 946eb488e9fd005153cd47c1e4b1bf85106e4fec46c2a909515bce9ebab6f481 not found: ID does not exist" Feb 02 11:04:34 crc kubenswrapper[4901]: I0202 11:04:34.162924 4901 scope.go:117] "RemoveContainer" containerID="c314af489a6e780fa0e5bee3c3505f25cab22807c4372949af74c7b78bf2a708" Feb 02 11:04:34 crc kubenswrapper[4901]: E0202 11:04:34.163193 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c314af489a6e780fa0e5bee3c3505f25cab22807c4372949af74c7b78bf2a708\": container with ID starting with c314af489a6e780fa0e5bee3c3505f25cab22807c4372949af74c7b78bf2a708 not found: ID does not exist" containerID="c314af489a6e780fa0e5bee3c3505f25cab22807c4372949af74c7b78bf2a708" Feb 02 11:04:34 crc kubenswrapper[4901]: I0202 11:04:34.163213 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c314af489a6e780fa0e5bee3c3505f25cab22807c4372949af74c7b78bf2a708"} err="failed to get container status \"c314af489a6e780fa0e5bee3c3505f25cab22807c4372949af74c7b78bf2a708\": rpc error: code = NotFound desc = could not find container \"c314af489a6e780fa0e5bee3c3505f25cab22807c4372949af74c7b78bf2a708\": container with ID starting with c314af489a6e780fa0e5bee3c3505f25cab22807c4372949af74c7b78bf2a708 not found: ID does not exist" Feb 02 11:04:35 crc kubenswrapper[4901]: I0202 11:04:35.693548 4901 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="a4239df5-fba8-49f9-901b-3e746af65e38" path="/var/lib/kubelet/pods/a4239df5-fba8-49f9-901b-3e746af65e38/volumes" Feb 02 11:04:40 crc kubenswrapper[4901]: I0202 11:04:40.095337 4901 generic.go:334] "Generic (PLEG): container finished" podID="665b7c24-97eb-482c-9a0b-1492cfa2d84d" containerID="b65b72aacb2d6ba4e66701a50e10ff33cbc5a7e13eca0fbd2f40380b00011fdf" exitCode=0 Feb 02 11:04:40 crc kubenswrapper[4901]: I0202 11:04:40.095476 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pjsjw" event={"ID":"665b7c24-97eb-482c-9a0b-1492cfa2d84d","Type":"ContainerDied","Data":"b65b72aacb2d6ba4e66701a50e10ff33cbc5a7e13eca0fbd2f40380b00011fdf"} Feb 02 11:04:40 crc kubenswrapper[4901]: I0202 11:04:40.612688 4901 scope.go:117] "RemoveContainer" containerID="05912d87d9bd02f1cc666a84703c2c5e0534df3ce47a1a7c4fdcccab3e79f089" Feb 02 11:04:41 crc kubenswrapper[4901]: I0202 11:04:41.600411 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pjsjw" Feb 02 11:04:41 crc kubenswrapper[4901]: I0202 11:04:41.770309 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/665b7c24-97eb-482c-9a0b-1492cfa2d84d-inventory\") pod \"665b7c24-97eb-482c-9a0b-1492cfa2d84d\" (UID: \"665b7c24-97eb-482c-9a0b-1492cfa2d84d\") " Feb 02 11:04:41 crc kubenswrapper[4901]: I0202 11:04:41.770734 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/665b7c24-97eb-482c-9a0b-1492cfa2d84d-ssh-key-openstack-edpm-ipam\") pod \"665b7c24-97eb-482c-9a0b-1492cfa2d84d\" (UID: \"665b7c24-97eb-482c-9a0b-1492cfa2d84d\") " Feb 02 11:04:41 crc kubenswrapper[4901]: I0202 11:04:41.770828 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8qqz\" (UniqueName: \"kubernetes.io/projected/665b7c24-97eb-482c-9a0b-1492cfa2d84d-kube-api-access-x8qqz\") pod \"665b7c24-97eb-482c-9a0b-1492cfa2d84d\" (UID: \"665b7c24-97eb-482c-9a0b-1492cfa2d84d\") " Feb 02 11:04:41 crc kubenswrapper[4901]: I0202 11:04:41.770885 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/665b7c24-97eb-482c-9a0b-1492cfa2d84d-bootstrap-combined-ca-bundle\") pod \"665b7c24-97eb-482c-9a0b-1492cfa2d84d\" (UID: \"665b7c24-97eb-482c-9a0b-1492cfa2d84d\") " Feb 02 11:04:41 crc kubenswrapper[4901]: I0202 11:04:41.777964 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/665b7c24-97eb-482c-9a0b-1492cfa2d84d-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "665b7c24-97eb-482c-9a0b-1492cfa2d84d" (UID: "665b7c24-97eb-482c-9a0b-1492cfa2d84d"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:04:41 crc kubenswrapper[4901]: I0202 11:04:41.778640 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/665b7c24-97eb-482c-9a0b-1492cfa2d84d-kube-api-access-x8qqz" (OuterVolumeSpecName: "kube-api-access-x8qqz") pod "665b7c24-97eb-482c-9a0b-1492cfa2d84d" (UID: "665b7c24-97eb-482c-9a0b-1492cfa2d84d"). InnerVolumeSpecName "kube-api-access-x8qqz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:04:41 crc kubenswrapper[4901]: I0202 11:04:41.807847 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/665b7c24-97eb-482c-9a0b-1492cfa2d84d-inventory" (OuterVolumeSpecName: "inventory") pod "665b7c24-97eb-482c-9a0b-1492cfa2d84d" (UID: "665b7c24-97eb-482c-9a0b-1492cfa2d84d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:04:41 crc kubenswrapper[4901]: I0202 11:04:41.810294 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/665b7c24-97eb-482c-9a0b-1492cfa2d84d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "665b7c24-97eb-482c-9a0b-1492cfa2d84d" (UID: "665b7c24-97eb-482c-9a0b-1492cfa2d84d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:04:41 crc kubenswrapper[4901]: I0202 11:04:41.873419 4901 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/665b7c24-97eb-482c-9a0b-1492cfa2d84d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 02 11:04:41 crc kubenswrapper[4901]: I0202 11:04:41.873462 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8qqz\" (UniqueName: \"kubernetes.io/projected/665b7c24-97eb-482c-9a0b-1492cfa2d84d-kube-api-access-x8qqz\") on node \"crc\" DevicePath \"\"" Feb 02 11:04:41 crc kubenswrapper[4901]: I0202 11:04:41.873475 4901 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/665b7c24-97eb-482c-9a0b-1492cfa2d84d-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 11:04:41 crc kubenswrapper[4901]: I0202 11:04:41.873487 4901 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/665b7c24-97eb-482c-9a0b-1492cfa2d84d-inventory\") on node \"crc\" DevicePath \"\"" Feb 02 11:04:42 crc kubenswrapper[4901]: I0202 11:04:42.122956 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pjsjw" event={"ID":"665b7c24-97eb-482c-9a0b-1492cfa2d84d","Type":"ContainerDied","Data":"a9d0622f8555563f8cf8533ada61bd26a5d78d21b203f6a184243dd23411d1eb"} Feb 02 11:04:42 crc kubenswrapper[4901]: I0202 11:04:42.123019 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9d0622f8555563f8cf8533ada61bd26a5d78d21b203f6a184243dd23411d1eb" Feb 02 11:04:42 crc kubenswrapper[4901]: I0202 11:04:42.123060 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pjsjw" Feb 02 11:04:42 crc kubenswrapper[4901]: I0202 11:04:42.237240 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bqz4b"] Feb 02 11:04:42 crc kubenswrapper[4901]: E0202 11:04:42.237763 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4239df5-fba8-49f9-901b-3e746af65e38" containerName="registry-server" Feb 02 11:04:42 crc kubenswrapper[4901]: I0202 11:04:42.237788 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4239df5-fba8-49f9-901b-3e746af65e38" containerName="registry-server" Feb 02 11:04:42 crc kubenswrapper[4901]: E0202 11:04:42.237829 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4239df5-fba8-49f9-901b-3e746af65e38" containerName="extract-content" Feb 02 11:04:42 crc kubenswrapper[4901]: I0202 11:04:42.237839 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4239df5-fba8-49f9-901b-3e746af65e38" containerName="extract-content" Feb 02 11:04:42 crc kubenswrapper[4901]: E0202 11:04:42.237866 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4239df5-fba8-49f9-901b-3e746af65e38" containerName="extract-utilities" Feb 02 11:04:42 crc kubenswrapper[4901]: I0202 11:04:42.237874 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4239df5-fba8-49f9-901b-3e746af65e38" containerName="extract-utilities" Feb 02 11:04:42 crc kubenswrapper[4901]: E0202 11:04:42.237888 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="665b7c24-97eb-482c-9a0b-1492cfa2d84d" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 02 11:04:42 crc kubenswrapper[4901]: I0202 11:04:42.237897 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="665b7c24-97eb-482c-9a0b-1492cfa2d84d" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 02 11:04:42 crc kubenswrapper[4901]: I0202 11:04:42.238126 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="665b7c24-97eb-482c-9a0b-1492cfa2d84d" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 02 11:04:42 crc kubenswrapper[4901]: I0202 11:04:42.238157 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4239df5-fba8-49f9-901b-3e746af65e38" containerName="registry-server" Feb 02 11:04:42 crc kubenswrapper[4901]: I0202 11:04:42.238995 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bqz4b" Feb 02 11:04:42 crc kubenswrapper[4901]: I0202 11:04:42.243650 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 02 11:04:42 crc kubenswrapper[4901]: I0202 11:04:42.243715 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 02 11:04:42 crc kubenswrapper[4901]: I0202 11:04:42.243799 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 02 11:04:42 crc kubenswrapper[4901]: I0202 11:04:42.243832 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kblkj" Feb 02 11:04:42 crc kubenswrapper[4901]: I0202 11:04:42.254209 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bqz4b"] Feb 02 11:04:42 crc kubenswrapper[4901]: I0202 11:04:42.386493 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzl74\" (UniqueName: \"kubernetes.io/projected/7e499a01-9c9c-44ca-a1d2-e35912fba103-kube-api-access-nzl74\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bqz4b\" (UID: \"7e499a01-9c9c-44ca-a1d2-e35912fba103\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bqz4b" Feb 02 11:04:42 crc kubenswrapper[4901]: I0202 11:04:42.386635 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e499a01-9c9c-44ca-a1d2-e35912fba103-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bqz4b\" (UID: \"7e499a01-9c9c-44ca-a1d2-e35912fba103\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bqz4b" Feb 02 11:04:42 crc kubenswrapper[4901]: I0202 11:04:42.387212 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e499a01-9c9c-44ca-a1d2-e35912fba103-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bqz4b\" (UID: \"7e499a01-9c9c-44ca-a1d2-e35912fba103\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bqz4b" Feb 02 11:04:42 crc kubenswrapper[4901]: I0202 11:04:42.489633 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e499a01-9c9c-44ca-a1d2-e35912fba103-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bqz4b\" (UID: \"7e499a01-9c9c-44ca-a1d2-e35912fba103\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bqz4b" Feb 02 11:04:42 crc kubenswrapper[4901]: I0202 11:04:42.489711 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzl74\" (UniqueName: \"kubernetes.io/projected/7e499a01-9c9c-44ca-a1d2-e35912fba103-kube-api-access-nzl74\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bqz4b\" (UID: \"7e499a01-9c9c-44ca-a1d2-e35912fba103\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bqz4b" Feb 02 11:04:42 crc kubenswrapper[4901]: I0202 11:04:42.489833 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/7e499a01-9c9c-44ca-a1d2-e35912fba103-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bqz4b\" (UID: \"7e499a01-9c9c-44ca-a1d2-e35912fba103\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bqz4b" Feb 02 11:04:42 crc kubenswrapper[4901]: I0202 11:04:42.495659 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e499a01-9c9c-44ca-a1d2-e35912fba103-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bqz4b\" (UID: \"7e499a01-9c9c-44ca-a1d2-e35912fba103\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bqz4b" Feb 02 11:04:42 crc kubenswrapper[4901]: I0202 11:04:42.495935 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e499a01-9c9c-44ca-a1d2-e35912fba103-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bqz4b\" (UID: \"7e499a01-9c9c-44ca-a1d2-e35912fba103\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bqz4b" Feb 02 11:04:42 crc kubenswrapper[4901]: I0202 11:04:42.519733 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzl74\" (UniqueName: \"kubernetes.io/projected/7e499a01-9c9c-44ca-a1d2-e35912fba103-kube-api-access-nzl74\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bqz4b\" (UID: \"7e499a01-9c9c-44ca-a1d2-e35912fba103\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bqz4b" Feb 02 11:04:42 crc kubenswrapper[4901]: I0202 11:04:42.561550 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bqz4b" Feb 02 11:04:43 crc kubenswrapper[4901]: I0202 11:04:43.175302 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bqz4b"] Feb 02 11:04:43 crc kubenswrapper[4901]: W0202 11:04:43.183942 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e499a01_9c9c_44ca_a1d2_e35912fba103.slice/crio-5ab058a88c6a8eb2792c06dd303e10fc1ccac82835b10dbd6ba837c85ece18ec WatchSource:0}: Error finding container 5ab058a88c6a8eb2792c06dd303e10fc1ccac82835b10dbd6ba837c85ece18ec: Status 404 returned error can't find the container with id 5ab058a88c6a8eb2792c06dd303e10fc1ccac82835b10dbd6ba837c85ece18ec Feb 02 11:04:44 crc kubenswrapper[4901]: I0202 11:04:44.150508 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bqz4b" event={"ID":"7e499a01-9c9c-44ca-a1d2-e35912fba103","Type":"ContainerStarted","Data":"61583392abc18254623a9540feac9d6111c1df32c8a1bb4059cd2d341ad7c760"} Feb 02 11:04:44 crc kubenswrapper[4901]: I0202 11:04:44.150999 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bqz4b" event={"ID":"7e499a01-9c9c-44ca-a1d2-e35912fba103","Type":"ContainerStarted","Data":"5ab058a88c6a8eb2792c06dd303e10fc1ccac82835b10dbd6ba837c85ece18ec"} Feb 02 11:04:44 crc kubenswrapper[4901]: I0202 11:04:44.182632 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bqz4b" podStartSLOduration=1.6474900670000001 podStartE2EDuration="2.18260181s" podCreationTimestamp="2026-02-02 11:04:42 
+0000 UTC" firstStartedPulling="2026-02-02 11:04:43.187627951 +0000 UTC m=+1570.205968047" lastFinishedPulling="2026-02-02 11:04:43.722739674 +0000 UTC m=+1570.741079790" observedRunningTime="2026-02-02 11:04:44.171921041 +0000 UTC m=+1571.190261137" watchObservedRunningTime="2026-02-02 11:04:44.18260181 +0000 UTC m=+1571.200941906" Feb 02 11:05:01 crc kubenswrapper[4901]: I0202 11:05:01.290928 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-q7rrr"] Feb 02 11:05:01 crc kubenswrapper[4901]: I0202 11:05:01.295776 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q7rrr" Feb 02 11:05:01 crc kubenswrapper[4901]: I0202 11:05:01.349436 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q7rrr"] Feb 02 11:05:01 crc kubenswrapper[4901]: I0202 11:05:01.405464 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88a28582-bf50-4a99-9590-c25370869096-catalog-content\") pod \"redhat-operators-q7rrr\" (UID: \"88a28582-bf50-4a99-9590-c25370869096\") " pod="openshift-marketplace/redhat-operators-q7rrr" Feb 02 11:05:01 crc kubenswrapper[4901]: I0202 11:05:01.405986 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2h7x\" (UniqueName: \"kubernetes.io/projected/88a28582-bf50-4a99-9590-c25370869096-kube-api-access-x2h7x\") pod \"redhat-operators-q7rrr\" (UID: \"88a28582-bf50-4a99-9590-c25370869096\") " pod="openshift-marketplace/redhat-operators-q7rrr" Feb 02 11:05:01 crc kubenswrapper[4901]: I0202 11:05:01.406144 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88a28582-bf50-4a99-9590-c25370869096-utilities\") pod \"redhat-operators-q7rrr\" (UID: \"88a28582-bf50-4a99-9590-c25370869096\") " pod="openshift-marketplace/redhat-operators-q7rrr" Feb 02 11:05:01 crc kubenswrapper[4901]: I0202 11:05:01.508175 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2h7x\" (UniqueName: \"kubernetes.io/projected/88a28582-bf50-4a99-9590-c25370869096-kube-api-access-x2h7x\") pod \"redhat-operators-q7rrr\" (UID: \"88a28582-bf50-4a99-9590-c25370869096\") " pod="openshift-marketplace/redhat-operators-q7rrr" Feb 02 11:05:01 crc kubenswrapper[4901]: I0202 11:05:01.508579 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88a28582-bf50-4a99-9590-c25370869096-utilities\") pod \"redhat-operators-q7rrr\" (UID: \"88a28582-bf50-4a99-9590-c25370869096\") " pod="openshift-marketplace/redhat-operators-q7rrr" Feb 02 11:05:01 crc kubenswrapper[4901]: I0202 11:05:01.508733 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88a28582-bf50-4a99-9590-c25370869096-catalog-content\") pod \"redhat-operators-q7rrr\" (UID: \"88a28582-bf50-4a99-9590-c25370869096\") " pod="openshift-marketplace/redhat-operators-q7rrr" Feb 02 11:05:01 crc kubenswrapper[4901]: I0202 11:05:01.509207 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88a28582-bf50-4a99-9590-c25370869096-utilities\") pod \"redhat-operators-q7rrr\" (UID: 
\"88a28582-bf50-4a99-9590-c25370869096\") " pod="openshift-marketplace/redhat-operators-q7rrr" Feb 02 11:05:01 crc kubenswrapper[4901]: I0202 11:05:01.509289 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88a28582-bf50-4a99-9590-c25370869096-catalog-content\") pod \"redhat-operators-q7rrr\" (UID: \"88a28582-bf50-4a99-9590-c25370869096\") " pod="openshift-marketplace/redhat-operators-q7rrr" Feb 02 11:05:01 crc kubenswrapper[4901]: I0202 11:05:01.540314 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2h7x\" (UniqueName: \"kubernetes.io/projected/88a28582-bf50-4a99-9590-c25370869096-kube-api-access-x2h7x\") pod \"redhat-operators-q7rrr\" (UID: \"88a28582-bf50-4a99-9590-c25370869096\") " pod="openshift-marketplace/redhat-operators-q7rrr" Feb 02 11:05:01 crc kubenswrapper[4901]: I0202 11:05:01.661144 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q7rrr" Feb 02 11:05:02 crc kubenswrapper[4901]: I0202 11:05:02.182058 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q7rrr"] Feb 02 11:05:02 crc kubenswrapper[4901]: W0202 11:05:02.195517 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88a28582_bf50_4a99_9590_c25370869096.slice/crio-c70fde43949b4627d014c92b53ad82ab9935b65c66dd794ea4c34f5654810816 WatchSource:0}: Error finding container c70fde43949b4627d014c92b53ad82ab9935b65c66dd794ea4c34f5654810816: Status 404 returned error can't find the container with id c70fde43949b4627d014c92b53ad82ab9935b65c66dd794ea4c34f5654810816 Feb 02 11:05:02 crc kubenswrapper[4901]: I0202 11:05:02.362487 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q7rrr" event={"ID":"88a28582-bf50-4a99-9590-c25370869096","Type":"ContainerStarted","Data":"c70fde43949b4627d014c92b53ad82ab9935b65c66dd794ea4c34f5654810816"} Feb 02 11:05:03 crc kubenswrapper[4901]: I0202 11:05:03.376456 4901 generic.go:334] "Generic (PLEG): container finished" podID="88a28582-bf50-4a99-9590-c25370869096" containerID="18040c47ff85b204a81f4a07d2ec836f47c09327a595db2bcdb610a7201f0e4a" exitCode=0 Feb 02 11:05:03 crc kubenswrapper[4901]: I0202 11:05:03.376536 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q7rrr" event={"ID":"88a28582-bf50-4a99-9590-c25370869096","Type":"ContainerDied","Data":"18040c47ff85b204a81f4a07d2ec836f47c09327a595db2bcdb610a7201f0e4a"} Feb 02 11:05:05 crc kubenswrapper[4901]: I0202 11:05:05.409341 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q7rrr" event={"ID":"88a28582-bf50-4a99-9590-c25370869096","Type":"ContainerStarted","Data":"ef70c704cab2da5e478b2dbbc1af290854edafba2af31a368ec6d785d0dca141"} Feb 02 11:05:05 crc kubenswrapper[4901]: E0202 11:05:05.695491 4901 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88a28582_bf50_4a99_9590_c25370869096.slice/crio-conmon-ef70c704cab2da5e478b2dbbc1af290854edafba2af31a368ec6d785d0dca141.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88a28582_bf50_4a99_9590_c25370869096.slice/crio-ef70c704cab2da5e478b2dbbc1af290854edafba2af31a368ec6d785d0dca141.scope\": RecentStats: unable to find data in memory cache]" Feb 02 11:05:06 crc kubenswrapper[4901]: I0202 11:05:06.425174 4901 generic.go:334] "Generic (PLEG): container finished" podID="88a28582-bf50-4a99-9590-c25370869096" containerID="ef70c704cab2da5e478b2dbbc1af290854edafba2af31a368ec6d785d0dca141" exitCode=0 Feb 02 11:05:06 crc kubenswrapper[4901]: I0202 11:05:06.425267 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q7rrr" event={"ID":"88a28582-bf50-4a99-9590-c25370869096","Type":"ContainerDied","Data":"ef70c704cab2da5e478b2dbbc1af290854edafba2af31a368ec6d785d0dca141"} Feb 02 11:05:07 crc kubenswrapper[4901]: I0202 11:05:07.042859 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-6xc42"] Feb 02 11:05:07 crc kubenswrapper[4901]: I0202 11:05:07.054089 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-2ac1-account-create-update-q6b8n"] Feb 02 11:05:07 crc kubenswrapper[4901]: I0202 11:05:07.066158 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-6xc42"] Feb 02 11:05:07 crc kubenswrapper[4901]: I0202 11:05:07.076990 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-2ac1-account-create-update-q6b8n"] Feb 02 11:05:07 crc kubenswrapper[4901]: I0202 11:05:07.438757 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q7rrr" event={"ID":"88a28582-bf50-4a99-9590-c25370869096","Type":"ContainerStarted","Data":"6bbb3dd54e2255ecd1c663904b9db9eccb8b34a99ebb3c70f98bc5c3d248899f"} Feb 02 11:05:07 crc kubenswrapper[4901]: I0202 11:05:07.464859 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-q7rrr" podStartSLOduration=3.034656451 podStartE2EDuration="6.464833469s" podCreationTimestamp="2026-02-02 11:05:01 +0000 UTC" firstStartedPulling="2026-02-02 11:05:03.384883727 +0000 UTC m=+1590.403223833" lastFinishedPulling="2026-02-02 11:05:06.815060745 +0000 UTC m=+1593.833400851" observedRunningTime="2026-02-02 11:05:07.462213547 +0000 UTC m=+1594.480553653" watchObservedRunningTime="2026-02-02 11:05:07.464833469 +0000 UTC m=+1594.483173565" Feb 02 11:05:07 crc kubenswrapper[4901]: I0202 11:05:07.690214 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61290422-4438-485f-beaa-27fbcfdb1ea2" path="/var/lib/kubelet/pods/61290422-4438-485f-beaa-27fbcfdb1ea2/volumes" Feb 02 11:05:07 crc kubenswrapper[4901]: I0202 11:05:07.691470 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76a8fa42-4cac-4605-969b-7f5a2e55d7ad" path="/var/lib/kubelet/pods/76a8fa42-4cac-4605-969b-7f5a2e55d7ad/volumes" Feb 02 11:05:11 crc kubenswrapper[4901]: I0202 11:05:11.041143 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-9c55-account-create-update-x27xj"] Feb 02 11:05:11 crc kubenswrapper[4901]: I0202 11:05:11.053812 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-9c55-account-create-update-x27xj"] Feb 02 11:05:11 crc kubenswrapper[4901]: I0202 11:05:11.661871 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-q7rrr" Feb 02 11:05:11 crc kubenswrapper[4901]: I0202 11:05:11.662107 4901 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-q7rrr" Feb 02 11:05:11 crc kubenswrapper[4901]: I0202 11:05:11.691884 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6105ae6c-7921-41a9-ad59-2b43c6ab77ed" path="/var/lib/kubelet/pods/6105ae6c-7921-41a9-ad59-2b43c6ab77ed/volumes" Feb 02 11:05:12 crc kubenswrapper[4901]: I0202 11:05:12.065835 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-zvnzt"] Feb 02 11:05:12 crc kubenswrapper[4901]: I0202 11:05:12.076742 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-z99sf"] Feb 02 11:05:12 crc kubenswrapper[4901]: I0202 11:05:12.088317 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-z99sf"] Feb 02 11:05:12 crc kubenswrapper[4901]: I0202 11:05:12.098936 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-zvnzt"] Feb 02 11:05:12 crc kubenswrapper[4901]: I0202 11:05:12.107970 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-b701-account-create-update-4l6hg"] Feb 02 11:05:12 crc kubenswrapper[4901]: I0202 11:05:12.117492 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-b701-account-create-update-4l6hg"] Feb 02 11:05:12 crc kubenswrapper[4901]: I0202 11:05:12.726410 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-q7rrr" podUID="88a28582-bf50-4a99-9590-c25370869096" containerName="registry-server" probeResult="failure" output=< Feb 02 11:05:12 crc kubenswrapper[4901]: timeout: failed to connect service ":50051" within 1s Feb 02 11:05:12 crc kubenswrapper[4901]: > Feb 02 11:05:13 crc kubenswrapper[4901]: I0202 11:05:13.700155 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dcbca9e-ef0b-443e-9594-95dfee5ab743" path="/var/lib/kubelet/pods/2dcbca9e-ef0b-443e-9594-95dfee5ab743/volumes" Feb 02 11:05:13 crc kubenswrapper[4901]: I0202 11:05:13.702609 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c5064f5-9947-45f6-9f83-97dcdbfbc466" path="/var/lib/kubelet/pods/5c5064f5-9947-45f6-9f83-97dcdbfbc466/volumes" Feb 02 11:05:13 crc kubenswrapper[4901]: I0202 11:05:13.704045 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d896dd47-8297-41aa-bd3e-9a42a936b474" path="/var/lib/kubelet/pods/d896dd47-8297-41aa-bd3e-9a42a936b474/volumes" Feb 02 11:05:21 crc kubenswrapper[4901]: I0202 11:05:21.758301 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-q7rrr" Feb 02 11:05:21 crc kubenswrapper[4901]: I0202 11:05:21.849033 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-q7rrr" Feb 02 11:05:22 crc kubenswrapper[4901]: I0202 11:05:22.023761 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q7rrr"] Feb 02 11:05:23 crc kubenswrapper[4901]: I0202 11:05:23.639226 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-q7rrr" podUID="88a28582-bf50-4a99-9590-c25370869096" containerName="registry-server" containerID="cri-o://6bbb3dd54e2255ecd1c663904b9db9eccb8b34a99ebb3c70f98bc5c3d248899f" gracePeriod=2 Feb 02 11:05:24 crc kubenswrapper[4901]: I0202 11:05:24.285321 4901 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q7rrr" Feb 02 11:05:24 crc kubenswrapper[4901]: I0202 11:05:24.396716 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88a28582-bf50-4a99-9590-c25370869096-utilities\") pod \"88a28582-bf50-4a99-9590-c25370869096\" (UID: \"88a28582-bf50-4a99-9590-c25370869096\") " Feb 02 11:05:24 crc kubenswrapper[4901]: I0202 11:05:24.396956 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2h7x\" (UniqueName: \"kubernetes.io/projected/88a28582-bf50-4a99-9590-c25370869096-kube-api-access-x2h7x\") pod \"88a28582-bf50-4a99-9590-c25370869096\" (UID: \"88a28582-bf50-4a99-9590-c25370869096\") " Feb 02 11:05:24 crc kubenswrapper[4901]: I0202 11:05:24.397100 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88a28582-bf50-4a99-9590-c25370869096-catalog-content\") pod \"88a28582-bf50-4a99-9590-c25370869096\" (UID: \"88a28582-bf50-4a99-9590-c25370869096\") " Feb 02 11:05:24 crc kubenswrapper[4901]: I0202 11:05:24.399102 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88a28582-bf50-4a99-9590-c25370869096-utilities" (OuterVolumeSpecName: "utilities") pod "88a28582-bf50-4a99-9590-c25370869096" (UID: "88a28582-bf50-4a99-9590-c25370869096"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:05:24 crc kubenswrapper[4901]: I0202 11:05:24.405005 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88a28582-bf50-4a99-9590-c25370869096-kube-api-access-x2h7x" (OuterVolumeSpecName: "kube-api-access-x2h7x") pod "88a28582-bf50-4a99-9590-c25370869096" (UID: "88a28582-bf50-4a99-9590-c25370869096"). InnerVolumeSpecName "kube-api-access-x2h7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:05:24 crc kubenswrapper[4901]: I0202 11:05:24.500382 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88a28582-bf50-4a99-9590-c25370869096-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:05:24 crc kubenswrapper[4901]: I0202 11:05:24.500433 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2h7x\" (UniqueName: \"kubernetes.io/projected/88a28582-bf50-4a99-9590-c25370869096-kube-api-access-x2h7x\") on node \"crc\" DevicePath \"\"" Feb 02 11:05:24 crc kubenswrapper[4901]: I0202 11:05:24.572458 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88a28582-bf50-4a99-9590-c25370869096-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "88a28582-bf50-4a99-9590-c25370869096" (UID: "88a28582-bf50-4a99-9590-c25370869096"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:05:24 crc kubenswrapper[4901]: I0202 11:05:24.603546 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88a28582-bf50-4a99-9590-c25370869096-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:05:24 crc kubenswrapper[4901]: I0202 11:05:24.655699 4901 generic.go:334] "Generic (PLEG): container finished" podID="88a28582-bf50-4a99-9590-c25370869096" containerID="6bbb3dd54e2255ecd1c663904b9db9eccb8b34a99ebb3c70f98bc5c3d248899f" exitCode=0 Feb 02 11:05:24 crc kubenswrapper[4901]: I0202 11:05:24.655764 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q7rrr" event={"ID":"88a28582-bf50-4a99-9590-c25370869096","Type":"ContainerDied","Data":"6bbb3dd54e2255ecd1c663904b9db9eccb8b34a99ebb3c70f98bc5c3d248899f"} Feb 02 11:05:24 crc kubenswrapper[4901]: I0202 11:05:24.655839 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q7rrr" event={"ID":"88a28582-bf50-4a99-9590-c25370869096","Type":"ContainerDied","Data":"c70fde43949b4627d014c92b53ad82ab9935b65c66dd794ea4c34f5654810816"} Feb 02 11:05:24 crc kubenswrapper[4901]: I0202 11:05:24.655877 4901 scope.go:117] "RemoveContainer" containerID="6bbb3dd54e2255ecd1c663904b9db9eccb8b34a99ebb3c70f98bc5c3d248899f" Feb 02 11:05:24 crc kubenswrapper[4901]: I0202 11:05:24.655868 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q7rrr" Feb 02 11:05:24 crc kubenswrapper[4901]: I0202 11:05:24.716349 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q7rrr"] Feb 02 11:05:24 crc kubenswrapper[4901]: I0202 11:05:24.718551 4901 scope.go:117] "RemoveContainer" containerID="ef70c704cab2da5e478b2dbbc1af290854edafba2af31a368ec6d785d0dca141" Feb 02 11:05:24 crc kubenswrapper[4901]: I0202 11:05:24.728778 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-q7rrr"] Feb 02 11:05:24 crc kubenswrapper[4901]: I0202 11:05:24.752214 4901 scope.go:117] "RemoveContainer" containerID="18040c47ff85b204a81f4a07d2ec836f47c09327a595db2bcdb610a7201f0e4a" Feb 02 11:05:24 crc kubenswrapper[4901]: I0202 11:05:24.794884 4901 scope.go:117] "RemoveContainer" containerID="6bbb3dd54e2255ecd1c663904b9db9eccb8b34a99ebb3c70f98bc5c3d248899f" Feb 02 11:05:24 crc kubenswrapper[4901]: E0202 11:05:24.795733 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bbb3dd54e2255ecd1c663904b9db9eccb8b34a99ebb3c70f98bc5c3d248899f\": container with ID starting with 6bbb3dd54e2255ecd1c663904b9db9eccb8b34a99ebb3c70f98bc5c3d248899f not found: ID does not exist" containerID="6bbb3dd54e2255ecd1c663904b9db9eccb8b34a99ebb3c70f98bc5c3d248899f" Feb 02 11:05:24 crc kubenswrapper[4901]: I0202 11:05:24.795807 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bbb3dd54e2255ecd1c663904b9db9eccb8b34a99ebb3c70f98bc5c3d248899f"} err="failed to get container status \"6bbb3dd54e2255ecd1c663904b9db9eccb8b34a99ebb3c70f98bc5c3d248899f\": rpc error: code = NotFound desc = could not find container \"6bbb3dd54e2255ecd1c663904b9db9eccb8b34a99ebb3c70f98bc5c3d248899f\": container with ID starting with 6bbb3dd54e2255ecd1c663904b9db9eccb8b34a99ebb3c70f98bc5c3d248899f not found: ID does not exist" Feb 02 11:05:24 crc 
kubenswrapper[4901]: I0202 11:05:24.795838 4901 scope.go:117] "RemoveContainer" containerID="ef70c704cab2da5e478b2dbbc1af290854edafba2af31a368ec6d785d0dca141" Feb 02 11:05:24 crc kubenswrapper[4901]: E0202 11:05:24.796506 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef70c704cab2da5e478b2dbbc1af290854edafba2af31a368ec6d785d0dca141\": container with ID starting with ef70c704cab2da5e478b2dbbc1af290854edafba2af31a368ec6d785d0dca141 not found: ID does not exist" containerID="ef70c704cab2da5e478b2dbbc1af290854edafba2af31a368ec6d785d0dca141" Feb 02 11:05:24 crc kubenswrapper[4901]: I0202 11:05:24.796588 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef70c704cab2da5e478b2dbbc1af290854edafba2af31a368ec6d785d0dca141"} err="failed to get container status \"ef70c704cab2da5e478b2dbbc1af290854edafba2af31a368ec6d785d0dca141\": rpc error: code = NotFound desc = could not find container \"ef70c704cab2da5e478b2dbbc1af290854edafba2af31a368ec6d785d0dca141\": container with ID starting with ef70c704cab2da5e478b2dbbc1af290854edafba2af31a368ec6d785d0dca141 not found: ID does not exist" Feb 02 11:05:24 crc kubenswrapper[4901]: I0202 11:05:24.796631 4901 scope.go:117] "RemoveContainer" containerID="18040c47ff85b204a81f4a07d2ec836f47c09327a595db2bcdb610a7201f0e4a" Feb 02 11:05:24 crc kubenswrapper[4901]: E0202 11:05:24.797144 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18040c47ff85b204a81f4a07d2ec836f47c09327a595db2bcdb610a7201f0e4a\": container with ID starting with 18040c47ff85b204a81f4a07d2ec836f47c09327a595db2bcdb610a7201f0e4a not found: ID does not exist" containerID="18040c47ff85b204a81f4a07d2ec836f47c09327a595db2bcdb610a7201f0e4a" Feb 02 11:05:24 crc kubenswrapper[4901]: I0202 11:05:24.797186 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18040c47ff85b204a81f4a07d2ec836f47c09327a595db2bcdb610a7201f0e4a"} err="failed to get container status \"18040c47ff85b204a81f4a07d2ec836f47c09327a595db2bcdb610a7201f0e4a\": rpc error: code = NotFound desc = could not find container \"18040c47ff85b204a81f4a07d2ec836f47c09327a595db2bcdb610a7201f0e4a\": container with ID starting with 18040c47ff85b204a81f4a07d2ec836f47c09327a595db2bcdb610a7201f0e4a not found: ID does not exist" Feb 02 11:05:25 crc kubenswrapper[4901]: I0202 11:05:25.702111 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88a28582-bf50-4a99-9590-c25370869096" path="/var/lib/kubelet/pods/88a28582-bf50-4a99-9590-c25370869096/volumes" Feb 02 11:05:27 crc kubenswrapper[4901]: I0202 11:05:27.063264 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-lrthz"] Feb 02 11:05:27 crc kubenswrapper[4901]: I0202 11:05:27.078380 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-lrthz"] Feb 02 11:05:27 crc kubenswrapper[4901]: I0202 11:05:27.693696 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b9aee1d-518e-46d9-8d21-54ea5018453d" path="/var/lib/kubelet/pods/8b9aee1d-518e-46d9-8d21-54ea5018453d/volumes" Feb 02 11:05:34 crc kubenswrapper[4901]: I0202 11:05:34.052737 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-q7rkr"] Feb 02 11:05:34 crc kubenswrapper[4901]: I0202 11:05:34.074698 4901 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/barbican-12b8-account-create-update-znw9c"] Feb 02 11:05:34 crc kubenswrapper[4901]: I0202 11:05:34.088706 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-srtxb"] Feb 02 11:05:34 crc kubenswrapper[4901]: I0202 11:05:34.098409 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-8ca5-account-create-update-h5xln"] Feb 02 11:05:34 crc kubenswrapper[4901]: I0202 11:05:34.110199 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6ad2-account-create-update-r98jn"] Feb 02 11:05:34 crc kubenswrapper[4901]: I0202 11:05:34.120350 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-8ca5-account-create-update-h5xln"] Feb 02 11:05:34 crc kubenswrapper[4901]: I0202 11:05:34.133202 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-q7rkr"] Feb 02 11:05:34 crc kubenswrapper[4901]: I0202 11:05:34.147642 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-12b8-account-create-update-znw9c"] Feb 02 11:05:34 crc kubenswrapper[4901]: I0202 11:05:34.160106 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-srtxb"] Feb 02 11:05:34 crc kubenswrapper[4901]: I0202 11:05:34.170847 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6ad2-account-create-update-r98jn"] Feb 02 11:05:34 crc kubenswrapper[4901]: I0202 11:05:34.180508 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-8tpf5"] Feb 02 11:05:34 crc kubenswrapper[4901]: I0202 11:05:34.185617 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-9326-account-create-update-fxmwt"] Feb 02 11:05:34 crc kubenswrapper[4901]: I0202 11:05:34.194344 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-66nlg"] Feb 02 11:05:34 crc kubenswrapper[4901]: I0202 11:05:34.202636 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-8tpf5"] Feb 02 11:05:34 crc kubenswrapper[4901]: I0202 11:05:34.210539 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-66nlg"] Feb 02 11:05:34 crc kubenswrapper[4901]: I0202 11:05:34.218113 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-9326-account-create-update-fxmwt"] Feb 02 11:05:35 crc kubenswrapper[4901]: I0202 11:05:35.699464 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14a824c3-d0b9-482d-8f69-0431b9b46f85" path="/var/lib/kubelet/pods/14a824c3-d0b9-482d-8f69-0431b9b46f85/volumes" Feb 02 11:05:35 crc kubenswrapper[4901]: I0202 11:05:35.700083 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28588193-ae95-4ea4-a449-614bf3beebc2" path="/var/lib/kubelet/pods/28588193-ae95-4ea4-a449-614bf3beebc2/volumes" Feb 02 11:05:35 crc kubenswrapper[4901]: I0202 11:05:35.700663 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55c3f4b9-7f8e-4a8a-afb2-ffc84ece9220" path="/var/lib/kubelet/pods/55c3f4b9-7f8e-4a8a-afb2-ffc84ece9220/volumes" Feb 02 11:05:35 crc kubenswrapper[4901]: I0202 11:05:35.701694 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72b6486b-a65c-468f-a6fa-4fc229447300" path="/var/lib/kubelet/pods/72b6486b-a65c-468f-a6fa-4fc229447300/volumes" Feb 02 11:05:35 crc kubenswrapper[4901]: I0202 11:05:35.702224 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="73fb6dad-c6af-44e7-98a9-fe46e2ea41e5" path="/var/lib/kubelet/pods/73fb6dad-c6af-44e7-98a9-fe46e2ea41e5/volumes" Feb 02 11:05:35 crc kubenswrapper[4901]: I0202 11:05:35.702760 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8879f11a-677c-485b-91dd-082f86fd8d5a" path="/var/lib/kubelet/pods/8879f11a-677c-485b-91dd-082f86fd8d5a/volumes" Feb 02 11:05:35 crc kubenswrapper[4901]: I0202 11:05:35.703786 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98ffd424-d5ce-4321-adc1-638508596191" path="/var/lib/kubelet/pods/98ffd424-d5ce-4321-adc1-638508596191/volumes" Feb 02 11:05:35 crc kubenswrapper[4901]: I0202 11:05:35.704298 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdf53bca-c9cf-4a55-a7b1-fc983eef2c74" path="/var/lib/kubelet/pods/cdf53bca-c9cf-4a55-a7b1-fc983eef2c74/volumes" Feb 02 11:05:36 crc kubenswrapper[4901]: I0202 11:05:36.045039 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-l8vxk"] Feb 02 11:05:36 crc kubenswrapper[4901]: I0202 11:05:36.061482 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-l8vxk"] Feb 02 11:05:37 crc kubenswrapper[4901]: I0202 11:05:37.707263 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06b582a5-a4bf-4c36-974a-0cf96389bb90" path="/var/lib/kubelet/pods/06b582a5-a4bf-4c36-974a-0cf96389bb90/volumes" Feb 02 11:05:37 crc kubenswrapper[4901]: I0202 11:05:37.838198 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:05:37 crc kubenswrapper[4901]: I0202 11:05:37.838353 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:05:40 crc kubenswrapper[4901]: I0202 11:05:40.063775 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-xr6pb"] Feb 02 11:05:40 crc kubenswrapper[4901]: I0202 11:05:40.083909 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-xr6pb"] Feb 02 11:05:40 crc kubenswrapper[4901]: I0202 11:05:40.745478 4901 scope.go:117] "RemoveContainer" containerID="63ca4ba074a3da8f7885a0b965673af5555913f8f8b59264072a0be11c5cdc09" Feb 02 11:05:40 crc kubenswrapper[4901]: I0202 11:05:40.792217 4901 scope.go:117] "RemoveContainer" containerID="93d552bb5867aac3aa492a0e0cc218e0bbbdbcbdf9dc0d1c4070b63471894334" Feb 02 11:05:40 crc kubenswrapper[4901]: I0202 11:05:40.842900 4901 scope.go:117] "RemoveContainer" containerID="e5da4a708e2b1c5b8df94581e2c6eb489e56e39b519928b1c85f4125397aabc0" Feb 02 11:05:40 crc kubenswrapper[4901]: I0202 11:05:40.895652 4901 scope.go:117] "RemoveContainer" containerID="91687dee59c8e5182e1d950ff7fab2790b778116b2f52d461608ad1576841234" Feb 02 11:05:40 crc kubenswrapper[4901]: I0202 11:05:40.967557 4901 scope.go:117] "RemoveContainer" containerID="d630bd8f46cda54a28d67df0152b438f7b7d25ea08f5907505935c6fc4db98cd" Feb 02 11:05:41 crc kubenswrapper[4901]: I0202 11:05:41.001731 4901 scope.go:117] "RemoveContainer" 
containerID="b6efe5eb1d61bb0403e2b600ead2829766e21d0d782223ed4a922a3964a5dc36" Feb 02 11:05:41 crc kubenswrapper[4901]: I0202 11:05:41.041316 4901 scope.go:117] "RemoveContainer" containerID="3d0c34e881a6c7fb959cc322dda4334b9da3832652549750ea3182ee2ea0363f" Feb 02 11:05:41 crc kubenswrapper[4901]: I0202 11:05:41.069987 4901 scope.go:117] "RemoveContainer" containerID="f2555d25f9b9a661efeae92296056ddbdbb2475b7a261e776ed1ec9f9ce58f73" Feb 02 11:05:41 crc kubenswrapper[4901]: I0202 11:05:41.092426 4901 scope.go:117] "RemoveContainer" containerID="cb0d8fbd5fd41f6fa0208370a1053bbe27903c07087f945e7ce06ba428fef30f" Feb 02 11:05:41 crc kubenswrapper[4901]: I0202 11:05:41.119223 4901 scope.go:117] "RemoveContainer" containerID="36ba7129012ac21b0f824cec190da85f00e005f11f14e8512eee43d12378053b" Feb 02 11:05:41 crc kubenswrapper[4901]: I0202 11:05:41.153748 4901 scope.go:117] "RemoveContainer" containerID="ec14d351eb433ac7609c09477faca1a52133f666083c3d67ca158107a31d2bfd" Feb 02 11:05:41 crc kubenswrapper[4901]: I0202 11:05:41.191019 4901 scope.go:117] "RemoveContainer" containerID="785dccd879c42a6854d2f1984c9a3eab6a684189e4071cb276e88237ca017987" Feb 02 11:05:41 crc kubenswrapper[4901]: I0202 11:05:41.224501 4901 scope.go:117] "RemoveContainer" containerID="8d6f77b0d29e2378e46e200a1521ba81590822372a570e56bae012b8efc8a22b" Feb 02 11:05:41 crc kubenswrapper[4901]: I0202 11:05:41.249298 4901 scope.go:117] "RemoveContainer" containerID="44fa25142ac132a908fa1ce6f91507673fe357e08065ca786649d64077375802" Feb 02 11:05:41 crc kubenswrapper[4901]: I0202 11:05:41.274672 4901 scope.go:117] "RemoveContainer" containerID="aaf7bc852cfecd9b709247a307a638e69d2d1bcd79f1c39979e4323aed10be70" Feb 02 11:05:41 crc kubenswrapper[4901]: I0202 11:05:41.303896 4901 scope.go:117] "RemoveContainer" containerID="87a8b12378b6a38b8b5b8a1a70f98ca49c41fda10dbfa2f7ece5f88449ffe61f" Feb 02 11:05:41 crc kubenswrapper[4901]: I0202 11:05:41.696629 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a766edf3-78cd-4939-a63b-8079e261b386" path="/var/lib/kubelet/pods/a766edf3-78cd-4939-a63b-8079e261b386/volumes" Feb 02 11:05:53 crc kubenswrapper[4901]: I0202 11:05:53.658996 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gj2v7"] Feb 02 11:05:53 crc kubenswrapper[4901]: E0202 11:05:53.660556 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88a28582-bf50-4a99-9590-c25370869096" containerName="extract-content" Feb 02 11:05:53 crc kubenswrapper[4901]: I0202 11:05:53.660595 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="88a28582-bf50-4a99-9590-c25370869096" containerName="extract-content" Feb 02 11:05:53 crc kubenswrapper[4901]: E0202 11:05:53.660619 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88a28582-bf50-4a99-9590-c25370869096" containerName="extract-utilities" Feb 02 11:05:53 crc kubenswrapper[4901]: I0202 11:05:53.660630 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="88a28582-bf50-4a99-9590-c25370869096" containerName="extract-utilities" Feb 02 11:05:53 crc kubenswrapper[4901]: E0202 11:05:53.660643 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88a28582-bf50-4a99-9590-c25370869096" containerName="registry-server" Feb 02 11:05:53 crc kubenswrapper[4901]: I0202 11:05:53.660652 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="88a28582-bf50-4a99-9590-c25370869096" containerName="registry-server" Feb 02 11:05:53 crc kubenswrapper[4901]: I0202 11:05:53.660924 4901 
memory_manager.go:354] "RemoveStaleState removing state" podUID="88a28582-bf50-4a99-9590-c25370869096" containerName="registry-server" Feb 02 11:05:53 crc kubenswrapper[4901]: I0202 11:05:53.663668 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gj2v7" Feb 02 11:05:53 crc kubenswrapper[4901]: I0202 11:05:53.732129 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gj2v7"] Feb 02 11:05:53 crc kubenswrapper[4901]: I0202 11:05:53.770243 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxk67\" (UniqueName: \"kubernetes.io/projected/fa2fc407-51d4-4d62-914c-f83ac5ba1fb3-kube-api-access-dxk67\") pod \"redhat-marketplace-gj2v7\" (UID: \"fa2fc407-51d4-4d62-914c-f83ac5ba1fb3\") " pod="openshift-marketplace/redhat-marketplace-gj2v7" Feb 02 11:05:53 crc kubenswrapper[4901]: I0202 11:05:53.770598 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa2fc407-51d4-4d62-914c-f83ac5ba1fb3-catalog-content\") pod \"redhat-marketplace-gj2v7\" (UID: \"fa2fc407-51d4-4d62-914c-f83ac5ba1fb3\") " pod="openshift-marketplace/redhat-marketplace-gj2v7" Feb 02 11:05:53 crc kubenswrapper[4901]: I0202 11:05:53.770837 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa2fc407-51d4-4d62-914c-f83ac5ba1fb3-utilities\") pod \"redhat-marketplace-gj2v7\" (UID: \"fa2fc407-51d4-4d62-914c-f83ac5ba1fb3\") " pod="openshift-marketplace/redhat-marketplace-gj2v7" Feb 02 11:05:53 crc kubenswrapper[4901]: I0202 11:05:53.873122 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa2fc407-51d4-4d62-914c-f83ac5ba1fb3-utilities\") pod \"redhat-marketplace-gj2v7\" (UID: \"fa2fc407-51d4-4d62-914c-f83ac5ba1fb3\") " pod="openshift-marketplace/redhat-marketplace-gj2v7" Feb 02 11:05:53 crc kubenswrapper[4901]: I0202 11:05:53.873178 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxk67\" (UniqueName: \"kubernetes.io/projected/fa2fc407-51d4-4d62-914c-f83ac5ba1fb3-kube-api-access-dxk67\") pod \"redhat-marketplace-gj2v7\" (UID: \"fa2fc407-51d4-4d62-914c-f83ac5ba1fb3\") " pod="openshift-marketplace/redhat-marketplace-gj2v7" Feb 02 11:05:53 crc kubenswrapper[4901]: I0202 11:05:53.873214 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa2fc407-51d4-4d62-914c-f83ac5ba1fb3-catalog-content\") pod \"redhat-marketplace-gj2v7\" (UID: \"fa2fc407-51d4-4d62-914c-f83ac5ba1fb3\") " pod="openshift-marketplace/redhat-marketplace-gj2v7" Feb 02 11:05:53 crc kubenswrapper[4901]: I0202 11:05:53.873748 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa2fc407-51d4-4d62-914c-f83ac5ba1fb3-catalog-content\") pod \"redhat-marketplace-gj2v7\" (UID: \"fa2fc407-51d4-4d62-914c-f83ac5ba1fb3\") " pod="openshift-marketplace/redhat-marketplace-gj2v7" Feb 02 11:05:53 crc kubenswrapper[4901]: I0202 11:05:53.873984 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa2fc407-51d4-4d62-914c-f83ac5ba1fb3-utilities\") 
pod \"redhat-marketplace-gj2v7\" (UID: \"fa2fc407-51d4-4d62-914c-f83ac5ba1fb3\") " pod="openshift-marketplace/redhat-marketplace-gj2v7" Feb 02 11:05:53 crc kubenswrapper[4901]: I0202 11:05:53.896157 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxk67\" (UniqueName: \"kubernetes.io/projected/fa2fc407-51d4-4d62-914c-f83ac5ba1fb3-kube-api-access-dxk67\") pod \"redhat-marketplace-gj2v7\" (UID: \"fa2fc407-51d4-4d62-914c-f83ac5ba1fb3\") " pod="openshift-marketplace/redhat-marketplace-gj2v7" Feb 02 11:05:54 crc kubenswrapper[4901]: I0202 11:05:54.045820 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gj2v7" Feb 02 11:05:54 crc kubenswrapper[4901]: I0202 11:05:54.366155 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gj2v7"] Feb 02 11:05:55 crc kubenswrapper[4901]: I0202 11:05:55.104316 4901 generic.go:334] "Generic (PLEG): container finished" podID="fa2fc407-51d4-4d62-914c-f83ac5ba1fb3" containerID="589e0248a420af7100ef9aa453a2f4a6e7f26f4dcac3b634dccef388e3f5a202" exitCode=0 Feb 02 11:05:55 crc kubenswrapper[4901]: I0202 11:05:55.104731 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gj2v7" event={"ID":"fa2fc407-51d4-4d62-914c-f83ac5ba1fb3","Type":"ContainerDied","Data":"589e0248a420af7100ef9aa453a2f4a6e7f26f4dcac3b634dccef388e3f5a202"} Feb 02 11:05:55 crc kubenswrapper[4901]: I0202 11:05:55.105353 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gj2v7" event={"ID":"fa2fc407-51d4-4d62-914c-f83ac5ba1fb3","Type":"ContainerStarted","Data":"cb53515a12a2ed5a3b7522f24d20bbaf5c0bbe66ce24a5b03c90249b8b4c896c"} Feb 02 11:05:56 crc kubenswrapper[4901]: I0202 11:05:56.123228 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gj2v7" event={"ID":"fa2fc407-51d4-4d62-914c-f83ac5ba1fb3","Type":"ContainerStarted","Data":"dfe42d0957a88a3664cf413a5b9123811725fbace10ed7f4164c1d61f5c45aae"} Feb 02 11:05:57 crc kubenswrapper[4901]: I0202 11:05:57.142283 4901 generic.go:334] "Generic (PLEG): container finished" podID="fa2fc407-51d4-4d62-914c-f83ac5ba1fb3" containerID="dfe42d0957a88a3664cf413a5b9123811725fbace10ed7f4164c1d61f5c45aae" exitCode=0 Feb 02 11:05:57 crc kubenswrapper[4901]: I0202 11:05:57.142377 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gj2v7" event={"ID":"fa2fc407-51d4-4d62-914c-f83ac5ba1fb3","Type":"ContainerDied","Data":"dfe42d0957a88a3664cf413a5b9123811725fbace10ed7f4164c1d61f5c45aae"} Feb 02 11:05:57 crc kubenswrapper[4901]: I0202 11:05:57.144213 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gj2v7" event={"ID":"fa2fc407-51d4-4d62-914c-f83ac5ba1fb3","Type":"ContainerStarted","Data":"b4dd83fd03cef2bac09d0d18a95a5057bf65f8c68fae93d5b1e387434561695b"} Feb 02 11:05:57 crc kubenswrapper[4901]: I0202 11:05:57.168254 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gj2v7" podStartSLOduration=2.7478795959999998 podStartE2EDuration="4.168229265s" podCreationTimestamp="2026-02-02 11:05:53 +0000 UTC" firstStartedPulling="2026-02-02 11:05:55.10922313 +0000 UTC m=+1642.127563266" lastFinishedPulling="2026-02-02 11:05:56.529572839 +0000 UTC m=+1643.547912935" observedRunningTime="2026-02-02 11:05:57.16346382 +0000 UTC 
m=+1644.181803926" watchObservedRunningTime="2026-02-02 11:05:57.168229265 +0000 UTC m=+1644.186569361" Feb 02 11:06:04 crc kubenswrapper[4901]: I0202 11:06:04.046665 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gj2v7" Feb 02 11:06:04 crc kubenswrapper[4901]: I0202 11:06:04.047774 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gj2v7" Feb 02 11:06:04 crc kubenswrapper[4901]: I0202 11:06:04.114060 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gj2v7" Feb 02 11:06:04 crc kubenswrapper[4901]: I0202 11:06:04.283706 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gj2v7" Feb 02 11:06:04 crc kubenswrapper[4901]: I0202 11:06:04.360693 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gj2v7"] Feb 02 11:06:06 crc kubenswrapper[4901]: I0202 11:06:06.260298 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gj2v7" podUID="fa2fc407-51d4-4d62-914c-f83ac5ba1fb3" containerName="registry-server" containerID="cri-o://b4dd83fd03cef2bac09d0d18a95a5057bf65f8c68fae93d5b1e387434561695b" gracePeriod=2 Feb 02 11:06:06 crc kubenswrapper[4901]: I0202 11:06:06.803213 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gj2v7" Feb 02 11:06:06 crc kubenswrapper[4901]: I0202 11:06:06.895041 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxk67\" (UniqueName: \"kubernetes.io/projected/fa2fc407-51d4-4d62-914c-f83ac5ba1fb3-kube-api-access-dxk67\") pod \"fa2fc407-51d4-4d62-914c-f83ac5ba1fb3\" (UID: \"fa2fc407-51d4-4d62-914c-f83ac5ba1fb3\") " Feb 02 11:06:06 crc kubenswrapper[4901]: I0202 11:06:06.895237 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa2fc407-51d4-4d62-914c-f83ac5ba1fb3-catalog-content\") pod \"fa2fc407-51d4-4d62-914c-f83ac5ba1fb3\" (UID: \"fa2fc407-51d4-4d62-914c-f83ac5ba1fb3\") " Feb 02 11:06:06 crc kubenswrapper[4901]: I0202 11:06:06.895310 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa2fc407-51d4-4d62-914c-f83ac5ba1fb3-utilities\") pod \"fa2fc407-51d4-4d62-914c-f83ac5ba1fb3\" (UID: \"fa2fc407-51d4-4d62-914c-f83ac5ba1fb3\") " Feb 02 11:06:06 crc kubenswrapper[4901]: I0202 11:06:06.897112 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa2fc407-51d4-4d62-914c-f83ac5ba1fb3-utilities" (OuterVolumeSpecName: "utilities") pod "fa2fc407-51d4-4d62-914c-f83ac5ba1fb3" (UID: "fa2fc407-51d4-4d62-914c-f83ac5ba1fb3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:06:06 crc kubenswrapper[4901]: I0202 11:06:06.898684 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa2fc407-51d4-4d62-914c-f83ac5ba1fb3-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:06:06 crc kubenswrapper[4901]: I0202 11:06:06.905964 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa2fc407-51d4-4d62-914c-f83ac5ba1fb3-kube-api-access-dxk67" (OuterVolumeSpecName: "kube-api-access-dxk67") pod "fa2fc407-51d4-4d62-914c-f83ac5ba1fb3" (UID: "fa2fc407-51d4-4d62-914c-f83ac5ba1fb3"). InnerVolumeSpecName "kube-api-access-dxk67". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:06:06 crc kubenswrapper[4901]: I0202 11:06:06.928954 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa2fc407-51d4-4d62-914c-f83ac5ba1fb3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa2fc407-51d4-4d62-914c-f83ac5ba1fb3" (UID: "fa2fc407-51d4-4d62-914c-f83ac5ba1fb3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:06:07 crc kubenswrapper[4901]: I0202 11:06:07.000498 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa2fc407-51d4-4d62-914c-f83ac5ba1fb3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:06:07 crc kubenswrapper[4901]: I0202 11:06:07.000540 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxk67\" (UniqueName: \"kubernetes.io/projected/fa2fc407-51d4-4d62-914c-f83ac5ba1fb3-kube-api-access-dxk67\") on node \"crc\" DevicePath \"\"" Feb 02 11:06:07 crc kubenswrapper[4901]: I0202 11:06:07.272784 4901 generic.go:334] "Generic (PLEG): container finished" podID="fa2fc407-51d4-4d62-914c-f83ac5ba1fb3" containerID="b4dd83fd03cef2bac09d0d18a95a5057bf65f8c68fae93d5b1e387434561695b" exitCode=0 Feb 02 11:06:07 crc kubenswrapper[4901]: I0202 11:06:07.272834 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gj2v7" event={"ID":"fa2fc407-51d4-4d62-914c-f83ac5ba1fb3","Type":"ContainerDied","Data":"b4dd83fd03cef2bac09d0d18a95a5057bf65f8c68fae93d5b1e387434561695b"} Feb 02 11:06:07 crc kubenswrapper[4901]: I0202 11:06:07.272869 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gj2v7" event={"ID":"fa2fc407-51d4-4d62-914c-f83ac5ba1fb3","Type":"ContainerDied","Data":"cb53515a12a2ed5a3b7522f24d20bbaf5c0bbe66ce24a5b03c90249b8b4c896c"} Feb 02 11:06:07 crc kubenswrapper[4901]: I0202 11:06:07.272891 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gj2v7" Feb 02 11:06:07 crc kubenswrapper[4901]: I0202 11:06:07.272903 4901 scope.go:117] "RemoveContainer" containerID="b4dd83fd03cef2bac09d0d18a95a5057bf65f8c68fae93d5b1e387434561695b" Feb 02 11:06:07 crc kubenswrapper[4901]: I0202 11:06:07.302727 4901 scope.go:117] "RemoveContainer" containerID="dfe42d0957a88a3664cf413a5b9123811725fbace10ed7f4164c1d61f5c45aae" Feb 02 11:06:07 crc kubenswrapper[4901]: I0202 11:06:07.325390 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gj2v7"] Feb 02 11:06:07 crc kubenswrapper[4901]: I0202 11:06:07.335816 4901 scope.go:117] "RemoveContainer" containerID="589e0248a420af7100ef9aa453a2f4a6e7f26f4dcac3b634dccef388e3f5a202" Feb 02 11:06:07 crc kubenswrapper[4901]: I0202 11:06:07.338219 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gj2v7"] Feb 02 11:06:07 crc kubenswrapper[4901]: I0202 11:06:07.384186 4901 scope.go:117] "RemoveContainer" containerID="b4dd83fd03cef2bac09d0d18a95a5057bf65f8c68fae93d5b1e387434561695b" Feb 02 11:06:07 crc kubenswrapper[4901]: E0202 11:06:07.385224 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4dd83fd03cef2bac09d0d18a95a5057bf65f8c68fae93d5b1e387434561695b\": container with ID starting with b4dd83fd03cef2bac09d0d18a95a5057bf65f8c68fae93d5b1e387434561695b not found: ID does not exist" containerID="b4dd83fd03cef2bac09d0d18a95a5057bf65f8c68fae93d5b1e387434561695b" Feb 02 11:06:07 crc kubenswrapper[4901]: I0202 11:06:07.385273 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4dd83fd03cef2bac09d0d18a95a5057bf65f8c68fae93d5b1e387434561695b"} err="failed to get container status \"b4dd83fd03cef2bac09d0d18a95a5057bf65f8c68fae93d5b1e387434561695b\": rpc error: code = NotFound desc = could not find container \"b4dd83fd03cef2bac09d0d18a95a5057bf65f8c68fae93d5b1e387434561695b\": container with ID starting with b4dd83fd03cef2bac09d0d18a95a5057bf65f8c68fae93d5b1e387434561695b not found: ID does not exist" Feb 02 11:06:07 crc kubenswrapper[4901]: I0202 11:06:07.385305 4901 scope.go:117] "RemoveContainer" containerID="dfe42d0957a88a3664cf413a5b9123811725fbace10ed7f4164c1d61f5c45aae" Feb 02 11:06:07 crc kubenswrapper[4901]: E0202 11:06:07.386367 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfe42d0957a88a3664cf413a5b9123811725fbace10ed7f4164c1d61f5c45aae\": container with ID starting with dfe42d0957a88a3664cf413a5b9123811725fbace10ed7f4164c1d61f5c45aae not found: ID does not exist" containerID="dfe42d0957a88a3664cf413a5b9123811725fbace10ed7f4164c1d61f5c45aae" Feb 02 11:06:07 crc kubenswrapper[4901]: I0202 11:06:07.386429 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfe42d0957a88a3664cf413a5b9123811725fbace10ed7f4164c1d61f5c45aae"} err="failed to get container status \"dfe42d0957a88a3664cf413a5b9123811725fbace10ed7f4164c1d61f5c45aae\": rpc error: code = NotFound desc = could not find container \"dfe42d0957a88a3664cf413a5b9123811725fbace10ed7f4164c1d61f5c45aae\": container with ID starting with dfe42d0957a88a3664cf413a5b9123811725fbace10ed7f4164c1d61f5c45aae not found: ID does not exist" Feb 02 11:06:07 crc kubenswrapper[4901]: I0202 11:06:07.386476 4901 scope.go:117] "RemoveContainer" 
containerID="589e0248a420af7100ef9aa453a2f4a6e7f26f4dcac3b634dccef388e3f5a202" Feb 02 11:06:07 crc kubenswrapper[4901]: E0202 11:06:07.387063 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"589e0248a420af7100ef9aa453a2f4a6e7f26f4dcac3b634dccef388e3f5a202\": container with ID starting with 589e0248a420af7100ef9aa453a2f4a6e7f26f4dcac3b634dccef388e3f5a202 not found: ID does not exist" containerID="589e0248a420af7100ef9aa453a2f4a6e7f26f4dcac3b634dccef388e3f5a202" Feb 02 11:06:07 crc kubenswrapper[4901]: I0202 11:06:07.387106 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"589e0248a420af7100ef9aa453a2f4a6e7f26f4dcac3b634dccef388e3f5a202"} err="failed to get container status \"589e0248a420af7100ef9aa453a2f4a6e7f26f4dcac3b634dccef388e3f5a202\": rpc error: code = NotFound desc = could not find container \"589e0248a420af7100ef9aa453a2f4a6e7f26f4dcac3b634dccef388e3f5a202\": container with ID starting with 589e0248a420af7100ef9aa453a2f4a6e7f26f4dcac3b634dccef388e3f5a202 not found: ID does not exist" Feb 02 11:06:07 crc kubenswrapper[4901]: E0202 11:06:07.503259 4901 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa2fc407_51d4_4d62_914c_f83ac5ba1fb3.slice\": RecentStats: unable to find data in memory cache]" Feb 02 11:06:07 crc kubenswrapper[4901]: I0202 11:06:07.695785 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa2fc407-51d4-4d62-914c-f83ac5ba1fb3" path="/var/lib/kubelet/pods/fa2fc407-51d4-4d62-914c-f83ac5ba1fb3/volumes" Feb 02 11:06:07 crc kubenswrapper[4901]: I0202 11:06:07.837774 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:06:07 crc kubenswrapper[4901]: I0202 11:06:07.837891 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:06:15 crc kubenswrapper[4901]: I0202 11:06:15.066447 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-h9fw8"] Feb 02 11:06:15 crc kubenswrapper[4901]: I0202 11:06:15.079584 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-h9fw8"] Feb 02 11:06:15 crc kubenswrapper[4901]: I0202 11:06:15.697342 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4426b588-ce3e-4184-bc0b-82f17522dd01" path="/var/lib/kubelet/pods/4426b588-ce3e-4184-bc0b-82f17522dd01/volumes" Feb 02 11:06:16 crc kubenswrapper[4901]: I0202 11:06:16.385409 4901 generic.go:334] "Generic (PLEG): container finished" podID="7e499a01-9c9c-44ca-a1d2-e35912fba103" containerID="61583392abc18254623a9540feac9d6111c1df32c8a1bb4059cd2d341ad7c760" exitCode=0 Feb 02 11:06:16 crc kubenswrapper[4901]: I0202 11:06:16.385464 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bqz4b" 
event={"ID":"7e499a01-9c9c-44ca-a1d2-e35912fba103","Type":"ContainerDied","Data":"61583392abc18254623a9540feac9d6111c1df32c8a1bb4059cd2d341ad7c760"} Feb 02 11:06:17 crc kubenswrapper[4901]: I0202 11:06:17.889760 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bqz4b" Feb 02 11:06:18 crc kubenswrapper[4901]: I0202 11:06:18.013175 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e499a01-9c9c-44ca-a1d2-e35912fba103-inventory\") pod \"7e499a01-9c9c-44ca-a1d2-e35912fba103\" (UID: \"7e499a01-9c9c-44ca-a1d2-e35912fba103\") " Feb 02 11:06:18 crc kubenswrapper[4901]: I0202 11:06:18.013530 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzl74\" (UniqueName: \"kubernetes.io/projected/7e499a01-9c9c-44ca-a1d2-e35912fba103-kube-api-access-nzl74\") pod \"7e499a01-9c9c-44ca-a1d2-e35912fba103\" (UID: \"7e499a01-9c9c-44ca-a1d2-e35912fba103\") " Feb 02 11:06:18 crc kubenswrapper[4901]: I0202 11:06:18.013806 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e499a01-9c9c-44ca-a1d2-e35912fba103-ssh-key-openstack-edpm-ipam\") pod \"7e499a01-9c9c-44ca-a1d2-e35912fba103\" (UID: \"7e499a01-9c9c-44ca-a1d2-e35912fba103\") " Feb 02 11:06:18 crc kubenswrapper[4901]: I0202 11:06:18.021928 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e499a01-9c9c-44ca-a1d2-e35912fba103-kube-api-access-nzl74" (OuterVolumeSpecName: "kube-api-access-nzl74") pod "7e499a01-9c9c-44ca-a1d2-e35912fba103" (UID: "7e499a01-9c9c-44ca-a1d2-e35912fba103"). InnerVolumeSpecName "kube-api-access-nzl74". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:06:18 crc kubenswrapper[4901]: I0202 11:06:18.051092 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e499a01-9c9c-44ca-a1d2-e35912fba103-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7e499a01-9c9c-44ca-a1d2-e35912fba103" (UID: "7e499a01-9c9c-44ca-a1d2-e35912fba103"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:06:18 crc kubenswrapper[4901]: I0202 11:06:18.055421 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e499a01-9c9c-44ca-a1d2-e35912fba103-inventory" (OuterVolumeSpecName: "inventory") pod "7e499a01-9c9c-44ca-a1d2-e35912fba103" (UID: "7e499a01-9c9c-44ca-a1d2-e35912fba103"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:06:18 crc kubenswrapper[4901]: I0202 11:06:18.117181 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzl74\" (UniqueName: \"kubernetes.io/projected/7e499a01-9c9c-44ca-a1d2-e35912fba103-kube-api-access-nzl74\") on node \"crc\" DevicePath \"\"" Feb 02 11:06:18 crc kubenswrapper[4901]: I0202 11:06:18.117226 4901 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e499a01-9c9c-44ca-a1d2-e35912fba103-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 02 11:06:18 crc kubenswrapper[4901]: I0202 11:06:18.117241 4901 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e499a01-9c9c-44ca-a1d2-e35912fba103-inventory\") on node \"crc\" DevicePath \"\"" Feb 02 11:06:18 crc kubenswrapper[4901]: I0202 11:06:18.411436 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bqz4b" event={"ID":"7e499a01-9c9c-44ca-a1d2-e35912fba103","Type":"ContainerDied","Data":"5ab058a88c6a8eb2792c06dd303e10fc1ccac82835b10dbd6ba837c85ece18ec"} Feb 02 11:06:18 crc kubenswrapper[4901]: I0202 11:06:18.411503 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ab058a88c6a8eb2792c06dd303e10fc1ccac82835b10dbd6ba837c85ece18ec" Feb 02 11:06:18 crc kubenswrapper[4901]: I0202 11:06:18.411624 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bqz4b" Feb 02 11:06:18 crc kubenswrapper[4901]: I0202 11:06:18.543720 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwmn9"] Feb 02 11:06:18 crc kubenswrapper[4901]: E0202 11:06:18.544446 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa2fc407-51d4-4d62-914c-f83ac5ba1fb3" containerName="extract-utilities" Feb 02 11:06:18 crc kubenswrapper[4901]: I0202 11:06:18.544475 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa2fc407-51d4-4d62-914c-f83ac5ba1fb3" containerName="extract-utilities" Feb 02 11:06:18 crc kubenswrapper[4901]: E0202 11:06:18.544501 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa2fc407-51d4-4d62-914c-f83ac5ba1fb3" containerName="extract-content" Feb 02 11:06:18 crc kubenswrapper[4901]: I0202 11:06:18.544512 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa2fc407-51d4-4d62-914c-f83ac5ba1fb3" containerName="extract-content" Feb 02 11:06:18 crc kubenswrapper[4901]: E0202 11:06:18.544531 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa2fc407-51d4-4d62-914c-f83ac5ba1fb3" containerName="registry-server" Feb 02 11:06:18 crc kubenswrapper[4901]: I0202 11:06:18.544540 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa2fc407-51d4-4d62-914c-f83ac5ba1fb3" containerName="registry-server" Feb 02 11:06:18 crc kubenswrapper[4901]: E0202 11:06:18.544590 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e499a01-9c9c-44ca-a1d2-e35912fba103" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 02 11:06:18 crc kubenswrapper[4901]: I0202 11:06:18.544599 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e499a01-9c9c-44ca-a1d2-e35912fba103" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 02 11:06:18 crc kubenswrapper[4901]: 
I0202 11:06:18.544837 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa2fc407-51d4-4d62-914c-f83ac5ba1fb3" containerName="registry-server" Feb 02 11:06:18 crc kubenswrapper[4901]: I0202 11:06:18.544880 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e499a01-9c9c-44ca-a1d2-e35912fba103" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 02 11:06:18 crc kubenswrapper[4901]: I0202 11:06:18.546142 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwmn9" Feb 02 11:06:18 crc kubenswrapper[4901]: I0202 11:06:18.550453 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 02 11:06:18 crc kubenswrapper[4901]: I0202 11:06:18.551493 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kblkj" Feb 02 11:06:18 crc kubenswrapper[4901]: I0202 11:06:18.551854 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 02 11:06:18 crc kubenswrapper[4901]: I0202 11:06:18.553151 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 02 11:06:18 crc kubenswrapper[4901]: I0202 11:06:18.554257 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwmn9"] Feb 02 11:06:18 crc kubenswrapper[4901]: I0202 11:06:18.628984 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c354603-4591-439e-b60a-4c46e1b31678-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vwmn9\" (UID: \"2c354603-4591-439e-b60a-4c46e1b31678\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwmn9" Feb 02 11:06:18 crc kubenswrapper[4901]: I0202 11:06:18.629140 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c354603-4591-439e-b60a-4c46e1b31678-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vwmn9\" (UID: \"2c354603-4591-439e-b60a-4c46e1b31678\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwmn9" Feb 02 11:06:18 crc kubenswrapper[4901]: I0202 11:06:18.629189 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlktw\" (UniqueName: \"kubernetes.io/projected/2c354603-4591-439e-b60a-4c46e1b31678-kube-api-access-mlktw\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vwmn9\" (UID: \"2c354603-4591-439e-b60a-4c46e1b31678\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwmn9" Feb 02 11:06:18 crc kubenswrapper[4901]: I0202 11:06:18.731253 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c354603-4591-439e-b60a-4c46e1b31678-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vwmn9\" (UID: \"2c354603-4591-439e-b60a-4c46e1b31678\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwmn9" Feb 02 11:06:18 crc kubenswrapper[4901]: I0202 11:06:18.733497 4901 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c354603-4591-439e-b60a-4c46e1b31678-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vwmn9\" (UID: \"2c354603-4591-439e-b60a-4c46e1b31678\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwmn9" Feb 02 11:06:18 crc kubenswrapper[4901]: I0202 11:06:18.733860 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlktw\" (UniqueName: \"kubernetes.io/projected/2c354603-4591-439e-b60a-4c46e1b31678-kube-api-access-mlktw\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vwmn9\" (UID: \"2c354603-4591-439e-b60a-4c46e1b31678\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwmn9" Feb 02 11:06:18 crc kubenswrapper[4901]: I0202 11:06:18.736745 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c354603-4591-439e-b60a-4c46e1b31678-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vwmn9\" (UID: \"2c354603-4591-439e-b60a-4c46e1b31678\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwmn9" Feb 02 11:06:18 crc kubenswrapper[4901]: I0202 11:06:18.736978 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c354603-4591-439e-b60a-4c46e1b31678-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vwmn9\" (UID: \"2c354603-4591-439e-b60a-4c46e1b31678\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwmn9" Feb 02 11:06:18 crc kubenswrapper[4901]: I0202 11:06:18.756607 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlktw\" (UniqueName: \"kubernetes.io/projected/2c354603-4591-439e-b60a-4c46e1b31678-kube-api-access-mlktw\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vwmn9\" (UID: \"2c354603-4591-439e-b60a-4c46e1b31678\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwmn9" Feb 02 11:06:18 crc kubenswrapper[4901]: I0202 11:06:18.875552 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwmn9" Feb 02 11:06:19 crc kubenswrapper[4901]: I0202 11:06:19.458631 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwmn9"] Feb 02 11:06:20 crc kubenswrapper[4901]: I0202 11:06:20.435920 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwmn9" event={"ID":"2c354603-4591-439e-b60a-4c46e1b31678","Type":"ContainerStarted","Data":"d5859a93bf24bcbc4e01ea5cca72d612eff875506c8ffeab2642008b81fc3163"} Feb 02 11:06:20 crc kubenswrapper[4901]: I0202 11:06:20.436503 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwmn9" event={"ID":"2c354603-4591-439e-b60a-4c46e1b31678","Type":"ContainerStarted","Data":"2a4b0f3a58dd50b80985ec0636a8bbebce1905ca5732e83d1144f4fa05141f07"} Feb 02 11:06:20 crc kubenswrapper[4901]: I0202 11:06:20.460125 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwmn9" podStartSLOduration=1.9381282899999999 podStartE2EDuration="2.460104957s" podCreationTimestamp="2026-02-02 11:06:18 +0000 UTC" firstStartedPulling="2026-02-02 11:06:19.473712895 +0000 UTC m=+1666.492053001" lastFinishedPulling="2026-02-02 11:06:19.995689562 +0000 UTC m=+1667.014029668" observedRunningTime="2026-02-02 11:06:20.454265045 +0000 UTC m=+1667.472605151" watchObservedRunningTime="2026-02-02 11:06:20.460104957 +0000 UTC m=+1667.478445053" Feb 02 11:06:25 crc kubenswrapper[4901]: I0202 11:06:25.049425 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-8v2wt"] Feb 02 11:06:25 crc kubenswrapper[4901]: I0202 11:06:25.062608 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-xh5rm"] Feb 02 11:06:25 crc kubenswrapper[4901]: I0202 11:06:25.081785 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-xh5rm"] Feb 02 11:06:25 crc kubenswrapper[4901]: I0202 11:06:25.092610 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-cdcb9"] Feb 02 11:06:25 crc kubenswrapper[4901]: I0202 11:06:25.100879 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-8v2wt"] Feb 02 11:06:25 crc kubenswrapper[4901]: I0202 11:06:25.107865 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-cdcb9"] Feb 02 11:06:25 crc kubenswrapper[4901]: I0202 11:06:25.709611 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="093b2698-02da-479f-8d78-59e99a88d7c9" path="/var/lib/kubelet/pods/093b2698-02da-479f-8d78-59e99a88d7c9/volumes" Feb 02 11:06:25 crc kubenswrapper[4901]: I0202 11:06:25.712517 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="302af226-4326-48ef-bef0-02fab3943dbe" path="/var/lib/kubelet/pods/302af226-4326-48ef-bef0-02fab3943dbe/volumes" Feb 02 11:06:25 crc kubenswrapper[4901]: I0202 11:06:25.714191 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6995b916-7b6d-4b5e-8284-8b07fc09be1c" path="/var/lib/kubelet/pods/6995b916-7b6d-4b5e-8284-8b07fc09be1c/volumes" Feb 02 11:06:37 crc kubenswrapper[4901]: I0202 11:06:37.067530 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-hcpj4"] Feb 02 11:06:37 crc kubenswrapper[4901]: 
I0202 11:06:37.085083 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-2hz6j"] Feb 02 11:06:37 crc kubenswrapper[4901]: I0202 11:06:37.102488 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-hcpj4"] Feb 02 11:06:37 crc kubenswrapper[4901]: I0202 11:06:37.112744 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-2hz6j"] Feb 02 11:06:37 crc kubenswrapper[4901]: I0202 11:06:37.692690 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dc5e40d-ef90-4040-a247-114b55e0efa1" path="/var/lib/kubelet/pods/9dc5e40d-ef90-4040-a247-114b55e0efa1/volumes" Feb 02 11:06:37 crc kubenswrapper[4901]: I0202 11:06:37.694064 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cccec66b-6bb1-4799-9385-73a33d1cacec" path="/var/lib/kubelet/pods/cccec66b-6bb1-4799-9385-73a33d1cacec/volumes" Feb 02 11:06:37 crc kubenswrapper[4901]: I0202 11:06:37.838013 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:06:37 crc kubenswrapper[4901]: I0202 11:06:37.838096 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:06:37 crc kubenswrapper[4901]: I0202 11:06:37.838146 4901 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" Feb 02 11:06:37 crc kubenswrapper[4901]: I0202 11:06:37.838683 4901 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ffa74db446f03ef95c563afefd62346074442a8b53dd231baa8c9ad890a3defd"} pod="openshift-machine-config-operator/machine-config-daemon-f29d8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 11:06:37 crc kubenswrapper[4901]: I0202 11:06:37.838746 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" containerID="cri-o://ffa74db446f03ef95c563afefd62346074442a8b53dd231baa8c9ad890a3defd" gracePeriod=600 Feb 02 11:06:37 crc kubenswrapper[4901]: E0202 11:06:37.968718 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:06:38 crc kubenswrapper[4901]: I0202 11:06:38.656584 4901 generic.go:334] "Generic (PLEG): container finished" podID="756c113d-5d5e-424e-bdf5-494b7774def6" containerID="ffa74db446f03ef95c563afefd62346074442a8b53dd231baa8c9ad890a3defd" exitCode=0 Feb 02 11:06:38 crc kubenswrapper[4901]: I0202 11:06:38.656638 4901 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" event={"ID":"756c113d-5d5e-424e-bdf5-494b7774def6","Type":"ContainerDied","Data":"ffa74db446f03ef95c563afefd62346074442a8b53dd231baa8c9ad890a3defd"} Feb 02 11:06:38 crc kubenswrapper[4901]: I0202 11:06:38.657057 4901 scope.go:117] "RemoveContainer" containerID="d12b3f34b51be08e2afea2c530360f371fc81fc3d568189c03f49b623affb0ff" Feb 02 11:06:38 crc kubenswrapper[4901]: I0202 11:06:38.658091 4901 scope.go:117] "RemoveContainer" containerID="ffa74db446f03ef95c563afefd62346074442a8b53dd231baa8c9ad890a3defd" Feb 02 11:06:38 crc kubenswrapper[4901]: E0202 11:06:38.658700 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:06:41 crc kubenswrapper[4901]: I0202 11:06:41.807034 4901 scope.go:117] "RemoveContainer" containerID="ed61e656b80a17f0b5b5f0d5c3a178ca9ce7d41172db50d2e265e1c1b96489f2" Feb 02 11:06:41 crc kubenswrapper[4901]: I0202 11:06:41.869337 4901 scope.go:117] "RemoveContainer" containerID="0922b2b9b0497e40ada1bc3676e7fe6f72f77741d30f5607c490144792c6f340" Feb 02 11:06:41 crc kubenswrapper[4901]: I0202 11:06:41.922324 4901 scope.go:117] "RemoveContainer" containerID="a28a02d0bdc2931d65dc3f556e62817d0ff897942cd58cedfdf5703908e71552" Feb 02 11:06:41 crc kubenswrapper[4901]: I0202 11:06:41.980420 4901 scope.go:117] "RemoveContainer" containerID="ccbb00648e6b95f35fcdd653170188aa391581836312ac690b76d08b63da1456" Feb 02 11:06:42 crc kubenswrapper[4901]: I0202 11:06:42.023713 4901 scope.go:117] "RemoveContainer" containerID="8fd3da1c1853f0c537007d818cb99d6d002fc9920c2d2abbd4eee4162558f50b" Feb 02 11:06:42 crc kubenswrapper[4901]: I0202 11:06:42.079330 4901 scope.go:117] "RemoveContainer" containerID="0d4b36deec608eaf5895e2cca6089f88861ab4ce5d89ffe6b8e3debc06e6b9c8" Feb 02 11:06:42 crc kubenswrapper[4901]: I0202 11:06:42.162539 4901 scope.go:117] "RemoveContainer" containerID="befca0c3509f895cde608403ea7f89e7f0d5c37633d6304f56f765dd55b33a99" Feb 02 11:06:49 crc kubenswrapper[4901]: I0202 11:06:49.677706 4901 scope.go:117] "RemoveContainer" containerID="ffa74db446f03ef95c563afefd62346074442a8b53dd231baa8c9ad890a3defd" Feb 02 11:06:49 crc kubenswrapper[4901]: E0202 11:06:49.679720 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:07:03 crc kubenswrapper[4901]: I0202 11:07:03.683122 4901 scope.go:117] "RemoveContainer" containerID="ffa74db446f03ef95c563afefd62346074442a8b53dd231baa8c9ad890a3defd" Feb 02 11:07:03 crc kubenswrapper[4901]: E0202 11:07:03.684071 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:07:18 crc kubenswrapper[4901]: I0202 11:07:18.676974 4901 scope.go:117] "RemoveContainer" containerID="ffa74db446f03ef95c563afefd62346074442a8b53dd231baa8c9ad890a3defd" Feb 02 11:07:18 crc kubenswrapper[4901]: E0202 11:07:18.678250 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:07:20 crc kubenswrapper[4901]: I0202 11:07:20.053393 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-6237-account-create-update-fdqwt"] Feb 02 11:07:20 crc kubenswrapper[4901]: I0202 11:07:20.068283 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-6237-account-create-update-fdqwt"] Feb 02 11:07:21 crc kubenswrapper[4901]: I0202 11:07:21.694130 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24d314f1-b9e6-4e62-a599-533cf470eea2" path="/var/lib/kubelet/pods/24d314f1-b9e6-4e62-a599-533cf470eea2/volumes" Feb 02 11:07:23 crc kubenswrapper[4901]: I0202 11:07:23.052982 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-680b-account-create-update-fxm6v"] Feb 02 11:07:23 crc kubenswrapper[4901]: I0202 11:07:23.067400 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-tgmtn"] Feb 02 11:07:23 crc kubenswrapper[4901]: I0202 11:07:23.076886 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-2351-account-create-update-q5sln"] Feb 02 11:07:23 crc kubenswrapper[4901]: I0202 11:07:23.086267 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-680b-account-create-update-fxm6v"] Feb 02 11:07:23 crc kubenswrapper[4901]: I0202 11:07:23.095744 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-9ddr2"] Feb 02 11:07:23 crc kubenswrapper[4901]: I0202 11:07:23.106025 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-sbl76"] Feb 02 11:07:23 crc kubenswrapper[4901]: I0202 11:07:23.115783 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-tgmtn"] Feb 02 11:07:23 crc kubenswrapper[4901]: I0202 11:07:23.124409 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-2351-account-create-update-q5sln"] Feb 02 11:07:23 crc kubenswrapper[4901]: I0202 11:07:23.132223 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-9ddr2"] Feb 02 11:07:23 crc kubenswrapper[4901]: I0202 11:07:23.143270 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-sbl76"] Feb 02 11:07:23 crc kubenswrapper[4901]: I0202 11:07:23.703050 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b7c9bef-3927-44b7-928a-eb9584ec90ed" path="/var/lib/kubelet/pods/7b7c9bef-3927-44b7-928a-eb9584ec90ed/volumes" Feb 02 11:07:23 crc kubenswrapper[4901]: I0202 11:07:23.704196 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="80f838bb-a2da-496c-9338-50c97222d215" path="/var/lib/kubelet/pods/80f838bb-a2da-496c-9338-50c97222d215/volumes" Feb 02 11:07:23 crc kubenswrapper[4901]: I0202 11:07:23.705218 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96853bfa-70bb-4ce9-9900-618896081d6d" path="/var/lib/kubelet/pods/96853bfa-70bb-4ce9-9900-618896081d6d/volumes" Feb 02 11:07:23 crc kubenswrapper[4901]: I0202 11:07:23.706926 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1ac84d8-3f94-4dc8-ae48-61ab0cbd238e" path="/var/lib/kubelet/pods/a1ac84d8-3f94-4dc8-ae48-61ab0cbd238e/volumes" Feb 02 11:07:23 crc kubenswrapper[4901]: I0202 11:07:23.708461 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d859d976-7f57-49ae-88d5-b6f6d641f470" path="/var/lib/kubelet/pods/d859d976-7f57-49ae-88d5-b6f6d641f470/volumes" Feb 02 11:07:30 crc kubenswrapper[4901]: I0202 11:07:30.677186 4901 scope.go:117] "RemoveContainer" containerID="ffa74db446f03ef95c563afefd62346074442a8b53dd231baa8c9ad890a3defd" Feb 02 11:07:30 crc kubenswrapper[4901]: E0202 11:07:30.678596 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:07:40 crc kubenswrapper[4901]: I0202 11:07:40.395998 4901 generic.go:334] "Generic (PLEG): container finished" podID="2c354603-4591-439e-b60a-4c46e1b31678" containerID="d5859a93bf24bcbc4e01ea5cca72d612eff875506c8ffeab2642008b81fc3163" exitCode=0 Feb 02 11:07:40 crc kubenswrapper[4901]: I0202 11:07:40.396092 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwmn9" event={"ID":"2c354603-4591-439e-b60a-4c46e1b31678","Type":"ContainerDied","Data":"d5859a93bf24bcbc4e01ea5cca72d612eff875506c8ffeab2642008b81fc3163"} Feb 02 11:07:41 crc kubenswrapper[4901]: I0202 11:07:41.850725 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwmn9" Feb 02 11:07:41 crc kubenswrapper[4901]: I0202 11:07:41.877031 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c354603-4591-439e-b60a-4c46e1b31678-ssh-key-openstack-edpm-ipam\") pod \"2c354603-4591-439e-b60a-4c46e1b31678\" (UID: \"2c354603-4591-439e-b60a-4c46e1b31678\") " Feb 02 11:07:41 crc kubenswrapper[4901]: I0202 11:07:41.877165 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlktw\" (UniqueName: \"kubernetes.io/projected/2c354603-4591-439e-b60a-4c46e1b31678-kube-api-access-mlktw\") pod \"2c354603-4591-439e-b60a-4c46e1b31678\" (UID: \"2c354603-4591-439e-b60a-4c46e1b31678\") " Feb 02 11:07:41 crc kubenswrapper[4901]: I0202 11:07:41.877274 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c354603-4591-439e-b60a-4c46e1b31678-inventory\") pod \"2c354603-4591-439e-b60a-4c46e1b31678\" (UID: \"2c354603-4591-439e-b60a-4c46e1b31678\") " Feb 02 11:07:41 crc kubenswrapper[4901]: I0202 11:07:41.907260 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c354603-4591-439e-b60a-4c46e1b31678-kube-api-access-mlktw" (OuterVolumeSpecName: "kube-api-access-mlktw") pod "2c354603-4591-439e-b60a-4c46e1b31678" (UID: "2c354603-4591-439e-b60a-4c46e1b31678"). InnerVolumeSpecName "kube-api-access-mlktw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:07:41 crc kubenswrapper[4901]: I0202 11:07:41.915495 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c354603-4591-439e-b60a-4c46e1b31678-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2c354603-4591-439e-b60a-4c46e1b31678" (UID: "2c354603-4591-439e-b60a-4c46e1b31678"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:07:41 crc kubenswrapper[4901]: I0202 11:07:41.928284 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c354603-4591-439e-b60a-4c46e1b31678-inventory" (OuterVolumeSpecName: "inventory") pod "2c354603-4591-439e-b60a-4c46e1b31678" (UID: "2c354603-4591-439e-b60a-4c46e1b31678"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:07:41 crc kubenswrapper[4901]: I0202 11:07:41.978888 4901 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c354603-4591-439e-b60a-4c46e1b31678-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 02 11:07:41 crc kubenswrapper[4901]: I0202 11:07:41.978918 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlktw\" (UniqueName: \"kubernetes.io/projected/2c354603-4591-439e-b60a-4c46e1b31678-kube-api-access-mlktw\") on node \"crc\" DevicePath \"\"" Feb 02 11:07:41 crc kubenswrapper[4901]: I0202 11:07:41.978929 4901 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c354603-4591-439e-b60a-4c46e1b31678-inventory\") on node \"crc\" DevicePath \"\"" Feb 02 11:07:42 crc kubenswrapper[4901]: I0202 11:07:42.363300 4901 scope.go:117] "RemoveContainer" containerID="3e42b46e5f7ae441b51e74662c769f4b33bf1d42c810cbadf0dcff123b843238" Feb 02 11:07:42 crc kubenswrapper[4901]: I0202 11:07:42.392429 4901 scope.go:117] "RemoveContainer" containerID="24da6466b7882c96be9d5883619b05a6b9bf9a76130936323ed843e7e44b2e14" Feb 02 11:07:42 crc kubenswrapper[4901]: I0202 11:07:42.418225 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwmn9" event={"ID":"2c354603-4591-439e-b60a-4c46e1b31678","Type":"ContainerDied","Data":"2a4b0f3a58dd50b80985ec0636a8bbebce1905ca5732e83d1144f4fa05141f07"} Feb 02 11:07:42 crc kubenswrapper[4901]: I0202 11:07:42.418289 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a4b0f3a58dd50b80985ec0636a8bbebce1905ca5732e83d1144f4fa05141f07" Feb 02 11:07:42 crc kubenswrapper[4901]: I0202 11:07:42.418254 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vwmn9" Feb 02 11:07:42 crc kubenswrapper[4901]: I0202 11:07:42.481476 4901 scope.go:117] "RemoveContainer" containerID="ed65b107cc9d437e10c49c78175628db331a431bc3fbf2c78686746dd9b4bcd5" Feb 02 11:07:42 crc kubenswrapper[4901]: I0202 11:07:42.517911 4901 scope.go:117] "RemoveContainer" containerID="a3707cc47d224b422f8095a177de8a5f9ee619c50d20851c0754e43d40ce0a4e" Feb 02 11:07:42 crc kubenswrapper[4901]: I0202 11:07:42.534698 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wwz5b"] Feb 02 11:07:42 crc kubenswrapper[4901]: E0202 11:07:42.535300 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c354603-4591-439e-b60a-4c46e1b31678" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 02 11:07:42 crc kubenswrapper[4901]: I0202 11:07:42.535321 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c354603-4591-439e-b60a-4c46e1b31678" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 02 11:07:42 crc kubenswrapper[4901]: I0202 11:07:42.535624 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c354603-4591-439e-b60a-4c46e1b31678" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 02 11:07:42 crc kubenswrapper[4901]: I0202 11:07:42.536489 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wwz5b" Feb 02 11:07:42 crc kubenswrapper[4901]: I0202 11:07:42.542013 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 02 11:07:42 crc kubenswrapper[4901]: I0202 11:07:42.542812 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 02 11:07:42 crc kubenswrapper[4901]: I0202 11:07:42.543264 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kblkj" Feb 02 11:07:42 crc kubenswrapper[4901]: I0202 11:07:42.543477 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 02 11:07:42 crc kubenswrapper[4901]: I0202 11:07:42.556588 4901 scope.go:117] "RemoveContainer" containerID="74a3a02c4d45f49079e11017d06fc4aaed58d9249d701467b17779e608a64b8d" Feb 02 11:07:42 crc kubenswrapper[4901]: I0202 11:07:42.557258 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wwz5b"] Feb 02 11:07:42 crc kubenswrapper[4901]: I0202 11:07:42.584697 4901 scope.go:117] "RemoveContainer" containerID="252fa59c24152159ccf9762035b8bb767d7428a009f8e8be86aef2993df194a1" Feb 02 11:07:42 crc kubenswrapper[4901]: I0202 11:07:42.599303 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5b1278d5-723d-40a0-a7dc-b85ddcdbdde9-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wwz5b\" (UID: \"5b1278d5-723d-40a0-a7dc-b85ddcdbdde9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wwz5b" Feb 02 11:07:42 crc kubenswrapper[4901]: I0202 11:07:42.599589 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b1278d5-723d-40a0-a7dc-b85ddcdbdde9-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wwz5b\" (UID: \"5b1278d5-723d-40a0-a7dc-b85ddcdbdde9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wwz5b" Feb 02 11:07:42 crc kubenswrapper[4901]: I0202 11:07:42.599683 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvhcz\" (UniqueName: \"kubernetes.io/projected/5b1278d5-723d-40a0-a7dc-b85ddcdbdde9-kube-api-access-nvhcz\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wwz5b\" (UID: \"5b1278d5-723d-40a0-a7dc-b85ddcdbdde9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wwz5b" Feb 02 11:07:42 crc kubenswrapper[4901]: I0202 11:07:42.702402 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b1278d5-723d-40a0-a7dc-b85ddcdbdde9-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wwz5b\" (UID: \"5b1278d5-723d-40a0-a7dc-b85ddcdbdde9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wwz5b" Feb 02 11:07:42 crc kubenswrapper[4901]: I0202 11:07:42.702587 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvhcz\" (UniqueName: \"kubernetes.io/projected/5b1278d5-723d-40a0-a7dc-b85ddcdbdde9-kube-api-access-nvhcz\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-wwz5b\" (UID: \"5b1278d5-723d-40a0-a7dc-b85ddcdbdde9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wwz5b" Feb 02 11:07:42 crc kubenswrapper[4901]: I0202 11:07:42.702970 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5b1278d5-723d-40a0-a7dc-b85ddcdbdde9-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wwz5b\" (UID: \"5b1278d5-723d-40a0-a7dc-b85ddcdbdde9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wwz5b" Feb 02 11:07:42 crc kubenswrapper[4901]: I0202 11:07:42.709469 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b1278d5-723d-40a0-a7dc-b85ddcdbdde9-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wwz5b\" (UID: \"5b1278d5-723d-40a0-a7dc-b85ddcdbdde9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wwz5b" Feb 02 11:07:42 crc kubenswrapper[4901]: I0202 11:07:42.713175 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5b1278d5-723d-40a0-a7dc-b85ddcdbdde9-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wwz5b\" (UID: \"5b1278d5-723d-40a0-a7dc-b85ddcdbdde9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wwz5b" Feb 02 11:07:42 crc kubenswrapper[4901]: I0202 11:07:42.728894 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvhcz\" (UniqueName: \"kubernetes.io/projected/5b1278d5-723d-40a0-a7dc-b85ddcdbdde9-kube-api-access-nvhcz\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wwz5b\" (UID: \"5b1278d5-723d-40a0-a7dc-b85ddcdbdde9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wwz5b" Feb 02 11:07:42 crc kubenswrapper[4901]: I0202 11:07:42.855030 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wwz5b" Feb 02 11:07:43 crc kubenswrapper[4901]: I0202 11:07:43.465936 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wwz5b"] Feb 02 11:07:44 crc kubenswrapper[4901]: I0202 11:07:44.437852 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wwz5b" event={"ID":"5b1278d5-723d-40a0-a7dc-b85ddcdbdde9","Type":"ContainerStarted","Data":"69115fffa8060cba314dd49408a90cbbf569a77efe45b870c1c082829d2b4ea8"} Feb 02 11:07:44 crc kubenswrapper[4901]: I0202 11:07:44.438214 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wwz5b" event={"ID":"5b1278d5-723d-40a0-a7dc-b85ddcdbdde9","Type":"ContainerStarted","Data":"b9b3bbf044d5321348bf9cd9f711f25bb3b788370a8a7554561091591b460195"} Feb 02 11:07:44 crc kubenswrapper[4901]: I0202 11:07:44.468483 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wwz5b" podStartSLOduration=2.038665242 podStartE2EDuration="2.468458862s" podCreationTimestamp="2026-02-02 11:07:42 +0000 UTC" firstStartedPulling="2026-02-02 11:07:43.479746335 +0000 UTC m=+1750.498086431" lastFinishedPulling="2026-02-02 11:07:43.909539955 +0000 UTC m=+1750.927880051" observedRunningTime="2026-02-02 11:07:44.458702095 +0000 UTC m=+1751.477042191" watchObservedRunningTime="2026-02-02 11:07:44.468458862 +0000 UTC m=+1751.486798958" Feb 02 11:07:44 crc kubenswrapper[4901]: I0202 11:07:44.676736 4901 scope.go:117] "RemoveContainer" containerID="ffa74db446f03ef95c563afefd62346074442a8b53dd231baa8c9ad890a3defd" Feb 02 11:07:44 crc kubenswrapper[4901]: E0202 11:07:44.677334 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:07:49 crc kubenswrapper[4901]: I0202 11:07:49.491666 4901 generic.go:334] "Generic (PLEG): container finished" podID="5b1278d5-723d-40a0-a7dc-b85ddcdbdde9" containerID="69115fffa8060cba314dd49408a90cbbf569a77efe45b870c1c082829d2b4ea8" exitCode=0 Feb 02 11:07:49 crc kubenswrapper[4901]: I0202 11:07:49.491741 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wwz5b" event={"ID":"5b1278d5-723d-40a0-a7dc-b85ddcdbdde9","Type":"ContainerDied","Data":"69115fffa8060cba314dd49408a90cbbf569a77efe45b870c1c082829d2b4ea8"} Feb 02 11:07:51 crc kubenswrapper[4901]: I0202 11:07:51.034748 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wwz5b" Feb 02 11:07:51 crc kubenswrapper[4901]: I0202 11:07:51.214887 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5b1278d5-723d-40a0-a7dc-b85ddcdbdde9-ssh-key-openstack-edpm-ipam\") pod \"5b1278d5-723d-40a0-a7dc-b85ddcdbdde9\" (UID: \"5b1278d5-723d-40a0-a7dc-b85ddcdbdde9\") " Feb 02 11:07:51 crc kubenswrapper[4901]: I0202 11:07:51.215002 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvhcz\" (UniqueName: \"kubernetes.io/projected/5b1278d5-723d-40a0-a7dc-b85ddcdbdde9-kube-api-access-nvhcz\") pod \"5b1278d5-723d-40a0-a7dc-b85ddcdbdde9\" (UID: \"5b1278d5-723d-40a0-a7dc-b85ddcdbdde9\") " Feb 02 11:07:51 crc kubenswrapper[4901]: I0202 11:07:51.215027 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b1278d5-723d-40a0-a7dc-b85ddcdbdde9-inventory\") pod \"5b1278d5-723d-40a0-a7dc-b85ddcdbdde9\" (UID: \"5b1278d5-723d-40a0-a7dc-b85ddcdbdde9\") " Feb 02 11:07:51 crc kubenswrapper[4901]: I0202 11:07:51.224952 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b1278d5-723d-40a0-a7dc-b85ddcdbdde9-kube-api-access-nvhcz" (OuterVolumeSpecName: "kube-api-access-nvhcz") pod "5b1278d5-723d-40a0-a7dc-b85ddcdbdde9" (UID: "5b1278d5-723d-40a0-a7dc-b85ddcdbdde9"). InnerVolumeSpecName "kube-api-access-nvhcz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:07:51 crc kubenswrapper[4901]: I0202 11:07:51.245206 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b1278d5-723d-40a0-a7dc-b85ddcdbdde9-inventory" (OuterVolumeSpecName: "inventory") pod "5b1278d5-723d-40a0-a7dc-b85ddcdbdde9" (UID: "5b1278d5-723d-40a0-a7dc-b85ddcdbdde9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:07:51 crc kubenswrapper[4901]: I0202 11:07:51.245243 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b1278d5-723d-40a0-a7dc-b85ddcdbdde9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5b1278d5-723d-40a0-a7dc-b85ddcdbdde9" (UID: "5b1278d5-723d-40a0-a7dc-b85ddcdbdde9"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:07:51 crc kubenswrapper[4901]: I0202 11:07:51.317549 4901 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5b1278d5-723d-40a0-a7dc-b85ddcdbdde9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 02 11:07:51 crc kubenswrapper[4901]: I0202 11:07:51.317586 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvhcz\" (UniqueName: \"kubernetes.io/projected/5b1278d5-723d-40a0-a7dc-b85ddcdbdde9-kube-api-access-nvhcz\") on node \"crc\" DevicePath \"\"" Feb 02 11:07:51 crc kubenswrapper[4901]: I0202 11:07:51.317597 4901 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b1278d5-723d-40a0-a7dc-b85ddcdbdde9-inventory\") on node \"crc\" DevicePath \"\"" Feb 02 11:07:51 crc kubenswrapper[4901]: I0202 11:07:51.517835 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wwz5b" event={"ID":"5b1278d5-723d-40a0-a7dc-b85ddcdbdde9","Type":"ContainerDied","Data":"b9b3bbf044d5321348bf9cd9f711f25bb3b788370a8a7554561091591b460195"} Feb 02 11:07:51 crc kubenswrapper[4901]: I0202 11:07:51.517883 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9b3bbf044d5321348bf9cd9f711f25bb3b788370a8a7554561091591b460195" Feb 02 11:07:51 crc kubenswrapper[4901]: I0202 11:07:51.517971 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wwz5b" Feb 02 11:07:51 crc kubenswrapper[4901]: I0202 11:07:51.625946 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-vws9z"] Feb 02 11:07:51 crc kubenswrapper[4901]: E0202 11:07:51.626515 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b1278d5-723d-40a0-a7dc-b85ddcdbdde9" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 02 11:07:51 crc kubenswrapper[4901]: I0202 11:07:51.626554 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b1278d5-723d-40a0-a7dc-b85ddcdbdde9" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 02 11:07:51 crc kubenswrapper[4901]: I0202 11:07:51.626878 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b1278d5-723d-40a0-a7dc-b85ddcdbdde9" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 02 11:07:51 crc kubenswrapper[4901]: I0202 11:07:51.627902 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-vws9z" Feb 02 11:07:51 crc kubenswrapper[4901]: I0202 11:07:51.630792 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 02 11:07:51 crc kubenswrapper[4901]: I0202 11:07:51.632242 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kblkj" Feb 02 11:07:51 crc kubenswrapper[4901]: I0202 11:07:51.637726 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 02 11:07:51 crc kubenswrapper[4901]: I0202 11:07:51.641394 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 02 11:07:51 crc kubenswrapper[4901]: I0202 11:07:51.651399 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-vws9z"] Feb 02 11:07:51 crc kubenswrapper[4901]: I0202 11:07:51.726635 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f654258-65f3-42c5-a4d8-c131145a91cc-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-vws9z\" (UID: \"1f654258-65f3-42c5-a4d8-c131145a91cc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-vws9z" Feb 02 11:07:51 crc kubenswrapper[4901]: I0202 11:07:51.727102 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f654258-65f3-42c5-a4d8-c131145a91cc-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-vws9z\" (UID: \"1f654258-65f3-42c5-a4d8-c131145a91cc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-vws9z" Feb 02 11:07:51 crc kubenswrapper[4901]: I0202 11:07:51.727184 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbhnx\" (UniqueName: \"kubernetes.io/projected/1f654258-65f3-42c5-a4d8-c131145a91cc-kube-api-access-gbhnx\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-vws9z\" (UID: \"1f654258-65f3-42c5-a4d8-c131145a91cc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-vws9z" Feb 02 11:07:51 crc kubenswrapper[4901]: I0202 11:07:51.830044 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f654258-65f3-42c5-a4d8-c131145a91cc-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-vws9z\" (UID: \"1f654258-65f3-42c5-a4d8-c131145a91cc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-vws9z" Feb 02 11:07:51 crc kubenswrapper[4901]: I0202 11:07:51.830235 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f654258-65f3-42c5-a4d8-c131145a91cc-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-vws9z\" (UID: \"1f654258-65f3-42c5-a4d8-c131145a91cc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-vws9z" Feb 02 11:07:51 crc kubenswrapper[4901]: I0202 11:07:51.831226 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbhnx\" (UniqueName: \"kubernetes.io/projected/1f654258-65f3-42c5-a4d8-c131145a91cc-kube-api-access-gbhnx\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-vws9z\" (UID: \"1f654258-65f3-42c5-a4d8-c131145a91cc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-vws9z" Feb 02 11:07:51 crc kubenswrapper[4901]: I0202 11:07:51.835488 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f654258-65f3-42c5-a4d8-c131145a91cc-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-vws9z\" (UID: \"1f654258-65f3-42c5-a4d8-c131145a91cc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-vws9z" Feb 02 11:07:51 crc kubenswrapper[4901]: I0202 11:07:51.836539 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f654258-65f3-42c5-a4d8-c131145a91cc-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-vws9z\" (UID: \"1f654258-65f3-42c5-a4d8-c131145a91cc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-vws9z" Feb 02 11:07:51 crc kubenswrapper[4901]: I0202 11:07:51.863841 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbhnx\" (UniqueName: \"kubernetes.io/projected/1f654258-65f3-42c5-a4d8-c131145a91cc-kube-api-access-gbhnx\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-vws9z\" (UID: \"1f654258-65f3-42c5-a4d8-c131145a91cc\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-vws9z" Feb 02 11:07:51 crc kubenswrapper[4901]: I0202 11:07:51.949296 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-vws9z" Feb 02 11:07:52 crc kubenswrapper[4901]: I0202 11:07:52.342767 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-vws9z"] Feb 02 11:07:52 crc kubenswrapper[4901]: I0202 11:07:52.532486 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-vws9z" event={"ID":"1f654258-65f3-42c5-a4d8-c131145a91cc","Type":"ContainerStarted","Data":"bc67b37ad5d5a1f87c1de8ef060a4eee19cef9f271471f19f78de5730981b065"} Feb 02 11:07:53 crc kubenswrapper[4901]: I0202 11:07:53.070474 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-tkq95"] Feb 02 11:07:53 crc kubenswrapper[4901]: I0202 11:07:53.077676 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-tkq95"] Feb 02 11:07:53 crc kubenswrapper[4901]: I0202 11:07:53.543808 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-vws9z" event={"ID":"1f654258-65f3-42c5-a4d8-c131145a91cc","Type":"ContainerStarted","Data":"b260a72db748504b49ae97a348a34675cb79d92aaecce85c12a80713301beb3a"} Feb 02 11:07:53 crc kubenswrapper[4901]: I0202 11:07:53.567496 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-vws9z" podStartSLOduration=2.126244297 podStartE2EDuration="2.567470395s" podCreationTimestamp="2026-02-02 11:07:51 +0000 UTC" firstStartedPulling="2026-02-02 11:07:52.351753345 +0000 UTC m=+1759.370093451" lastFinishedPulling="2026-02-02 11:07:52.792979453 +0000 UTC m=+1759.811319549" observedRunningTime="2026-02-02 11:07:53.560161358 +0000 UTC m=+1760.578501474" watchObservedRunningTime="2026-02-02 11:07:53.567470395 +0000 UTC m=+1760.585810501" Feb 
Feb 02 11:07:53 crc kubenswrapper[4901]: I0202 11:07:53.771762 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8b136f1-ea46-4ebc-ade6-03f996a089ce" path="/var/lib/kubelet/pods/a8b136f1-ea46-4ebc-ade6-03f996a089ce/volumes"
Feb 02 11:07:56 crc kubenswrapper[4901]: I0202 11:07:56.677598 4901 scope.go:117] "RemoveContainer" containerID="ffa74db446f03ef95c563afefd62346074442a8b53dd231baa8c9ad890a3defd"
Feb 02 11:07:56 crc kubenswrapper[4901]: E0202 11:07:56.678417 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6"
Feb 02 11:08:10 crc kubenswrapper[4901]: I0202 11:08:10.677834 4901 scope.go:117] "RemoveContainer" containerID="ffa74db446f03ef95c563afefd62346074442a8b53dd231baa8c9ad890a3defd"
Feb 02 11:08:10 crc kubenswrapper[4901]: E0202 11:08:10.679317 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6"
Feb 02 11:08:23 crc kubenswrapper[4901]: I0202 11:08:23.685639 4901 scope.go:117] "RemoveContainer" containerID="ffa74db446f03ef95c563afefd62346074442a8b53dd231baa8c9ad890a3defd"
Feb 02 11:08:23 crc kubenswrapper[4901]: E0202 11:08:23.686416 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6"
Feb 02 11:08:28 crc kubenswrapper[4901]: I0202 11:08:28.921448 4901 generic.go:334] "Generic (PLEG): container finished" podID="1f654258-65f3-42c5-a4d8-c131145a91cc" containerID="b260a72db748504b49ae97a348a34675cb79d92aaecce85c12a80713301beb3a" exitCode=0
Feb 02 11:08:28 crc kubenswrapper[4901]: I0202 11:08:28.921593 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-vws9z" event={"ID":"1f654258-65f3-42c5-a4d8-c131145a91cc","Type":"ContainerDied","Data":"b260a72db748504b49ae97a348a34675cb79d92aaecce85c12a80713301beb3a"}
Feb 02 11:08:30 crc kubenswrapper[4901]: I0202 11:08:30.386071 4901 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-vws9z" Feb 02 11:08:30 crc kubenswrapper[4901]: I0202 11:08:30.451622 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbhnx\" (UniqueName: \"kubernetes.io/projected/1f654258-65f3-42c5-a4d8-c131145a91cc-kube-api-access-gbhnx\") pod \"1f654258-65f3-42c5-a4d8-c131145a91cc\" (UID: \"1f654258-65f3-42c5-a4d8-c131145a91cc\") " Feb 02 11:08:30 crc kubenswrapper[4901]: I0202 11:08:30.452011 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f654258-65f3-42c5-a4d8-c131145a91cc-ssh-key-openstack-edpm-ipam\") pod \"1f654258-65f3-42c5-a4d8-c131145a91cc\" (UID: \"1f654258-65f3-42c5-a4d8-c131145a91cc\") " Feb 02 11:08:30 crc kubenswrapper[4901]: I0202 11:08:30.452228 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f654258-65f3-42c5-a4d8-c131145a91cc-inventory\") pod \"1f654258-65f3-42c5-a4d8-c131145a91cc\" (UID: \"1f654258-65f3-42c5-a4d8-c131145a91cc\") " Feb 02 11:08:30 crc kubenswrapper[4901]: I0202 11:08:30.459086 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f654258-65f3-42c5-a4d8-c131145a91cc-kube-api-access-gbhnx" (OuterVolumeSpecName: "kube-api-access-gbhnx") pod "1f654258-65f3-42c5-a4d8-c131145a91cc" (UID: "1f654258-65f3-42c5-a4d8-c131145a91cc"). InnerVolumeSpecName "kube-api-access-gbhnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:08:30 crc kubenswrapper[4901]: I0202 11:08:30.482441 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f654258-65f3-42c5-a4d8-c131145a91cc-inventory" (OuterVolumeSpecName: "inventory") pod "1f654258-65f3-42c5-a4d8-c131145a91cc" (UID: "1f654258-65f3-42c5-a4d8-c131145a91cc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:08:30 crc kubenswrapper[4901]: I0202 11:08:30.490670 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f654258-65f3-42c5-a4d8-c131145a91cc-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1f654258-65f3-42c5-a4d8-c131145a91cc" (UID: "1f654258-65f3-42c5-a4d8-c131145a91cc"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:08:30 crc kubenswrapper[4901]: I0202 11:08:30.556373 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbhnx\" (UniqueName: \"kubernetes.io/projected/1f654258-65f3-42c5-a4d8-c131145a91cc-kube-api-access-gbhnx\") on node \"crc\" DevicePath \"\"" Feb 02 11:08:30 crc kubenswrapper[4901]: I0202 11:08:30.556412 4901 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f654258-65f3-42c5-a4d8-c131145a91cc-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 02 11:08:30 crc kubenswrapper[4901]: I0202 11:08:30.556427 4901 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f654258-65f3-42c5-a4d8-c131145a91cc-inventory\") on node \"crc\" DevicePath \"\"" Feb 02 11:08:30 crc kubenswrapper[4901]: I0202 11:08:30.945201 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-vws9z" event={"ID":"1f654258-65f3-42c5-a4d8-c131145a91cc","Type":"ContainerDied","Data":"bc67b37ad5d5a1f87c1de8ef060a4eee19cef9f271471f19f78de5730981b065"} Feb 02 11:08:30 crc kubenswrapper[4901]: I0202 11:08:30.945267 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc67b37ad5d5a1f87c1de8ef060a4eee19cef9f271471f19f78de5730981b065" Feb 02 11:08:30 crc kubenswrapper[4901]: I0202 11:08:30.945274 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-vws9z" Feb 02 11:08:31 crc kubenswrapper[4901]: I0202 11:08:31.047514 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qgw5h"] Feb 02 11:08:31 crc kubenswrapper[4901]: E0202 11:08:31.047999 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f654258-65f3-42c5-a4d8-c131145a91cc" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 02 11:08:31 crc kubenswrapper[4901]: I0202 11:08:31.048021 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f654258-65f3-42c5-a4d8-c131145a91cc" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 02 11:08:31 crc kubenswrapper[4901]: I0202 11:08:31.048223 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f654258-65f3-42c5-a4d8-c131145a91cc" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 02 11:08:31 crc kubenswrapper[4901]: I0202 11:08:31.048983 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qgw5h" Feb 02 11:08:31 crc kubenswrapper[4901]: I0202 11:08:31.053018 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 02 11:08:31 crc kubenswrapper[4901]: I0202 11:08:31.053114 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kblkj" Feb 02 11:08:31 crc kubenswrapper[4901]: I0202 11:08:31.053113 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 02 11:08:31 crc kubenswrapper[4901]: I0202 11:08:31.055043 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 02 11:08:31 crc kubenswrapper[4901]: I0202 11:08:31.084057 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qgw5h"] Feb 02 11:08:31 crc kubenswrapper[4901]: I0202 11:08:31.174497 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjmxk\" (UniqueName: \"kubernetes.io/projected/b013b75b-71ef-4701-9f44-298496253710-kube-api-access-mjmxk\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qgw5h\" (UID: \"b013b75b-71ef-4701-9f44-298496253710\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qgw5h" Feb 02 11:08:31 crc kubenswrapper[4901]: I0202 11:08:31.175010 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b013b75b-71ef-4701-9f44-298496253710-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qgw5h\" (UID: \"b013b75b-71ef-4701-9f44-298496253710\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qgw5h" Feb 02 11:08:31 crc kubenswrapper[4901]: I0202 11:08:31.175304 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b013b75b-71ef-4701-9f44-298496253710-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qgw5h\" (UID: \"b013b75b-71ef-4701-9f44-298496253710\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qgw5h" Feb 02 11:08:31 crc kubenswrapper[4901]: I0202 11:08:31.278423 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjmxk\" (UniqueName: \"kubernetes.io/projected/b013b75b-71ef-4701-9f44-298496253710-kube-api-access-mjmxk\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qgw5h\" (UID: \"b013b75b-71ef-4701-9f44-298496253710\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qgw5h" Feb 02 11:08:31 crc kubenswrapper[4901]: I0202 11:08:31.278544 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b013b75b-71ef-4701-9f44-298496253710-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qgw5h\" (UID: \"b013b75b-71ef-4701-9f44-298496253710\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qgw5h" Feb 02 11:08:31 crc kubenswrapper[4901]: I0202 11:08:31.278724 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/b013b75b-71ef-4701-9f44-298496253710-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qgw5h\" (UID: \"b013b75b-71ef-4701-9f44-298496253710\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qgw5h" Feb 02 11:08:31 crc kubenswrapper[4901]: I0202 11:08:31.286987 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b013b75b-71ef-4701-9f44-298496253710-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qgw5h\" (UID: \"b013b75b-71ef-4701-9f44-298496253710\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qgw5h" Feb 02 11:08:31 crc kubenswrapper[4901]: I0202 11:08:31.287788 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b013b75b-71ef-4701-9f44-298496253710-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qgw5h\" (UID: \"b013b75b-71ef-4701-9f44-298496253710\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qgw5h" Feb 02 11:08:31 crc kubenswrapper[4901]: I0202 11:08:31.301205 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjmxk\" (UniqueName: \"kubernetes.io/projected/b013b75b-71ef-4701-9f44-298496253710-kube-api-access-mjmxk\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qgw5h\" (UID: \"b013b75b-71ef-4701-9f44-298496253710\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qgw5h" Feb 02 11:08:31 crc kubenswrapper[4901]: I0202 11:08:31.381993 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qgw5h" Feb 02 11:08:32 crc kubenswrapper[4901]: I0202 11:08:32.002744 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qgw5h"] Feb 02 11:08:32 crc kubenswrapper[4901]: I0202 11:08:32.969338 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qgw5h" event={"ID":"b013b75b-71ef-4701-9f44-298496253710","Type":"ContainerStarted","Data":"9cac547fdb71a70f1991c3781bac4968c5e950fb8e1a4fe6835e4f684b984a04"} Feb 02 11:08:32 crc kubenswrapper[4901]: I0202 11:08:32.970264 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qgw5h" event={"ID":"b013b75b-71ef-4701-9f44-298496253710","Type":"ContainerStarted","Data":"90cfc924be8b529eff48aa31ec71b20d6310f8a6aa8a06163135d44c134c382c"} Feb 02 11:08:33 crc kubenswrapper[4901]: I0202 11:08:33.008875 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qgw5h" podStartSLOduration=1.549208606 podStartE2EDuration="2.008844942s" podCreationTimestamp="2026-02-02 11:08:31 +0000 UTC" firstStartedPulling="2026-02-02 11:08:32.010087571 +0000 UTC m=+1799.028427667" lastFinishedPulling="2026-02-02 11:08:32.469723907 +0000 UTC m=+1799.488064003" observedRunningTime="2026-02-02 11:08:33.002139569 +0000 UTC m=+1800.020479695" watchObservedRunningTime="2026-02-02 11:08:33.008844942 +0000 UTC m=+1800.027185078" Feb 02 11:08:35 crc kubenswrapper[4901]: I0202 11:08:35.678211 4901 scope.go:117] "RemoveContainer" containerID="ffa74db446f03ef95c563afefd62346074442a8b53dd231baa8c9ad890a3defd" Feb 02 11:08:35 crc kubenswrapper[4901]: 
E0202 11:08:35.679225 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:08:42 crc kubenswrapper[4901]: I0202 11:08:42.754339 4901 scope.go:117] "RemoveContainer" containerID="a2a7d78b63e56ffd1ba178e49403ee5ef5281be5b212d55719d95fa6b95e1047" Feb 02 11:08:46 crc kubenswrapper[4901]: I0202 11:08:46.087206 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-px8gs"] Feb 02 11:08:46 crc kubenswrapper[4901]: I0202 11:08:46.109699 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-px8gs"] Feb 02 11:08:47 crc kubenswrapper[4901]: I0202 11:08:47.050459 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-b7wx4"] Feb 02 11:08:47 crc kubenswrapper[4901]: I0202 11:08:47.062966 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-b7wx4"] Feb 02 11:08:47 crc kubenswrapper[4901]: I0202 11:08:47.689593 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8358c85-6b94-4c56-911c-3396c7e780e5" path="/var/lib/kubelet/pods/d8358c85-6b94-4c56-911c-3396c7e780e5/volumes" Feb 02 11:08:47 crc kubenswrapper[4901]: I0202 11:08:47.690524 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee49f3db-ba4a-4c9c-893e-edde04607cf4" path="/var/lib/kubelet/pods/ee49f3db-ba4a-4c9c-893e-edde04607cf4/volumes" Feb 02 11:08:48 crc kubenswrapper[4901]: I0202 11:08:48.677833 4901 scope.go:117] "RemoveContainer" containerID="ffa74db446f03ef95c563afefd62346074442a8b53dd231baa8c9ad890a3defd" Feb 02 11:08:48 crc kubenswrapper[4901]: E0202 11:08:48.678931 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:08:59 crc kubenswrapper[4901]: I0202 11:08:59.677330 4901 scope.go:117] "RemoveContainer" containerID="ffa74db446f03ef95c563afefd62346074442a8b53dd231baa8c9ad890a3defd" Feb 02 11:08:59 crc kubenswrapper[4901]: E0202 11:08:59.678583 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:09:13 crc kubenswrapper[4901]: I0202 11:09:13.689436 4901 scope.go:117] "RemoveContainer" containerID="ffa74db446f03ef95c563afefd62346074442a8b53dd231baa8c9ad890a3defd" Feb 02 11:09:13 crc kubenswrapper[4901]: E0202 11:09:13.690814 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:09:23 crc kubenswrapper[4901]: I0202 11:09:23.554678 4901 generic.go:334] "Generic (PLEG): container finished" podID="b013b75b-71ef-4701-9f44-298496253710" containerID="9cac547fdb71a70f1991c3781bac4968c5e950fb8e1a4fe6835e4f684b984a04" exitCode=0 Feb 02 11:09:23 crc kubenswrapper[4901]: I0202 11:09:23.554779 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qgw5h" event={"ID":"b013b75b-71ef-4701-9f44-298496253710","Type":"ContainerDied","Data":"9cac547fdb71a70f1991c3781bac4968c5e950fb8e1a4fe6835e4f684b984a04"} Feb 02 11:09:25 crc kubenswrapper[4901]: I0202 11:09:25.028917 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qgw5h" Feb 02 11:09:25 crc kubenswrapper[4901]: I0202 11:09:25.184009 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b013b75b-71ef-4701-9f44-298496253710-inventory\") pod \"b013b75b-71ef-4701-9f44-298496253710\" (UID: \"b013b75b-71ef-4701-9f44-298496253710\") " Feb 02 11:09:25 crc kubenswrapper[4901]: I0202 11:09:25.184269 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjmxk\" (UniqueName: \"kubernetes.io/projected/b013b75b-71ef-4701-9f44-298496253710-kube-api-access-mjmxk\") pod \"b013b75b-71ef-4701-9f44-298496253710\" (UID: \"b013b75b-71ef-4701-9f44-298496253710\") " Feb 02 11:09:25 crc kubenswrapper[4901]: I0202 11:09:25.184378 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b013b75b-71ef-4701-9f44-298496253710-ssh-key-openstack-edpm-ipam\") pod \"b013b75b-71ef-4701-9f44-298496253710\" (UID: \"b013b75b-71ef-4701-9f44-298496253710\") " Feb 02 11:09:25 crc kubenswrapper[4901]: I0202 11:09:25.199039 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b013b75b-71ef-4701-9f44-298496253710-kube-api-access-mjmxk" (OuterVolumeSpecName: "kube-api-access-mjmxk") pod "b013b75b-71ef-4701-9f44-298496253710" (UID: "b013b75b-71ef-4701-9f44-298496253710"). InnerVolumeSpecName "kube-api-access-mjmxk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:09:25 crc kubenswrapper[4901]: I0202 11:09:25.216470 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b013b75b-71ef-4701-9f44-298496253710-inventory" (OuterVolumeSpecName: "inventory") pod "b013b75b-71ef-4701-9f44-298496253710" (UID: "b013b75b-71ef-4701-9f44-298496253710"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:09:25 crc kubenswrapper[4901]: I0202 11:09:25.217045 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b013b75b-71ef-4701-9f44-298496253710-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b013b75b-71ef-4701-9f44-298496253710" (UID: "b013b75b-71ef-4701-9f44-298496253710"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:09:25 crc kubenswrapper[4901]: I0202 11:09:25.286917 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjmxk\" (UniqueName: \"kubernetes.io/projected/b013b75b-71ef-4701-9f44-298496253710-kube-api-access-mjmxk\") on node \"crc\" DevicePath \"\"" Feb 02 11:09:25 crc kubenswrapper[4901]: I0202 11:09:25.287216 4901 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b013b75b-71ef-4701-9f44-298496253710-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 02 11:09:25 crc kubenswrapper[4901]: I0202 11:09:25.287295 4901 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b013b75b-71ef-4701-9f44-298496253710-inventory\") on node \"crc\" DevicePath \"\"" Feb 02 11:09:25 crc kubenswrapper[4901]: I0202 11:09:25.576207 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qgw5h" event={"ID":"b013b75b-71ef-4701-9f44-298496253710","Type":"ContainerDied","Data":"90cfc924be8b529eff48aa31ec71b20d6310f8a6aa8a06163135d44c134c382c"} Feb 02 11:09:25 crc kubenswrapper[4901]: I0202 11:09:25.576271 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90cfc924be8b529eff48aa31ec71b20d6310f8a6aa8a06163135d44c134c382c" Feb 02 11:09:25 crc kubenswrapper[4901]: I0202 11:09:25.576292 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qgw5h" Feb 02 11:09:25 crc kubenswrapper[4901]: I0202 11:09:25.668064 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-75pt4"] Feb 02 11:09:25 crc kubenswrapper[4901]: E0202 11:09:25.668680 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b013b75b-71ef-4701-9f44-298496253710" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 02 11:09:25 crc kubenswrapper[4901]: I0202 11:09:25.668701 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="b013b75b-71ef-4701-9f44-298496253710" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 02 11:09:25 crc kubenswrapper[4901]: I0202 11:09:25.668937 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="b013b75b-71ef-4701-9f44-298496253710" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 02 11:09:25 crc kubenswrapper[4901]: I0202 11:09:25.672772 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-75pt4" Feb 02 11:09:25 crc kubenswrapper[4901]: I0202 11:09:25.683230 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 02 11:09:25 crc kubenswrapper[4901]: I0202 11:09:25.683805 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kblkj" Feb 02 11:09:25 crc kubenswrapper[4901]: I0202 11:09:25.684029 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 02 11:09:25 crc kubenswrapper[4901]: I0202 11:09:25.684193 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 02 11:09:25 crc kubenswrapper[4901]: I0202 11:09:25.691400 4901 scope.go:117] "RemoveContainer" containerID="ffa74db446f03ef95c563afefd62346074442a8b53dd231baa8c9ad890a3defd" Feb 02 11:09:25 crc kubenswrapper[4901]: E0202 11:09:25.706811 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:09:25 crc kubenswrapper[4901]: I0202 11:09:25.726692 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-75pt4"] Feb 02 11:09:25 crc kubenswrapper[4901]: I0202 11:09:25.820961 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/fedb3e3c-4180-4787-9d18-4fd5af89ad69-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-75pt4\" (UID: \"fedb3e3c-4180-4787-9d18-4fd5af89ad69\") " pod="openstack/ssh-known-hosts-edpm-deployment-75pt4" Feb 02 11:09:25 crc kubenswrapper[4901]: I0202 11:09:25.821041 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v74lc\" (UniqueName: \"kubernetes.io/projected/fedb3e3c-4180-4787-9d18-4fd5af89ad69-kube-api-access-v74lc\") pod \"ssh-known-hosts-edpm-deployment-75pt4\" (UID: \"fedb3e3c-4180-4787-9d18-4fd5af89ad69\") " pod="openstack/ssh-known-hosts-edpm-deployment-75pt4" Feb 02 11:09:25 crc kubenswrapper[4901]: I0202 11:09:25.821504 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fedb3e3c-4180-4787-9d18-4fd5af89ad69-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-75pt4\" (UID: \"fedb3e3c-4180-4787-9d18-4fd5af89ad69\") " pod="openstack/ssh-known-hosts-edpm-deployment-75pt4" Feb 02 11:09:25 crc kubenswrapper[4901]: I0202 11:09:25.923271 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/fedb3e3c-4180-4787-9d18-4fd5af89ad69-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-75pt4\" (UID: \"fedb3e3c-4180-4787-9d18-4fd5af89ad69\") " pod="openstack/ssh-known-hosts-edpm-deployment-75pt4" Feb 02 11:09:25 crc kubenswrapper[4901]: I0202 11:09:25.923340 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v74lc\" (UniqueName: 
\"kubernetes.io/projected/fedb3e3c-4180-4787-9d18-4fd5af89ad69-kube-api-access-v74lc\") pod \"ssh-known-hosts-edpm-deployment-75pt4\" (UID: \"fedb3e3c-4180-4787-9d18-4fd5af89ad69\") " pod="openstack/ssh-known-hosts-edpm-deployment-75pt4" Feb 02 11:09:25 crc kubenswrapper[4901]: I0202 11:09:25.923431 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fedb3e3c-4180-4787-9d18-4fd5af89ad69-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-75pt4\" (UID: \"fedb3e3c-4180-4787-9d18-4fd5af89ad69\") " pod="openstack/ssh-known-hosts-edpm-deployment-75pt4" Feb 02 11:09:25 crc kubenswrapper[4901]: I0202 11:09:25.927340 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/fedb3e3c-4180-4787-9d18-4fd5af89ad69-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-75pt4\" (UID: \"fedb3e3c-4180-4787-9d18-4fd5af89ad69\") " pod="openstack/ssh-known-hosts-edpm-deployment-75pt4" Feb 02 11:09:25 crc kubenswrapper[4901]: I0202 11:09:25.927465 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fedb3e3c-4180-4787-9d18-4fd5af89ad69-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-75pt4\" (UID: \"fedb3e3c-4180-4787-9d18-4fd5af89ad69\") " pod="openstack/ssh-known-hosts-edpm-deployment-75pt4" Feb 02 11:09:25 crc kubenswrapper[4901]: I0202 11:09:25.939398 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v74lc\" (UniqueName: \"kubernetes.io/projected/fedb3e3c-4180-4787-9d18-4fd5af89ad69-kube-api-access-v74lc\") pod \"ssh-known-hosts-edpm-deployment-75pt4\" (UID: \"fedb3e3c-4180-4787-9d18-4fd5af89ad69\") " pod="openstack/ssh-known-hosts-edpm-deployment-75pt4" Feb 02 11:09:26 crc kubenswrapper[4901]: I0202 11:09:26.020166 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-75pt4" Feb 02 11:09:26 crc kubenswrapper[4901]: I0202 11:09:26.597712 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-75pt4"] Feb 02 11:09:26 crc kubenswrapper[4901]: I0202 11:09:26.599008 4901 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 11:09:27 crc kubenswrapper[4901]: I0202 11:09:27.594457 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-75pt4" event={"ID":"fedb3e3c-4180-4787-9d18-4fd5af89ad69","Type":"ContainerStarted","Data":"ecfad147aa149fa62ea694ee5b7b504660a44b2dee41ddc150cd1d7383111443"} Feb 02 11:09:27 crc kubenswrapper[4901]: I0202 11:09:27.594859 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-75pt4" event={"ID":"fedb3e3c-4180-4787-9d18-4fd5af89ad69","Type":"ContainerStarted","Data":"bacb5f08175fba84a22f0e6aa7c6ff2517ccb34bfb702476246fd4e5014197fe"} Feb 02 11:09:27 crc kubenswrapper[4901]: I0202 11:09:27.619061 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-75pt4" podStartSLOduration=2.2087353419999998 podStartE2EDuration="2.61903626s" podCreationTimestamp="2026-02-02 11:09:25 +0000 UTC" firstStartedPulling="2026-02-02 11:09:26.59873203 +0000 UTC m=+1853.617072116" lastFinishedPulling="2026-02-02 11:09:27.009032928 +0000 UTC m=+1854.027373034" observedRunningTime="2026-02-02 11:09:27.618297452 +0000 UTC m=+1854.636637548" watchObservedRunningTime="2026-02-02 11:09:27.61903626 +0000 UTC m=+1854.637376356" Feb 02 11:09:30 crc kubenswrapper[4901]: I0202 11:09:30.056868 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-grf2k"] Feb 02 11:09:30 crc kubenswrapper[4901]: I0202 11:09:30.070642 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-grf2k"] Feb 02 11:09:31 crc kubenswrapper[4901]: I0202 11:09:31.688265 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a2b10de-90f9-49b1-aa3e-92508a93e8ee" path="/var/lib/kubelet/pods/9a2b10de-90f9-49b1-aa3e-92508a93e8ee/volumes" Feb 02 11:09:34 crc kubenswrapper[4901]: I0202 11:09:34.681148 4901 generic.go:334] "Generic (PLEG): container finished" podID="fedb3e3c-4180-4787-9d18-4fd5af89ad69" containerID="ecfad147aa149fa62ea694ee5b7b504660a44b2dee41ddc150cd1d7383111443" exitCode=0 Feb 02 11:09:34 crc kubenswrapper[4901]: I0202 11:09:34.681219 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-75pt4" event={"ID":"fedb3e3c-4180-4787-9d18-4fd5af89ad69","Type":"ContainerDied","Data":"ecfad147aa149fa62ea694ee5b7b504660a44b2dee41ddc150cd1d7383111443"} Feb 02 11:09:36 crc kubenswrapper[4901]: I0202 11:09:36.152589 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-75pt4" Feb 02 11:09:36 crc kubenswrapper[4901]: I0202 11:09:36.271051 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fedb3e3c-4180-4787-9d18-4fd5af89ad69-ssh-key-openstack-edpm-ipam\") pod \"fedb3e3c-4180-4787-9d18-4fd5af89ad69\" (UID: \"fedb3e3c-4180-4787-9d18-4fd5af89ad69\") " Feb 02 11:09:36 crc kubenswrapper[4901]: I0202 11:09:36.271352 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/fedb3e3c-4180-4787-9d18-4fd5af89ad69-inventory-0\") pod \"fedb3e3c-4180-4787-9d18-4fd5af89ad69\" (UID: \"fedb3e3c-4180-4787-9d18-4fd5af89ad69\") " Feb 02 11:09:36 crc kubenswrapper[4901]: I0202 11:09:36.271589 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v74lc\" (UniqueName: \"kubernetes.io/projected/fedb3e3c-4180-4787-9d18-4fd5af89ad69-kube-api-access-v74lc\") pod \"fedb3e3c-4180-4787-9d18-4fd5af89ad69\" (UID: \"fedb3e3c-4180-4787-9d18-4fd5af89ad69\") " Feb 02 11:09:36 crc kubenswrapper[4901]: I0202 11:09:36.279585 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fedb3e3c-4180-4787-9d18-4fd5af89ad69-kube-api-access-v74lc" (OuterVolumeSpecName: "kube-api-access-v74lc") pod "fedb3e3c-4180-4787-9d18-4fd5af89ad69" (UID: "fedb3e3c-4180-4787-9d18-4fd5af89ad69"). InnerVolumeSpecName "kube-api-access-v74lc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:09:36 crc kubenswrapper[4901]: I0202 11:09:36.307462 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fedb3e3c-4180-4787-9d18-4fd5af89ad69-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "fedb3e3c-4180-4787-9d18-4fd5af89ad69" (UID: "fedb3e3c-4180-4787-9d18-4fd5af89ad69"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:09:36 crc kubenswrapper[4901]: I0202 11:09:36.309008 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fedb3e3c-4180-4787-9d18-4fd5af89ad69-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "fedb3e3c-4180-4787-9d18-4fd5af89ad69" (UID: "fedb3e3c-4180-4787-9d18-4fd5af89ad69"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:09:36 crc kubenswrapper[4901]: I0202 11:09:36.374495 4901 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/fedb3e3c-4180-4787-9d18-4fd5af89ad69-inventory-0\") on node \"crc\" DevicePath \"\"" Feb 02 11:09:36 crc kubenswrapper[4901]: I0202 11:09:36.374541 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v74lc\" (UniqueName: \"kubernetes.io/projected/fedb3e3c-4180-4787-9d18-4fd5af89ad69-kube-api-access-v74lc\") on node \"crc\" DevicePath \"\"" Feb 02 11:09:36 crc kubenswrapper[4901]: I0202 11:09:36.374556 4901 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fedb3e3c-4180-4787-9d18-4fd5af89ad69-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 02 11:09:36 crc kubenswrapper[4901]: I0202 11:09:36.677483 4901 scope.go:117] "RemoveContainer" containerID="ffa74db446f03ef95c563afefd62346074442a8b53dd231baa8c9ad890a3defd" Feb 02 11:09:36 crc kubenswrapper[4901]: E0202 11:09:36.677845 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:09:36 crc kubenswrapper[4901]: I0202 11:09:36.708951 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-75pt4" event={"ID":"fedb3e3c-4180-4787-9d18-4fd5af89ad69","Type":"ContainerDied","Data":"bacb5f08175fba84a22f0e6aa7c6ff2517ccb34bfb702476246fd4e5014197fe"} Feb 02 11:09:36 crc kubenswrapper[4901]: I0202 11:09:36.709004 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bacb5f08175fba84a22f0e6aa7c6ff2517ccb34bfb702476246fd4e5014197fe" Feb 02 11:09:36 crc kubenswrapper[4901]: I0202 11:09:36.709063 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-75pt4" Feb 02 11:09:36 crc kubenswrapper[4901]: I0202 11:09:36.933278 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-s5msw"] Feb 02 11:09:36 crc kubenswrapper[4901]: E0202 11:09:36.934138 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fedb3e3c-4180-4787-9d18-4fd5af89ad69" containerName="ssh-known-hosts-edpm-deployment" Feb 02 11:09:36 crc kubenswrapper[4901]: I0202 11:09:36.934182 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="fedb3e3c-4180-4787-9d18-4fd5af89ad69" containerName="ssh-known-hosts-edpm-deployment" Feb 02 11:09:36 crc kubenswrapper[4901]: I0202 11:09:36.934696 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="fedb3e3c-4180-4787-9d18-4fd5af89ad69" containerName="ssh-known-hosts-edpm-deployment" Feb 02 11:09:36 crc kubenswrapper[4901]: I0202 11:09:36.937049 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s5msw" Feb 02 11:09:36 crc kubenswrapper[4901]: I0202 11:09:36.940899 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 02 11:09:36 crc kubenswrapper[4901]: I0202 11:09:36.941139 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 02 11:09:36 crc kubenswrapper[4901]: I0202 11:09:36.941396 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 02 11:09:36 crc kubenswrapper[4901]: I0202 11:09:36.946024 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kblkj" Feb 02 11:09:36 crc kubenswrapper[4901]: I0202 11:09:36.950318 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-s5msw"] Feb 02 11:09:37 crc kubenswrapper[4901]: I0202 11:09:37.093913 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x88r9\" (UniqueName: \"kubernetes.io/projected/62710228-7012-4485-a5d9-16a9bf369635-kube-api-access-x88r9\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-s5msw\" (UID: \"62710228-7012-4485-a5d9-16a9bf369635\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s5msw" Feb 02 11:09:37 crc kubenswrapper[4901]: I0202 11:09:37.094140 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/62710228-7012-4485-a5d9-16a9bf369635-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-s5msw\" (UID: \"62710228-7012-4485-a5d9-16a9bf369635\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s5msw" Feb 02 11:09:37 crc kubenswrapper[4901]: I0202 11:09:37.094218 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62710228-7012-4485-a5d9-16a9bf369635-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-s5msw\" (UID: \"62710228-7012-4485-a5d9-16a9bf369635\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s5msw" Feb 02 11:09:37 crc kubenswrapper[4901]: I0202 11:09:37.197261 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/62710228-7012-4485-a5d9-16a9bf369635-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-s5msw\" (UID: \"62710228-7012-4485-a5d9-16a9bf369635\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s5msw" Feb 02 11:09:37 crc kubenswrapper[4901]: I0202 11:09:37.197466 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62710228-7012-4485-a5d9-16a9bf369635-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-s5msw\" (UID: \"62710228-7012-4485-a5d9-16a9bf369635\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s5msw" Feb 02 11:09:37 crc kubenswrapper[4901]: I0202 11:09:37.197790 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x88r9\" (UniqueName: \"kubernetes.io/projected/62710228-7012-4485-a5d9-16a9bf369635-kube-api-access-x88r9\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-s5msw\" (UID: \"62710228-7012-4485-a5d9-16a9bf369635\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s5msw" Feb 02 11:09:37 crc kubenswrapper[4901]: I0202 11:09:37.202044 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62710228-7012-4485-a5d9-16a9bf369635-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-s5msw\" (UID: \"62710228-7012-4485-a5d9-16a9bf369635\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s5msw" Feb 02 11:09:37 crc kubenswrapper[4901]: I0202 11:09:37.202469 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/62710228-7012-4485-a5d9-16a9bf369635-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-s5msw\" (UID: \"62710228-7012-4485-a5d9-16a9bf369635\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s5msw" Feb 02 11:09:37 crc kubenswrapper[4901]: I0202 11:09:37.222352 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x88r9\" (UniqueName: \"kubernetes.io/projected/62710228-7012-4485-a5d9-16a9bf369635-kube-api-access-x88r9\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-s5msw\" (UID: \"62710228-7012-4485-a5d9-16a9bf369635\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s5msw" Feb 02 11:09:37 crc kubenswrapper[4901]: I0202 11:09:37.275855 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s5msw" Feb 02 11:09:37 crc kubenswrapper[4901]: I0202 11:09:37.860163 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-s5msw"] Feb 02 11:09:38 crc kubenswrapper[4901]: I0202 11:09:38.731957 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s5msw" event={"ID":"62710228-7012-4485-a5d9-16a9bf369635","Type":"ContainerStarted","Data":"e861ade0a25081c979c39397243e59345b54e7320b44f6856cdf309e99e33e72"} Feb 02 11:09:38 crc kubenswrapper[4901]: I0202 11:09:38.732349 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s5msw" event={"ID":"62710228-7012-4485-a5d9-16a9bf369635","Type":"ContainerStarted","Data":"4e48e9fcfd2a4cce685e37f4326a12f8055ca782c9551fbbc1574bdef211f1ef"} Feb 02 11:09:38 crc kubenswrapper[4901]: I0202 11:09:38.752840 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s5msw" podStartSLOduration=2.233288886 podStartE2EDuration="2.752814722s" podCreationTimestamp="2026-02-02 11:09:36 +0000 UTC" firstStartedPulling="2026-02-02 11:09:37.864631777 +0000 UTC m=+1864.882971873" lastFinishedPulling="2026-02-02 11:09:38.384157593 +0000 UTC m=+1865.402497709" observedRunningTime="2026-02-02 11:09:38.74780667 +0000 UTC m=+1865.766146776" watchObservedRunningTime="2026-02-02 11:09:38.752814722 +0000 UTC m=+1865.771154838" Feb 02 11:09:42 crc kubenswrapper[4901]: I0202 11:09:42.868372 4901 scope.go:117] "RemoveContainer" containerID="6d8e3dd61cc735b767b199f84010d082f33ddc91756d3fb40b7c35b5efa4d229" Feb 02 11:09:42 crc kubenswrapper[4901]: I0202 11:09:42.943091 4901 scope.go:117] "RemoveContainer" containerID="9a9695c64faa468b97a3f94bfbf166c1e23c20daf90103f098e6af798c72619d" Feb 02 11:09:43 crc 
kubenswrapper[4901]: I0202 11:09:43.038474 4901 scope.go:117] "RemoveContainer" containerID="189cac09a9b7419b7ab679faec956a27c371dbc1f64303a8bf54089b023f82ad" Feb 02 11:09:49 crc kubenswrapper[4901]: I0202 11:09:49.851476 4901 generic.go:334] "Generic (PLEG): container finished" podID="62710228-7012-4485-a5d9-16a9bf369635" containerID="e861ade0a25081c979c39397243e59345b54e7320b44f6856cdf309e99e33e72" exitCode=0 Feb 02 11:09:49 crc kubenswrapper[4901]: I0202 11:09:49.851746 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s5msw" event={"ID":"62710228-7012-4485-a5d9-16a9bf369635","Type":"ContainerDied","Data":"e861ade0a25081c979c39397243e59345b54e7320b44f6856cdf309e99e33e72"} Feb 02 11:09:51 crc kubenswrapper[4901]: I0202 11:09:51.348391 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s5msw" Feb 02 11:09:51 crc kubenswrapper[4901]: I0202 11:09:51.362479 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62710228-7012-4485-a5d9-16a9bf369635-inventory\") pod \"62710228-7012-4485-a5d9-16a9bf369635\" (UID: \"62710228-7012-4485-a5d9-16a9bf369635\") " Feb 02 11:09:51 crc kubenswrapper[4901]: I0202 11:09:51.362551 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/62710228-7012-4485-a5d9-16a9bf369635-ssh-key-openstack-edpm-ipam\") pod \"62710228-7012-4485-a5d9-16a9bf369635\" (UID: \"62710228-7012-4485-a5d9-16a9bf369635\") " Feb 02 11:09:51 crc kubenswrapper[4901]: I0202 11:09:51.362703 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x88r9\" (UniqueName: \"kubernetes.io/projected/62710228-7012-4485-a5d9-16a9bf369635-kube-api-access-x88r9\") pod \"62710228-7012-4485-a5d9-16a9bf369635\" (UID: \"62710228-7012-4485-a5d9-16a9bf369635\") " Feb 02 11:09:51 crc kubenswrapper[4901]: I0202 11:09:51.389923 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62710228-7012-4485-a5d9-16a9bf369635-kube-api-access-x88r9" (OuterVolumeSpecName: "kube-api-access-x88r9") pod "62710228-7012-4485-a5d9-16a9bf369635" (UID: "62710228-7012-4485-a5d9-16a9bf369635"). InnerVolumeSpecName "kube-api-access-x88r9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:09:51 crc kubenswrapper[4901]: I0202 11:09:51.411386 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62710228-7012-4485-a5d9-16a9bf369635-inventory" (OuterVolumeSpecName: "inventory") pod "62710228-7012-4485-a5d9-16a9bf369635" (UID: "62710228-7012-4485-a5d9-16a9bf369635"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:09:51 crc kubenswrapper[4901]: I0202 11:09:51.422582 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62710228-7012-4485-a5d9-16a9bf369635-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "62710228-7012-4485-a5d9-16a9bf369635" (UID: "62710228-7012-4485-a5d9-16a9bf369635"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:09:51 crc kubenswrapper[4901]: I0202 11:09:51.468290 4901 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/62710228-7012-4485-a5d9-16a9bf369635-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 02 11:09:51 crc kubenswrapper[4901]: I0202 11:09:51.468336 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x88r9\" (UniqueName: \"kubernetes.io/projected/62710228-7012-4485-a5d9-16a9bf369635-kube-api-access-x88r9\") on node \"crc\" DevicePath \"\"" Feb 02 11:09:51 crc kubenswrapper[4901]: I0202 11:09:51.468351 4901 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62710228-7012-4485-a5d9-16a9bf369635-inventory\") on node \"crc\" DevicePath \"\"" Feb 02 11:09:51 crc kubenswrapper[4901]: I0202 11:09:51.678250 4901 scope.go:117] "RemoveContainer" containerID="ffa74db446f03ef95c563afefd62346074442a8b53dd231baa8c9ad890a3defd" Feb 02 11:09:51 crc kubenswrapper[4901]: E0202 11:09:51.678922 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:09:51 crc kubenswrapper[4901]: I0202 11:09:51.878778 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s5msw" event={"ID":"62710228-7012-4485-a5d9-16a9bf369635","Type":"ContainerDied","Data":"4e48e9fcfd2a4cce685e37f4326a12f8055ca782c9551fbbc1574bdef211f1ef"} Feb 02 11:09:51 crc kubenswrapper[4901]: I0202 11:09:51.878829 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e48e9fcfd2a4cce685e37f4326a12f8055ca782c9551fbbc1574bdef211f1ef" Feb 02 11:09:51 crc kubenswrapper[4901]: I0202 11:09:51.878897 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s5msw" Feb 02 11:09:51 crc kubenswrapper[4901]: I0202 11:09:51.989160 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mbbt7"] Feb 02 11:09:51 crc kubenswrapper[4901]: E0202 11:09:51.989779 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62710228-7012-4485-a5d9-16a9bf369635" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 02 11:09:51 crc kubenswrapper[4901]: I0202 11:09:51.989798 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="62710228-7012-4485-a5d9-16a9bf369635" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 02 11:09:51 crc kubenswrapper[4901]: I0202 11:09:51.990005 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="62710228-7012-4485-a5d9-16a9bf369635" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 02 11:09:51 crc kubenswrapper[4901]: I0202 11:09:51.990857 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mbbt7" Feb 02 11:09:51 crc kubenswrapper[4901]: I0202 11:09:51.994603 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 02 11:09:51 crc kubenswrapper[4901]: I0202 11:09:51.994694 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 02 11:09:51 crc kubenswrapper[4901]: I0202 11:09:51.994935 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 02 11:09:51 crc kubenswrapper[4901]: I0202 11:09:51.995179 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kblkj" Feb 02 11:09:52 crc kubenswrapper[4901]: I0202 11:09:52.007903 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mbbt7"] Feb 02 11:09:52 crc kubenswrapper[4901]: I0202 11:09:52.085190 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q8xt\" (UniqueName: \"kubernetes.io/projected/3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798-kube-api-access-9q8xt\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mbbt7\" (UID: \"3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mbbt7" Feb 02 11:09:52 crc kubenswrapper[4901]: I0202 11:09:52.085251 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mbbt7\" (UID: \"3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mbbt7" Feb 02 11:09:52 crc kubenswrapper[4901]: I0202 11:09:52.085661 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mbbt7\" (UID: \"3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mbbt7" Feb 02 11:09:52 crc kubenswrapper[4901]: I0202 11:09:52.188449 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mbbt7\" (UID: \"3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mbbt7" Feb 02 11:09:52 crc kubenswrapper[4901]: I0202 11:09:52.188587 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q8xt\" (UniqueName: \"kubernetes.io/projected/3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798-kube-api-access-9q8xt\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mbbt7\" (UID: \"3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mbbt7" Feb 02 11:09:52 crc kubenswrapper[4901]: I0202 11:09:52.188616 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-mbbt7\" (UID: \"3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mbbt7" Feb 02 11:09:52 crc kubenswrapper[4901]: I0202 11:09:52.197495 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mbbt7\" (UID: \"3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mbbt7" Feb 02 11:09:52 crc kubenswrapper[4901]: I0202 11:09:52.212607 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mbbt7\" (UID: \"3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mbbt7" Feb 02 11:09:52 crc kubenswrapper[4901]: I0202 11:09:52.220302 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q8xt\" (UniqueName: \"kubernetes.io/projected/3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798-kube-api-access-9q8xt\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mbbt7\" (UID: \"3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mbbt7" Feb 02 11:09:52 crc kubenswrapper[4901]: I0202 11:09:52.322269 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mbbt7" Feb 02 11:09:52 crc kubenswrapper[4901]: I0202 11:09:52.899766 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mbbt7"] Feb 02 11:09:52 crc kubenswrapper[4901]: W0202 11:09:52.903499 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a2b8c88_d6ea_4fe1_9719_ebb0b9d39798.slice/crio-ed800b4a2f843bf53303341f2109135cedfdcc3d7d30932282ed87dedadbb431 WatchSource:0}: Error finding container ed800b4a2f843bf53303341f2109135cedfdcc3d7d30932282ed87dedadbb431: Status 404 returned error can't find the container with id ed800b4a2f843bf53303341f2109135cedfdcc3d7d30932282ed87dedadbb431 Feb 02 11:09:53 crc kubenswrapper[4901]: I0202 11:09:53.900294 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mbbt7" event={"ID":"3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798","Type":"ContainerStarted","Data":"529a5702263cba57aee54f5e5cd6b9f619559a6692901ae6961bfeaed9cdf2ee"} Feb 02 11:09:53 crc kubenswrapper[4901]: I0202 11:09:53.900370 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mbbt7" event={"ID":"3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798","Type":"ContainerStarted","Data":"ed800b4a2f843bf53303341f2109135cedfdcc3d7d30932282ed87dedadbb431"} Feb 02 11:09:53 crc kubenswrapper[4901]: I0202 11:09:53.922016 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mbbt7" podStartSLOduration=2.483781566 podStartE2EDuration="2.921958761s" podCreationTimestamp="2026-02-02 11:09:51 +0000 UTC" firstStartedPulling="2026-02-02 11:09:52.907617076 +0000 UTC m=+1879.925957172" lastFinishedPulling="2026-02-02 11:09:53.345794231 +0000 UTC 
m=+1880.364134367" observedRunningTime="2026-02-02 11:09:53.917285278 +0000 UTC m=+1880.935625374" watchObservedRunningTime="2026-02-02 11:09:53.921958761 +0000 UTC m=+1880.940298857" Feb 02 11:10:02 crc kubenswrapper[4901]: I0202 11:10:02.677815 4901 scope.go:117] "RemoveContainer" containerID="ffa74db446f03ef95c563afefd62346074442a8b53dd231baa8c9ad890a3defd" Feb 02 11:10:02 crc kubenswrapper[4901]: E0202 11:10:02.678883 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:10:03 crc kubenswrapper[4901]: I0202 11:10:03.000124 4901 generic.go:334] "Generic (PLEG): container finished" podID="3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798" containerID="529a5702263cba57aee54f5e5cd6b9f619559a6692901ae6961bfeaed9cdf2ee" exitCode=0 Feb 02 11:10:03 crc kubenswrapper[4901]: I0202 11:10:03.000191 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mbbt7" event={"ID":"3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798","Type":"ContainerDied","Data":"529a5702263cba57aee54f5e5cd6b9f619559a6692901ae6961bfeaed9cdf2ee"} Feb 02 11:10:04 crc kubenswrapper[4901]: I0202 11:10:04.481844 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mbbt7" Feb 02 11:10:04 crc kubenswrapper[4901]: I0202 11:10:04.620749 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798-ssh-key-openstack-edpm-ipam\") pod \"3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798\" (UID: \"3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798\") " Feb 02 11:10:04 crc kubenswrapper[4901]: I0202 11:10:04.620860 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9q8xt\" (UniqueName: \"kubernetes.io/projected/3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798-kube-api-access-9q8xt\") pod \"3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798\" (UID: \"3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798\") " Feb 02 11:10:04 crc kubenswrapper[4901]: I0202 11:10:04.621368 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798-inventory\") pod \"3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798\" (UID: \"3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798\") " Feb 02 11:10:04 crc kubenswrapper[4901]: I0202 11:10:04.630859 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798-kube-api-access-9q8xt" (OuterVolumeSpecName: "kube-api-access-9q8xt") pod "3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798" (UID: "3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798"). InnerVolumeSpecName "kube-api-access-9q8xt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:10:04 crc kubenswrapper[4901]: I0202 11:10:04.651856 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798" (UID: "3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:10:04 crc kubenswrapper[4901]: I0202 11:10:04.659897 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798-inventory" (OuterVolumeSpecName: "inventory") pod "3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798" (UID: "3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:10:04 crc kubenswrapper[4901]: I0202 11:10:04.727515 4901 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798-inventory\") on node \"crc\" DevicePath \"\"" Feb 02 11:10:04 crc kubenswrapper[4901]: I0202 11:10:04.727607 4901 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 02 11:10:04 crc kubenswrapper[4901]: I0202 11:10:04.727626 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9q8xt\" (UniqueName: \"kubernetes.io/projected/3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798-kube-api-access-9q8xt\") on node \"crc\" DevicePath \"\"" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.032087 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mbbt7" event={"ID":"3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798","Type":"ContainerDied","Data":"ed800b4a2f843bf53303341f2109135cedfdcc3d7d30932282ed87dedadbb431"} Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.032196 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed800b4a2f843bf53303341f2109135cedfdcc3d7d30932282ed87dedadbb431" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.032143 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mbbt7" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.129380 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd"] Feb 02 11:10:05 crc kubenswrapper[4901]: E0202 11:10:05.130548 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.130587 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.130806 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.131651 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.137341 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-smpsd\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.137431 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cdafb787-54a7-457b-8c61-94ddf99cbb8c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-smpsd\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.137541 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-smpsd\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.137686 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cdafb787-54a7-457b-8c61-94ddf99cbb8c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-smpsd\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.137729 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-smpsd\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.137846 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-smpsd\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.137977 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cdafb787-54a7-457b-8c61-94ddf99cbb8c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-smpsd\" (UID: 
\"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.138059 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-smpsd\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.138130 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-smpsd\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.138166 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cdafb787-54a7-457b-8c61-94ddf99cbb8c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-smpsd\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.138208 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-smpsd\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.138239 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-smpsd\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.138284 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-smpsd\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.138318 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4nrr\" (UniqueName: \"kubernetes.io/projected/cdafb787-54a7-457b-8c61-94ddf99cbb8c-kube-api-access-l4nrr\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-smpsd\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.142012 4901 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.142107 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.142214 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.142500 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.142550 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.142632 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kblkj" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.142503 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.142977 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.155151 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd"] Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.239939 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-smpsd\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.240010 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4nrr\" (UniqueName: \"kubernetes.io/projected/cdafb787-54a7-457b-8c61-94ddf99cbb8c-kube-api-access-l4nrr\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-smpsd\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.240042 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-smpsd\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.240068 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cdafb787-54a7-457b-8c61-94ddf99cbb8c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-smpsd\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.240148 4901 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-smpsd\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.240196 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cdafb787-54a7-457b-8c61-94ddf99cbb8c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-smpsd\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.240218 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-smpsd\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.240259 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-smpsd\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.240309 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cdafb787-54a7-457b-8c61-94ddf99cbb8c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-smpsd\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.240339 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-smpsd\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.240382 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-smpsd\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.240415 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/cdafb787-54a7-457b-8c61-94ddf99cbb8c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-smpsd\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.240450 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-smpsd\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.240477 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-smpsd\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.247329 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-smpsd\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.248120 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-smpsd\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.248405 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-smpsd\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.248535 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-smpsd\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.248884 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-smpsd\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.249016 4901 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-smpsd\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.249212 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cdafb787-54a7-457b-8c61-94ddf99cbb8c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-smpsd\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.249853 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cdafb787-54a7-457b-8c61-94ddf99cbb8c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-smpsd\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.251149 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cdafb787-54a7-457b-8c61-94ddf99cbb8c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-smpsd\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.252182 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-smpsd\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.252201 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cdafb787-54a7-457b-8c61-94ddf99cbb8c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-smpsd\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.252442 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-smpsd\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.253152 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-inventory\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-smpsd\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.256763 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4nrr\" (UniqueName: \"kubernetes.io/projected/cdafb787-54a7-457b-8c61-94ddf99cbb8c-kube-api-access-l4nrr\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-smpsd\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:05 crc kubenswrapper[4901]: I0202 11:10:05.463946 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:06 crc kubenswrapper[4901]: I0202 11:10:06.091843 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd"] Feb 02 11:10:07 crc kubenswrapper[4901]: I0202 11:10:07.060636 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" event={"ID":"cdafb787-54a7-457b-8c61-94ddf99cbb8c","Type":"ContainerStarted","Data":"0f4e55b436fd70087c5aacea0a067818da7a7f7cd8e97231267f3397c0be6319"} Feb 02 11:10:08 crc kubenswrapper[4901]: I0202 11:10:08.071778 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" event={"ID":"cdafb787-54a7-457b-8c61-94ddf99cbb8c","Type":"ContainerStarted","Data":"d14fc66e05b9d89f99c09da824d75be1bf802090aac7fcce6e7c949664ca98e1"} Feb 02 11:10:08 crc kubenswrapper[4901]: I0202 11:10:08.100803 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" podStartSLOduration=2.205365408 podStartE2EDuration="3.100778918s" podCreationTimestamp="2026-02-02 11:10:05 +0000 UTC" firstStartedPulling="2026-02-02 11:10:06.101649424 +0000 UTC m=+1893.119989540" lastFinishedPulling="2026-02-02 11:10:06.997062934 +0000 UTC m=+1894.015403050" observedRunningTime="2026-02-02 11:10:08.094853805 +0000 UTC m=+1895.113193901" watchObservedRunningTime="2026-02-02 11:10:08.100778918 +0000 UTC m=+1895.119119014" Feb 02 11:10:17 crc kubenswrapper[4901]: I0202 11:10:17.678353 4901 scope.go:117] "RemoveContainer" containerID="ffa74db446f03ef95c563afefd62346074442a8b53dd231baa8c9ad890a3defd" Feb 02 11:10:17 crc kubenswrapper[4901]: E0202 11:10:17.679482 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:10:30 crc kubenswrapper[4901]: I0202 11:10:30.677274 4901 scope.go:117] "RemoveContainer" containerID="ffa74db446f03ef95c563afefd62346074442a8b53dd231baa8c9ad890a3defd" Feb 02 11:10:30 crc kubenswrapper[4901]: E0202 11:10:30.679667 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:10:42 crc kubenswrapper[4901]: I0202 11:10:42.677216 4901 scope.go:117] "RemoveContainer" containerID="ffa74db446f03ef95c563afefd62346074442a8b53dd231baa8c9ad890a3defd" Feb 02 11:10:42 crc kubenswrapper[4901]: E0202 11:10:42.678591 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:10:44 crc kubenswrapper[4901]: I0202 11:10:44.472623 4901 generic.go:334] "Generic (PLEG): container finished" podID="cdafb787-54a7-457b-8c61-94ddf99cbb8c" containerID="d14fc66e05b9d89f99c09da824d75be1bf802090aac7fcce6e7c949664ca98e1" exitCode=0 Feb 02 11:10:44 crc kubenswrapper[4901]: I0202 11:10:44.472757 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" event={"ID":"cdafb787-54a7-457b-8c61-94ddf99cbb8c","Type":"ContainerDied","Data":"d14fc66e05b9d89f99c09da824d75be1bf802090aac7fcce6e7c949664ca98e1"} Feb 02 11:10:45 crc kubenswrapper[4901]: I0202 11:10:45.962407 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.053852 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-libvirt-combined-ca-bundle\") pod \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.054037 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cdafb787-54a7-457b-8c61-94ddf99cbb8c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.054138 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cdafb787-54a7-457b-8c61-94ddf99cbb8c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.054286 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-ovn-combined-ca-bundle\") pod \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.054371 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-telemetry-combined-ca-bundle\") pod \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.054426 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-inventory\") pod \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.054471 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4nrr\" (UniqueName: \"kubernetes.io/projected/cdafb787-54a7-457b-8c61-94ddf99cbb8c-kube-api-access-l4nrr\") pod \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.054593 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-nova-combined-ca-bundle\") pod \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.055746 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cdafb787-54a7-457b-8c61-94ddf99cbb8c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.055812 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-repo-setup-combined-ca-bundle\") pod \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.055986 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-ssh-key-openstack-edpm-ipam\") pod \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.056329 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-neutron-metadata-combined-ca-bundle\") pod \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.056390 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-bootstrap-combined-ca-bundle\") pod \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.056446 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cdafb787-54a7-457b-8c61-94ddf99cbb8c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod 
\"cdafb787-54a7-457b-8c61-94ddf99cbb8c\" (UID: \"cdafb787-54a7-457b-8c61-94ddf99cbb8c\") " Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.066153 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdafb787-54a7-457b-8c61-94ddf99cbb8c-kube-api-access-l4nrr" (OuterVolumeSpecName: "kube-api-access-l4nrr") pod "cdafb787-54a7-457b-8c61-94ddf99cbb8c" (UID: "cdafb787-54a7-457b-8c61-94ddf99cbb8c"). InnerVolumeSpecName "kube-api-access-l4nrr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.066369 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdafb787-54a7-457b-8c61-94ddf99cbb8c-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "cdafb787-54a7-457b-8c61-94ddf99cbb8c" (UID: "cdafb787-54a7-457b-8c61-94ddf99cbb8c"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.066464 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "cdafb787-54a7-457b-8c61-94ddf99cbb8c" (UID: "cdafb787-54a7-457b-8c61-94ddf99cbb8c"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.066538 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "cdafb787-54a7-457b-8c61-94ddf99cbb8c" (UID: "cdafb787-54a7-457b-8c61-94ddf99cbb8c"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.067562 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdafb787-54a7-457b-8c61-94ddf99cbb8c-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "cdafb787-54a7-457b-8c61-94ddf99cbb8c" (UID: "cdafb787-54a7-457b-8c61-94ddf99cbb8c"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.068501 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "cdafb787-54a7-457b-8c61-94ddf99cbb8c" (UID: "cdafb787-54a7-457b-8c61-94ddf99cbb8c"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.071191 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "cdafb787-54a7-457b-8c61-94ddf99cbb8c" (UID: "cdafb787-54a7-457b-8c61-94ddf99cbb8c"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.073358 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "cdafb787-54a7-457b-8c61-94ddf99cbb8c" (UID: "cdafb787-54a7-457b-8c61-94ddf99cbb8c"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.079225 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "cdafb787-54a7-457b-8c61-94ddf99cbb8c" (UID: "cdafb787-54a7-457b-8c61-94ddf99cbb8c"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.080245 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdafb787-54a7-457b-8c61-94ddf99cbb8c-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "cdafb787-54a7-457b-8c61-94ddf99cbb8c" (UID: "cdafb787-54a7-457b-8c61-94ddf99cbb8c"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.081665 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdafb787-54a7-457b-8c61-94ddf99cbb8c-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "cdafb787-54a7-457b-8c61-94ddf99cbb8c" (UID: "cdafb787-54a7-457b-8c61-94ddf99cbb8c"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.089652 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "cdafb787-54a7-457b-8c61-94ddf99cbb8c" (UID: "cdafb787-54a7-457b-8c61-94ddf99cbb8c"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.107884 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-inventory" (OuterVolumeSpecName: "inventory") pod "cdafb787-54a7-457b-8c61-94ddf99cbb8c" (UID: "cdafb787-54a7-457b-8c61-94ddf99cbb8c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.148926 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "cdafb787-54a7-457b-8c61-94ddf99cbb8c" (UID: "cdafb787-54a7-457b-8c61-94ddf99cbb8c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.160194 4901 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.160242 4901 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.160253 4901 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-inventory\") on node \"crc\" DevicePath \"\"" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.160262 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4nrr\" (UniqueName: \"kubernetes.io/projected/cdafb787-54a7-457b-8c61-94ddf99cbb8c-kube-api-access-l4nrr\") on node \"crc\" DevicePath \"\"" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.160273 4901 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.160283 4901 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cdafb787-54a7-457b-8c61-94ddf99cbb8c-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.160297 4901 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.160310 4901 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.160320 4901 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.160330 4901 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.160343 4901 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cdafb787-54a7-457b-8c61-94ddf99cbb8c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.160355 4901 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdafb787-54a7-457b-8c61-94ddf99cbb8c-libvirt-combined-ca-bundle\") on 
node \"crc\" DevicePath \"\"" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.160366 4901 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cdafb787-54a7-457b-8c61-94ddf99cbb8c-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.160379 4901 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cdafb787-54a7-457b-8c61-94ddf99cbb8c-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.498080 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.498054 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-smpsd" event={"ID":"cdafb787-54a7-457b-8c61-94ddf99cbb8c","Type":"ContainerDied","Data":"0f4e55b436fd70087c5aacea0a067818da7a7f7cd8e97231267f3397c0be6319"} Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.498745 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f4e55b436fd70087c5aacea0a067818da7a7f7cd8e97231267f3397c0be6319" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.632011 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-mj2hb"] Feb 02 11:10:46 crc kubenswrapper[4901]: E0202 11:10:46.632630 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdafb787-54a7-457b-8c61-94ddf99cbb8c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.632659 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdafb787-54a7-457b-8c61-94ddf99cbb8c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.632963 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdafb787-54a7-457b-8c61-94ddf99cbb8c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.633916 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mj2hb" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.636494 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.636730 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kblkj" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.637191 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.637380 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.638528 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.646868 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-mj2hb"] Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.671786 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2vlc\" (UniqueName: \"kubernetes.io/projected/e169f03d-d9b9-4e8e-8847-65e846bf5722-kube-api-access-j2vlc\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mj2hb\" (UID: \"e169f03d-d9b9-4e8e-8847-65e846bf5722\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mj2hb" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.671861 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e169f03d-d9b9-4e8e-8847-65e846bf5722-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mj2hb\" (UID: \"e169f03d-d9b9-4e8e-8847-65e846bf5722\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mj2hb" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.671957 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e169f03d-d9b9-4e8e-8847-65e846bf5722-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mj2hb\" (UID: \"e169f03d-d9b9-4e8e-8847-65e846bf5722\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mj2hb" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.672028 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e169f03d-d9b9-4e8e-8847-65e846bf5722-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mj2hb\" (UID: \"e169f03d-d9b9-4e8e-8847-65e846bf5722\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mj2hb" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.672059 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e169f03d-d9b9-4e8e-8847-65e846bf5722-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mj2hb\" (UID: \"e169f03d-d9b9-4e8e-8847-65e846bf5722\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mj2hb" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.773837 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-j2vlc\" (UniqueName: \"kubernetes.io/projected/e169f03d-d9b9-4e8e-8847-65e846bf5722-kube-api-access-j2vlc\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mj2hb\" (UID: \"e169f03d-d9b9-4e8e-8847-65e846bf5722\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mj2hb" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.773900 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e169f03d-d9b9-4e8e-8847-65e846bf5722-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mj2hb\" (UID: \"e169f03d-d9b9-4e8e-8847-65e846bf5722\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mj2hb" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.774004 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e169f03d-d9b9-4e8e-8847-65e846bf5722-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mj2hb\" (UID: \"e169f03d-d9b9-4e8e-8847-65e846bf5722\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mj2hb" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.774105 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e169f03d-d9b9-4e8e-8847-65e846bf5722-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mj2hb\" (UID: \"e169f03d-d9b9-4e8e-8847-65e846bf5722\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mj2hb" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.774131 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e169f03d-d9b9-4e8e-8847-65e846bf5722-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mj2hb\" (UID: \"e169f03d-d9b9-4e8e-8847-65e846bf5722\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mj2hb" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.775979 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e169f03d-d9b9-4e8e-8847-65e846bf5722-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mj2hb\" (UID: \"e169f03d-d9b9-4e8e-8847-65e846bf5722\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mj2hb" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.778850 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e169f03d-d9b9-4e8e-8847-65e846bf5722-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mj2hb\" (UID: \"e169f03d-d9b9-4e8e-8847-65e846bf5722\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mj2hb" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.779728 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e169f03d-d9b9-4e8e-8847-65e846bf5722-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mj2hb\" (UID: \"e169f03d-d9b9-4e8e-8847-65e846bf5722\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mj2hb" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.779782 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/e169f03d-d9b9-4e8e-8847-65e846bf5722-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mj2hb\" (UID: \"e169f03d-d9b9-4e8e-8847-65e846bf5722\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mj2hb" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.794808 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2vlc\" (UniqueName: \"kubernetes.io/projected/e169f03d-d9b9-4e8e-8847-65e846bf5722-kube-api-access-j2vlc\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mj2hb\" (UID: \"e169f03d-d9b9-4e8e-8847-65e846bf5722\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mj2hb" Feb 02 11:10:46 crc kubenswrapper[4901]: I0202 11:10:46.979435 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mj2hb" Feb 02 11:10:47 crc kubenswrapper[4901]: I0202 11:10:47.625818 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-mj2hb"] Feb 02 11:10:48 crc kubenswrapper[4901]: I0202 11:10:48.527173 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mj2hb" event={"ID":"e169f03d-d9b9-4e8e-8847-65e846bf5722","Type":"ContainerStarted","Data":"65627a74de50e0326e30a508f959e08f75816ca311570614591e80927ffe9009"} Feb 02 11:10:49 crc kubenswrapper[4901]: I0202 11:10:49.541323 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mj2hb" event={"ID":"e169f03d-d9b9-4e8e-8847-65e846bf5722","Type":"ContainerStarted","Data":"faea1a11dcf8c8d8b41edd988b71f9203e213d26f8b30b2ba2dc9eb5495c9cf9"} Feb 02 11:10:49 crc kubenswrapper[4901]: I0202 11:10:49.588237 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mj2hb" podStartSLOduration=2.973792123 podStartE2EDuration="3.588214551s" podCreationTimestamp="2026-02-02 11:10:46 +0000 UTC" firstStartedPulling="2026-02-02 11:10:47.639311446 +0000 UTC m=+1934.657651542" lastFinishedPulling="2026-02-02 11:10:48.253733874 +0000 UTC m=+1935.272073970" observedRunningTime="2026-02-02 11:10:49.580818731 +0000 UTC m=+1936.599158827" watchObservedRunningTime="2026-02-02 11:10:49.588214551 +0000 UTC m=+1936.606554647" Feb 02 11:10:55 crc kubenswrapper[4901]: I0202 11:10:55.678072 4901 scope.go:117] "RemoveContainer" containerID="ffa74db446f03ef95c563afefd62346074442a8b53dd231baa8c9ad890a3defd" Feb 02 11:10:55 crc kubenswrapper[4901]: E0202 11:10:55.679663 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:11:06 crc kubenswrapper[4901]: I0202 11:11:06.677922 4901 scope.go:117] "RemoveContainer" containerID="ffa74db446f03ef95c563afefd62346074442a8b53dd231baa8c9ad890a3defd" Feb 02 11:11:06 crc kubenswrapper[4901]: E0202 11:11:06.678898 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:11:21 crc kubenswrapper[4901]: I0202 11:11:21.677688 4901 scope.go:117] "RemoveContainer" containerID="ffa74db446f03ef95c563afefd62346074442a8b53dd231baa8c9ad890a3defd" Feb 02 11:11:21 crc kubenswrapper[4901]: E0202 11:11:21.678773 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:11:34 crc kubenswrapper[4901]: I0202 11:11:34.677516 4901 scope.go:117] "RemoveContainer" containerID="ffa74db446f03ef95c563afefd62346074442a8b53dd231baa8c9ad890a3defd" Feb 02 11:11:34 crc kubenswrapper[4901]: E0202 11:11:34.678468 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:11:47 crc kubenswrapper[4901]: I0202 11:11:47.678147 4901 scope.go:117] "RemoveContainer" containerID="ffa74db446f03ef95c563afefd62346074442a8b53dd231baa8c9ad890a3defd" Feb 02 11:11:48 crc kubenswrapper[4901]: I0202 11:11:48.195336 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" event={"ID":"756c113d-5d5e-424e-bdf5-494b7774def6","Type":"ContainerStarted","Data":"d821fb60275f1396819dc073d4089937d058e706c3f293a4c0eb260496800e67"} Feb 02 11:11:55 crc kubenswrapper[4901]: I0202 11:11:55.295815 4901 generic.go:334] "Generic (PLEG): container finished" podID="e169f03d-d9b9-4e8e-8847-65e846bf5722" containerID="faea1a11dcf8c8d8b41edd988b71f9203e213d26f8b30b2ba2dc9eb5495c9cf9" exitCode=0 Feb 02 11:11:55 crc kubenswrapper[4901]: I0202 11:11:55.295956 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mj2hb" event={"ID":"e169f03d-d9b9-4e8e-8847-65e846bf5722","Type":"ContainerDied","Data":"faea1a11dcf8c8d8b41edd988b71f9203e213d26f8b30b2ba2dc9eb5495c9cf9"} Feb 02 11:11:56 crc kubenswrapper[4901]: I0202 11:11:56.855553 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mj2hb" Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.004824 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e169f03d-d9b9-4e8e-8847-65e846bf5722-ovncontroller-config-0\") pod \"e169f03d-d9b9-4e8e-8847-65e846bf5722\" (UID: \"e169f03d-d9b9-4e8e-8847-65e846bf5722\") " Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.004919 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2vlc\" (UniqueName: \"kubernetes.io/projected/e169f03d-d9b9-4e8e-8847-65e846bf5722-kube-api-access-j2vlc\") pod \"e169f03d-d9b9-4e8e-8847-65e846bf5722\" (UID: \"e169f03d-d9b9-4e8e-8847-65e846bf5722\") " Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.005146 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e169f03d-d9b9-4e8e-8847-65e846bf5722-ovn-combined-ca-bundle\") pod \"e169f03d-d9b9-4e8e-8847-65e846bf5722\" (UID: \"e169f03d-d9b9-4e8e-8847-65e846bf5722\") " Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.005177 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e169f03d-d9b9-4e8e-8847-65e846bf5722-inventory\") pod \"e169f03d-d9b9-4e8e-8847-65e846bf5722\" (UID: \"e169f03d-d9b9-4e8e-8847-65e846bf5722\") " Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.005251 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e169f03d-d9b9-4e8e-8847-65e846bf5722-ssh-key-openstack-edpm-ipam\") pod \"e169f03d-d9b9-4e8e-8847-65e846bf5722\" (UID: \"e169f03d-d9b9-4e8e-8847-65e846bf5722\") " Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.016544 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e169f03d-d9b9-4e8e-8847-65e846bf5722-kube-api-access-j2vlc" (OuterVolumeSpecName: "kube-api-access-j2vlc") pod "e169f03d-d9b9-4e8e-8847-65e846bf5722" (UID: "e169f03d-d9b9-4e8e-8847-65e846bf5722"). InnerVolumeSpecName "kube-api-access-j2vlc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.020616 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e169f03d-d9b9-4e8e-8847-65e846bf5722-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "e169f03d-d9b9-4e8e-8847-65e846bf5722" (UID: "e169f03d-d9b9-4e8e-8847-65e846bf5722"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.041839 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e169f03d-d9b9-4e8e-8847-65e846bf5722-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "e169f03d-d9b9-4e8e-8847-65e846bf5722" (UID: "e169f03d-d9b9-4e8e-8847-65e846bf5722"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.044168 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e169f03d-d9b9-4e8e-8847-65e846bf5722-inventory" (OuterVolumeSpecName: "inventory") pod "e169f03d-d9b9-4e8e-8847-65e846bf5722" (UID: "e169f03d-d9b9-4e8e-8847-65e846bf5722"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.062362 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e169f03d-d9b9-4e8e-8847-65e846bf5722-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e169f03d-d9b9-4e8e-8847-65e846bf5722" (UID: "e169f03d-d9b9-4e8e-8847-65e846bf5722"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.108843 4901 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e169f03d-d9b9-4e8e-8847-65e846bf5722-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.108913 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2vlc\" (UniqueName: \"kubernetes.io/projected/e169f03d-d9b9-4e8e-8847-65e846bf5722-kube-api-access-j2vlc\") on node \"crc\" DevicePath \"\"" Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.108928 4901 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e169f03d-d9b9-4e8e-8847-65e846bf5722-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.108941 4901 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e169f03d-d9b9-4e8e-8847-65e846bf5722-inventory\") on node \"crc\" DevicePath \"\"" Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.108955 4901 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e169f03d-d9b9-4e8e-8847-65e846bf5722-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.318526 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mj2hb" event={"ID":"e169f03d-d9b9-4e8e-8847-65e846bf5722","Type":"ContainerDied","Data":"65627a74de50e0326e30a508f959e08f75816ca311570614591e80927ffe9009"} Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.319115 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65627a74de50e0326e30a508f959e08f75816ca311570614591e80927ffe9009" Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.318666 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mj2hb" Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.494421 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6"] Feb 02 11:11:57 crc kubenswrapper[4901]: E0202 11:11:57.495166 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e169f03d-d9b9-4e8e-8847-65e846bf5722" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.495188 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="e169f03d-d9b9-4e8e-8847-65e846bf5722" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.495418 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="e169f03d-d9b9-4e8e-8847-65e846bf5722" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.496414 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6" Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.500064 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.500398 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.500523 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.500680 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.502014 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kblkj" Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.503234 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.509033 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6"] Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.625921 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8911f17d-aca5-4056-90a0-f6351983e4bf-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6\" (UID: \"8911f17d-aca5-4056-90a0-f6351983e4bf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6" Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.626277 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlrsq\" (UniqueName: \"kubernetes.io/projected/8911f17d-aca5-4056-90a0-f6351983e4bf-kube-api-access-nlrsq\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6\" (UID: \"8911f17d-aca5-4056-90a0-f6351983e4bf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6" Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.626340 4901 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8911f17d-aca5-4056-90a0-f6351983e4bf-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6\" (UID: \"8911f17d-aca5-4056-90a0-f6351983e4bf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6" Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.626417 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8911f17d-aca5-4056-90a0-f6351983e4bf-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6\" (UID: \"8911f17d-aca5-4056-90a0-f6351983e4bf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6" Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.626815 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8911f17d-aca5-4056-90a0-f6351983e4bf-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6\" (UID: \"8911f17d-aca5-4056-90a0-f6351983e4bf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6" Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.627241 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8911f17d-aca5-4056-90a0-f6351983e4bf-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6\" (UID: \"8911f17d-aca5-4056-90a0-f6351983e4bf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6" Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.729525 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8911f17d-aca5-4056-90a0-f6351983e4bf-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6\" (UID: \"8911f17d-aca5-4056-90a0-f6351983e4bf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6" Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.729659 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlrsq\" (UniqueName: \"kubernetes.io/projected/8911f17d-aca5-4056-90a0-f6351983e4bf-kube-api-access-nlrsq\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6\" (UID: \"8911f17d-aca5-4056-90a0-f6351983e4bf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6" Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.729691 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8911f17d-aca5-4056-90a0-f6351983e4bf-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6\" (UID: \"8911f17d-aca5-4056-90a0-f6351983e4bf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6" Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.729722 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/8911f17d-aca5-4056-90a0-f6351983e4bf-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6\" (UID: \"8911f17d-aca5-4056-90a0-f6351983e4bf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6" Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.729770 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8911f17d-aca5-4056-90a0-f6351983e4bf-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6\" (UID: \"8911f17d-aca5-4056-90a0-f6351983e4bf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6" Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.729865 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8911f17d-aca5-4056-90a0-f6351983e4bf-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6\" (UID: \"8911f17d-aca5-4056-90a0-f6351983e4bf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6" Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.737257 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8911f17d-aca5-4056-90a0-f6351983e4bf-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6\" (UID: \"8911f17d-aca5-4056-90a0-f6351983e4bf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6" Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.737353 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8911f17d-aca5-4056-90a0-f6351983e4bf-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6\" (UID: \"8911f17d-aca5-4056-90a0-f6351983e4bf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6" Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.738936 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8911f17d-aca5-4056-90a0-f6351983e4bf-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6\" (UID: \"8911f17d-aca5-4056-90a0-f6351983e4bf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6" Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.739405 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8911f17d-aca5-4056-90a0-f6351983e4bf-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6\" (UID: \"8911f17d-aca5-4056-90a0-f6351983e4bf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6" Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.745240 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8911f17d-aca5-4056-90a0-f6351983e4bf-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6\" (UID: \"8911f17d-aca5-4056-90a0-f6351983e4bf\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6" Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.759079 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlrsq\" (UniqueName: \"kubernetes.io/projected/8911f17d-aca5-4056-90a0-f6351983e4bf-kube-api-access-nlrsq\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6\" (UID: \"8911f17d-aca5-4056-90a0-f6351983e4bf\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6" Feb 02 11:11:57 crc kubenswrapper[4901]: I0202 11:11:57.827268 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6" Feb 02 11:11:58 crc kubenswrapper[4901]: I0202 11:11:58.434614 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6"] Feb 02 11:11:58 crc kubenswrapper[4901]: W0202 11:11:58.444438 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8911f17d_aca5_4056_90a0_f6351983e4bf.slice/crio-5d4f872cd902b8cb3d9617b8d4fc19dfd32290a76a5b85f9eb082b3aa5851a97 WatchSource:0}: Error finding container 5d4f872cd902b8cb3d9617b8d4fc19dfd32290a76a5b85f9eb082b3aa5851a97: Status 404 returned error can't find the container with id 5d4f872cd902b8cb3d9617b8d4fc19dfd32290a76a5b85f9eb082b3aa5851a97 Feb 02 11:11:59 crc kubenswrapper[4901]: I0202 11:11:59.340839 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6" event={"ID":"8911f17d-aca5-4056-90a0-f6351983e4bf","Type":"ContainerStarted","Data":"646e169986d0ce9c5f3b2f37226dc7d15a7b1c0a962d27f8563b19baf7a5bbc9"} Feb 02 11:11:59 crc kubenswrapper[4901]: I0202 11:11:59.341395 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6" event={"ID":"8911f17d-aca5-4056-90a0-f6351983e4bf","Type":"ContainerStarted","Data":"5d4f872cd902b8cb3d9617b8d4fc19dfd32290a76a5b85f9eb082b3aa5851a97"} Feb 02 11:11:59 crc kubenswrapper[4901]: I0202 11:11:59.374126 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6" podStartSLOduration=1.9117806389999998 podStartE2EDuration="2.374096755s" podCreationTimestamp="2026-02-02 11:11:57 +0000 UTC" firstStartedPulling="2026-02-02 11:11:58.448658854 +0000 UTC m=+2005.466998960" lastFinishedPulling="2026-02-02 11:11:58.91097496 +0000 UTC m=+2005.929315076" observedRunningTime="2026-02-02 11:11:59.356546548 +0000 UTC m=+2006.374886644" watchObservedRunningTime="2026-02-02 11:11:59.374096755 +0000 UTC m=+2006.392436851" Feb 02 11:12:49 crc kubenswrapper[4901]: I0202 11:12:49.850555 4901 generic.go:334] "Generic (PLEG): container finished" podID="8911f17d-aca5-4056-90a0-f6351983e4bf" containerID="646e169986d0ce9c5f3b2f37226dc7d15a7b1c0a962d27f8563b19baf7a5bbc9" exitCode=0 Feb 02 11:12:49 crc kubenswrapper[4901]: I0202 11:12:49.850664 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6" event={"ID":"8911f17d-aca5-4056-90a0-f6351983e4bf","Type":"ContainerDied","Data":"646e169986d0ce9c5f3b2f37226dc7d15a7b1c0a962d27f8563b19baf7a5bbc9"} Feb 02 11:12:51 crc kubenswrapper[4901]: I0202 11:12:51.398586 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6"
Feb 02 11:12:51 crc kubenswrapper[4901]: I0202 11:12:51.574275 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8911f17d-aca5-4056-90a0-f6351983e4bf-ssh-key-openstack-edpm-ipam\") pod \"8911f17d-aca5-4056-90a0-f6351983e4bf\" (UID: \"8911f17d-aca5-4056-90a0-f6351983e4bf\") "
Feb 02 11:12:51 crc kubenswrapper[4901]: I0202 11:12:51.574532 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8911f17d-aca5-4056-90a0-f6351983e4bf-nova-metadata-neutron-config-0\") pod \"8911f17d-aca5-4056-90a0-f6351983e4bf\" (UID: \"8911f17d-aca5-4056-90a0-f6351983e4bf\") "
Feb 02 11:12:51 crc kubenswrapper[4901]: I0202 11:12:51.574673 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8911f17d-aca5-4056-90a0-f6351983e4bf-neutron-ovn-metadata-agent-neutron-config-0\") pod \"8911f17d-aca5-4056-90a0-f6351983e4bf\" (UID: \"8911f17d-aca5-4056-90a0-f6351983e4bf\") "
Feb 02 11:12:51 crc kubenswrapper[4901]: I0202 11:12:51.574793 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlrsq\" (UniqueName: \"kubernetes.io/projected/8911f17d-aca5-4056-90a0-f6351983e4bf-kube-api-access-nlrsq\") pod \"8911f17d-aca5-4056-90a0-f6351983e4bf\" (UID: \"8911f17d-aca5-4056-90a0-f6351983e4bf\") "
Feb 02 11:12:51 crc kubenswrapper[4901]: I0202 11:12:51.575054 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8911f17d-aca5-4056-90a0-f6351983e4bf-neutron-metadata-combined-ca-bundle\") pod \"8911f17d-aca5-4056-90a0-f6351983e4bf\" (UID: \"8911f17d-aca5-4056-90a0-f6351983e4bf\") "
Feb 02 11:12:51 crc kubenswrapper[4901]: I0202 11:12:51.575137 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8911f17d-aca5-4056-90a0-f6351983e4bf-inventory\") pod \"8911f17d-aca5-4056-90a0-f6351983e4bf\" (UID: \"8911f17d-aca5-4056-90a0-f6351983e4bf\") "
Feb 02 11:12:51 crc kubenswrapper[4901]: I0202 11:12:51.588658 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8911f17d-aca5-4056-90a0-f6351983e4bf-kube-api-access-nlrsq" (OuterVolumeSpecName: "kube-api-access-nlrsq") pod "8911f17d-aca5-4056-90a0-f6351983e4bf" (UID: "8911f17d-aca5-4056-90a0-f6351983e4bf"). InnerVolumeSpecName "kube-api-access-nlrsq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 11:12:51 crc kubenswrapper[4901]: I0202 11:12:51.590811 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8911f17d-aca5-4056-90a0-f6351983e4bf-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "8911f17d-aca5-4056-90a0-f6351983e4bf" (UID: "8911f17d-aca5-4056-90a0-f6351983e4bf"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 11:12:51 crc kubenswrapper[4901]: I0202 11:12:51.608709 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8911f17d-aca5-4056-90a0-f6351983e4bf-inventory" (OuterVolumeSpecName: "inventory") pod "8911f17d-aca5-4056-90a0-f6351983e4bf" (UID: "8911f17d-aca5-4056-90a0-f6351983e4bf"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 11:12:51 crc kubenswrapper[4901]: I0202 11:12:51.614288 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8911f17d-aca5-4056-90a0-f6351983e4bf-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "8911f17d-aca5-4056-90a0-f6351983e4bf" (UID: "8911f17d-aca5-4056-90a0-f6351983e4bf"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 11:12:51 crc kubenswrapper[4901]: I0202 11:12:51.626783 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8911f17d-aca5-4056-90a0-f6351983e4bf-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "8911f17d-aca5-4056-90a0-f6351983e4bf" (UID: "8911f17d-aca5-4056-90a0-f6351983e4bf"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 11:12:51 crc kubenswrapper[4901]: I0202 11:12:51.630711 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8911f17d-aca5-4056-90a0-f6351983e4bf-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8911f17d-aca5-4056-90a0-f6351983e4bf" (UID: "8911f17d-aca5-4056-90a0-f6351983e4bf"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 11:12:51 crc kubenswrapper[4901]: I0202 11:12:51.678290 4901 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8911f17d-aca5-4056-90a0-f6351983e4bf-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 11:12:51 crc kubenswrapper[4901]: I0202 11:12:51.678353 4901 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8911f17d-aca5-4056-90a0-f6351983e4bf-inventory\") on node \"crc\" DevicePath \"\""
Feb 02 11:12:51 crc kubenswrapper[4901]: I0202 11:12:51.678396 4901 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8911f17d-aca5-4056-90a0-f6351983e4bf-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 02 11:12:51 crc kubenswrapper[4901]: I0202 11:12:51.678527 4901 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8911f17d-aca5-4056-90a0-f6351983e4bf-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\""
Feb 02 11:12:51 crc kubenswrapper[4901]: I0202 11:12:51.678650 4901 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8911f17d-aca5-4056-90a0-f6351983e4bf-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\""
Feb 02 11:12:51 crc kubenswrapper[4901]: I0202 11:12:51.678787 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlrsq\" (UniqueName: \"kubernetes.io/projected/8911f17d-aca5-4056-90a0-f6351983e4bf-kube-api-access-nlrsq\") on node \"crc\" DevicePath \"\""
Feb 02 11:12:51 crc kubenswrapper[4901]: I0202 11:12:51.878517 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6" event={"ID":"8911f17d-aca5-4056-90a0-f6351983e4bf","Type":"ContainerDied","Data":"5d4f872cd902b8cb3d9617b8d4fc19dfd32290a76a5b85f9eb082b3aa5851a97"}
Feb 02 11:12:51 crc kubenswrapper[4901]: I0202 11:12:51.879078 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d4f872cd902b8cb3d9617b8d4fc19dfd32290a76a5b85f9eb082b3aa5851a97"
Feb 02 11:12:51 crc kubenswrapper[4901]: I0202 11:12:51.878693 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6"
Feb 02 11:12:52 crc kubenswrapper[4901]: I0202 11:12:52.049410 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ksr7"]
Feb 02 11:12:52 crc kubenswrapper[4901]: E0202 11:12:52.050142 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8911f17d-aca5-4056-90a0-f6351983e4bf" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Feb 02 11:12:52 crc kubenswrapper[4901]: I0202 11:12:52.050169 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="8911f17d-aca5-4056-90a0-f6351983e4bf" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Feb 02 11:12:52 crc kubenswrapper[4901]: I0202 11:12:52.050413 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="8911f17d-aca5-4056-90a0-f6351983e4bf" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Feb 02 11:12:52 crc kubenswrapper[4901]: I0202 11:12:52.051310 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ksr7"
Feb 02 11:12:52 crc kubenswrapper[4901]: I0202 11:12:52.054627 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 02 11:12:52 crc kubenswrapper[4901]: I0202 11:12:52.054940 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kblkj"
Feb 02 11:12:52 crc kubenswrapper[4901]: I0202 11:12:52.055141 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 02 11:12:52 crc kubenswrapper[4901]: I0202 11:12:52.055311 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret"
Feb 02 11:12:52 crc kubenswrapper[4901]: I0202 11:12:52.055851 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 02 11:12:52 crc kubenswrapper[4901]: I0202 11:12:52.062707 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ksr7"]
Feb 02 11:12:52 crc kubenswrapper[4901]: I0202 11:12:52.192123 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/688ef63a-dd78-45da-84ba-f5bc28c6ae81-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5ksr7\" (UID: \"688ef63a-dd78-45da-84ba-f5bc28c6ae81\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ksr7"
Feb 02 11:12:52 crc kubenswrapper[4901]: I0202 11:12:52.192229 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jtb4\" (UniqueName: \"kubernetes.io/projected/688ef63a-dd78-45da-84ba-f5bc28c6ae81-kube-api-access-9jtb4\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5ksr7\" (UID: \"688ef63a-dd78-45da-84ba-f5bc28c6ae81\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ksr7"
Feb 02 11:12:52 crc kubenswrapper[4901]: I0202 11:12:52.192265 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688ef63a-dd78-45da-84ba-f5bc28c6ae81-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5ksr7\" (UID: \"688ef63a-dd78-45da-84ba-f5bc28c6ae81\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ksr7"
Feb 02 11:12:52 crc kubenswrapper[4901]: I0202 11:12:52.192312 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/688ef63a-dd78-45da-84ba-f5bc28c6ae81-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5ksr7\" (UID: \"688ef63a-dd78-45da-84ba-f5bc28c6ae81\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ksr7"
Feb 02 11:12:52 crc kubenswrapper[4901]: I0202 11:12:52.193228 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/688ef63a-dd78-45da-84ba-f5bc28c6ae81-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5ksr7\" (UID: \"688ef63a-dd78-45da-84ba-f5bc28c6ae81\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ksr7"
Feb 02 11:12:52 crc kubenswrapper[4901]: I0202 11:12:52.295731 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/688ef63a-dd78-45da-84ba-f5bc28c6ae81-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5ksr7\" (UID: \"688ef63a-dd78-45da-84ba-f5bc28c6ae81\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ksr7"
Feb 02 11:12:52 crc kubenswrapper[4901]: I0202 11:12:52.296928 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jtb4\" (UniqueName: \"kubernetes.io/projected/688ef63a-dd78-45da-84ba-f5bc28c6ae81-kube-api-access-9jtb4\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5ksr7\" (UID: \"688ef63a-dd78-45da-84ba-f5bc28c6ae81\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ksr7"
Feb 02 11:12:52 crc kubenswrapper[4901]: I0202 11:12:52.297074 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688ef63a-dd78-45da-84ba-f5bc28c6ae81-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5ksr7\" (UID: \"688ef63a-dd78-45da-84ba-f5bc28c6ae81\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ksr7"
Feb 02 11:12:52 crc kubenswrapper[4901]: I0202 11:12:52.297818 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/688ef63a-dd78-45da-84ba-f5bc28c6ae81-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5ksr7\" (UID: \"688ef63a-dd78-45da-84ba-f5bc28c6ae81\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ksr7"
Feb 02 11:12:52 crc kubenswrapper[4901]: I0202 11:12:52.297949 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/688ef63a-dd78-45da-84ba-f5bc28c6ae81-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5ksr7\" (UID: \"688ef63a-dd78-45da-84ba-f5bc28c6ae81\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ksr7"
Feb 02 11:12:52 crc kubenswrapper[4901]: I0202 11:12:52.302672 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688ef63a-dd78-45da-84ba-f5bc28c6ae81-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5ksr7\" (UID: \"688ef63a-dd78-45da-84ba-f5bc28c6ae81\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ksr7"
Feb 02 11:12:52 crc kubenswrapper[4901]: I0202 11:12:52.302770 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/688ef63a-dd78-45da-84ba-f5bc28c6ae81-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5ksr7\" (UID: \"688ef63a-dd78-45da-84ba-f5bc28c6ae81\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ksr7"
Feb 02 11:12:52 crc kubenswrapper[4901]: I0202 11:12:52.306646 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/688ef63a-dd78-45da-84ba-f5bc28c6ae81-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5ksr7\" (UID: \"688ef63a-dd78-45da-84ba-f5bc28c6ae81\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ksr7"
Feb 02 11:12:52 crc kubenswrapper[4901]: I0202 11:12:52.307425 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/688ef63a-dd78-45da-84ba-f5bc28c6ae81-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5ksr7\" (UID: \"688ef63a-dd78-45da-84ba-f5bc28c6ae81\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ksr7"
Feb 02 11:12:52 crc kubenswrapper[4901]: I0202 11:12:52.320310 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jtb4\" (UniqueName: \"kubernetes.io/projected/688ef63a-dd78-45da-84ba-f5bc28c6ae81-kube-api-access-9jtb4\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5ksr7\" (UID: \"688ef63a-dd78-45da-84ba-f5bc28c6ae81\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ksr7"
Feb 02 11:12:52 crc kubenswrapper[4901]: I0202 11:12:52.379906 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ksr7"
Feb 02 11:12:52 crc kubenswrapper[4901]: I0202 11:12:52.972818 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ksr7"]
Feb 02 11:12:53 crc kubenswrapper[4901]: I0202 11:12:53.902859 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ksr7" event={"ID":"688ef63a-dd78-45da-84ba-f5bc28c6ae81","Type":"ContainerStarted","Data":"e358dbf22fe68ed83917a35f3c77779aaac829c31772bc2d02bc42e25bf0361b"}
Feb 02 11:12:53 crc kubenswrapper[4901]: I0202 11:12:53.903981 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ksr7" event={"ID":"688ef63a-dd78-45da-84ba-f5bc28c6ae81","Type":"ContainerStarted","Data":"9212c48483d6c6e3027dd3b458a9c69dbc8d7a65ac24565dca19cb40f8a38fe0"}
Feb 02 11:12:53 crc kubenswrapper[4901]: I0202 11:12:53.934333 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ksr7" podStartSLOduration=1.4904591489999999 podStartE2EDuration="1.934298616s" podCreationTimestamp="2026-02-02 11:12:52 +0000 UTC" firstStartedPulling="2026-02-02 11:12:52.984175367 +0000 UTC m=+2060.002515463" lastFinishedPulling="2026-02-02 11:12:53.428014834 +0000 UTC m=+2060.446354930" observedRunningTime="2026-02-02 11:12:53.922858539 +0000 UTC m=+2060.941198655" watchObservedRunningTime="2026-02-02 11:12:53.934298616 +0000 UTC m=+2060.952638712"
Feb 02 11:14:07 crc kubenswrapper[4901]: I0202 11:14:07.837493 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 02 11:14:07 crc kubenswrapper[4901]: I0202 11:14:07.838392 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 02 11:14:37 crc kubenswrapper[4901]: I0202 11:14:37.837457 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 02 11:14:37 crc kubenswrapper[4901]: I0202 11:14:37.838321 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 02 11:14:40 crc kubenswrapper[4901]: I0202 11:14:40.631273 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rqpsh"]
Feb 02 11:14:40 crc kubenswrapper[4901]: I0202 11:14:40.636997 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rqpsh"
Feb 02 11:14:40 crc kubenswrapper[4901]: I0202 11:14:40.654161 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rqpsh"]
Feb 02 11:14:40 crc kubenswrapper[4901]: I0202 11:14:40.750164 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afd76dbe-3c78-4710-9e8b-ee01a23d7929-catalog-content\") pod \"certified-operators-rqpsh\" (UID: \"afd76dbe-3c78-4710-9e8b-ee01a23d7929\") " pod="openshift-marketplace/certified-operators-rqpsh"
Feb 02 11:14:40 crc kubenswrapper[4901]: I0202 11:14:40.750244 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afd76dbe-3c78-4710-9e8b-ee01a23d7929-utilities\") pod \"certified-operators-rqpsh\" (UID: \"afd76dbe-3c78-4710-9e8b-ee01a23d7929\") " pod="openshift-marketplace/certified-operators-rqpsh"
Feb 02 11:14:40 crc kubenswrapper[4901]: I0202 11:14:40.751206 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrmf2\" (UniqueName: \"kubernetes.io/projected/afd76dbe-3c78-4710-9e8b-ee01a23d7929-kube-api-access-mrmf2\") pod \"certified-operators-rqpsh\" (UID: \"afd76dbe-3c78-4710-9e8b-ee01a23d7929\") " pod="openshift-marketplace/certified-operators-rqpsh"
Feb 02 11:14:40 crc kubenswrapper[4901]: I0202 11:14:40.853188 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrmf2\" (UniqueName: \"kubernetes.io/projected/afd76dbe-3c78-4710-9e8b-ee01a23d7929-kube-api-access-mrmf2\") pod \"certified-operators-rqpsh\" (UID: \"afd76dbe-3c78-4710-9e8b-ee01a23d7929\") " pod="openshift-marketplace/certified-operators-rqpsh"
Feb 02 11:14:40 crc kubenswrapper[4901]: I0202 11:14:40.853307 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afd76dbe-3c78-4710-9e8b-ee01a23d7929-catalog-content\") pod \"certified-operators-rqpsh\" (UID: \"afd76dbe-3c78-4710-9e8b-ee01a23d7929\") " pod="openshift-marketplace/certified-operators-rqpsh"
Feb 02 11:14:40 crc kubenswrapper[4901]: I0202 11:14:40.853329 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afd76dbe-3c78-4710-9e8b-ee01a23d7929-utilities\") pod \"certified-operators-rqpsh\" (UID: \"afd76dbe-3c78-4710-9e8b-ee01a23d7929\") " pod="openshift-marketplace/certified-operators-rqpsh"
Feb 02 11:14:40 crc kubenswrapper[4901]: I0202 11:14:40.853977 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afd76dbe-3c78-4710-9e8b-ee01a23d7929-catalog-content\") pod \"certified-operators-rqpsh\" (UID: \"afd76dbe-3c78-4710-9e8b-ee01a23d7929\") " pod="openshift-marketplace/certified-operators-rqpsh"
Feb 02 11:14:40 crc kubenswrapper[4901]: I0202 11:14:40.854076 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afd76dbe-3c78-4710-9e8b-ee01a23d7929-utilities\") pod \"certified-operators-rqpsh\" (UID: \"afd76dbe-3c78-4710-9e8b-ee01a23d7929\") " pod="openshift-marketplace/certified-operators-rqpsh"
Feb 02 11:14:40 crc kubenswrapper[4901]: I0202 11:14:40.877586 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrmf2\" (UniqueName: \"kubernetes.io/projected/afd76dbe-3c78-4710-9e8b-ee01a23d7929-kube-api-access-mrmf2\") pod \"certified-operators-rqpsh\" (UID: \"afd76dbe-3c78-4710-9e8b-ee01a23d7929\") " pod="openshift-marketplace/certified-operators-rqpsh"
Feb 02 11:14:40 crc kubenswrapper[4901]: I0202 11:14:40.966784 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rqpsh"
Feb 02 11:14:41 crc kubenswrapper[4901]: I0202 11:14:41.526510 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rqpsh"]
Feb 02 11:14:42 crc kubenswrapper[4901]: I0202 11:14:42.184886 4901 generic.go:334] "Generic (PLEG): container finished" podID="afd76dbe-3c78-4710-9e8b-ee01a23d7929" containerID="e990d9e28e3ab2b08a5a250179bd262787b78c6d2af9b789d8166dfec58048dd" exitCode=0
Feb 02 11:14:42 crc kubenswrapper[4901]: I0202 11:14:42.185047 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqpsh" event={"ID":"afd76dbe-3c78-4710-9e8b-ee01a23d7929","Type":"ContainerDied","Data":"e990d9e28e3ab2b08a5a250179bd262787b78c6d2af9b789d8166dfec58048dd"}
Feb 02 11:14:42 crc kubenswrapper[4901]: I0202 11:14:42.186002 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqpsh" event={"ID":"afd76dbe-3c78-4710-9e8b-ee01a23d7929","Type":"ContainerStarted","Data":"b2e17f66ff22c2db3717a515289ffcc347dc593b82a0a2dd9d124ac2a603811e"}
Feb 02 11:14:42 crc kubenswrapper[4901]: I0202 11:14:42.191423 4901 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 02 11:14:43 crc kubenswrapper[4901]: I0202 11:14:43.203432 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqpsh" event={"ID":"afd76dbe-3c78-4710-9e8b-ee01a23d7929","Type":"ContainerStarted","Data":"3bf2b5b96e9b0c0a637bcbb27145600814cbbe0dbe65038ddba4a1caf3b40c2e"}
Feb 02 11:14:43 crc kubenswrapper[4901]: I0202 11:14:43.986494 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2bg7g"]
Feb 02 11:14:43 crc kubenswrapper[4901]: I0202 11:14:43.988989 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2bg7g"
Feb 02 11:14:44 crc kubenswrapper[4901]: I0202 11:14:44.019425 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2bg7g"]
Feb 02 11:14:44 crc kubenswrapper[4901]: I0202 11:14:44.032932 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b9b9199-6de2-454d-90b2-bc75b443ac48-utilities\") pod \"community-operators-2bg7g\" (UID: \"0b9b9199-6de2-454d-90b2-bc75b443ac48\") " pod="openshift-marketplace/community-operators-2bg7g"
Feb 02 11:14:44 crc kubenswrapper[4901]: I0202 11:14:44.033074 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4sqz\" (UniqueName: \"kubernetes.io/projected/0b9b9199-6de2-454d-90b2-bc75b443ac48-kube-api-access-z4sqz\") pod \"community-operators-2bg7g\" (UID: \"0b9b9199-6de2-454d-90b2-bc75b443ac48\") " pod="openshift-marketplace/community-operators-2bg7g"
Feb 02 11:14:44 crc kubenswrapper[4901]: I0202 11:14:44.033162 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b9b9199-6de2-454d-90b2-bc75b443ac48-catalog-content\") pod \"community-operators-2bg7g\" (UID: \"0b9b9199-6de2-454d-90b2-bc75b443ac48\") " pod="openshift-marketplace/community-operators-2bg7g"
Feb 02 11:14:44 crc kubenswrapper[4901]: I0202 11:14:44.135228 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b9b9199-6de2-454d-90b2-bc75b443ac48-utilities\") pod \"community-operators-2bg7g\" (UID: \"0b9b9199-6de2-454d-90b2-bc75b443ac48\") " pod="openshift-marketplace/community-operators-2bg7g"
Feb 02 11:14:44 crc kubenswrapper[4901]: I0202 11:14:44.135381 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4sqz\" (UniqueName: \"kubernetes.io/projected/0b9b9199-6de2-454d-90b2-bc75b443ac48-kube-api-access-z4sqz\") pod \"community-operators-2bg7g\" (UID: \"0b9b9199-6de2-454d-90b2-bc75b443ac48\") " pod="openshift-marketplace/community-operators-2bg7g"
Feb 02 11:14:44 crc kubenswrapper[4901]: I0202 11:14:44.135483 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b9b9199-6de2-454d-90b2-bc75b443ac48-catalog-content\") pod \"community-operators-2bg7g\" (UID: \"0b9b9199-6de2-454d-90b2-bc75b443ac48\") " pod="openshift-marketplace/community-operators-2bg7g"
Feb 02 11:14:44 crc kubenswrapper[4901]: I0202 11:14:44.136486 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b9b9199-6de2-454d-90b2-bc75b443ac48-catalog-content\") pod \"community-operators-2bg7g\" (UID: \"0b9b9199-6de2-454d-90b2-bc75b443ac48\") " pod="openshift-marketplace/community-operators-2bg7g"
Feb 02 11:14:44 crc kubenswrapper[4901]: I0202 11:14:44.136819 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b9b9199-6de2-454d-90b2-bc75b443ac48-utilities\") pod \"community-operators-2bg7g\" (UID: \"0b9b9199-6de2-454d-90b2-bc75b443ac48\") " pod="openshift-marketplace/community-operators-2bg7g"
Feb 02 11:14:44 crc kubenswrapper[4901]: I0202 11:14:44.159415 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4sqz\" (UniqueName: \"kubernetes.io/projected/0b9b9199-6de2-454d-90b2-bc75b443ac48-kube-api-access-z4sqz\") pod \"community-operators-2bg7g\" (UID: \"0b9b9199-6de2-454d-90b2-bc75b443ac48\") " pod="openshift-marketplace/community-operators-2bg7g"
Feb 02 11:14:44 crc kubenswrapper[4901]: I0202 11:14:44.221465 4901 generic.go:334] "Generic (PLEG): container finished" podID="afd76dbe-3c78-4710-9e8b-ee01a23d7929" containerID="3bf2b5b96e9b0c0a637bcbb27145600814cbbe0dbe65038ddba4a1caf3b40c2e" exitCode=0
Feb 02 11:14:44 crc kubenswrapper[4901]: I0202 11:14:44.221522 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqpsh" event={"ID":"afd76dbe-3c78-4710-9e8b-ee01a23d7929","Type":"ContainerDied","Data":"3bf2b5b96e9b0c0a637bcbb27145600814cbbe0dbe65038ddba4a1caf3b40c2e"}
Feb 02 11:14:44 crc kubenswrapper[4901]: I0202 11:14:44.327581 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2bg7g"
Feb 02 11:14:44 crc kubenswrapper[4901]: I0202 11:14:44.920389 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2bg7g"]
Feb 02 11:14:44 crc kubenswrapper[4901]: W0202 11:14:44.933212 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b9b9199_6de2_454d_90b2_bc75b443ac48.slice/crio-8e1a0b12e763a75942c9a17c3766f47b07e2558fafbccadc5f007660302b702c WatchSource:0}: Error finding container 8e1a0b12e763a75942c9a17c3766f47b07e2558fafbccadc5f007660302b702c: Status 404 returned error can't find the container with id 8e1a0b12e763a75942c9a17c3766f47b07e2558fafbccadc5f007660302b702c
Feb 02 11:14:45 crc kubenswrapper[4901]: I0202 11:14:45.254639 4901 generic.go:334] "Generic (PLEG): container finished" podID="0b9b9199-6de2-454d-90b2-bc75b443ac48" containerID="8196aa1a2031541a0329661d41162ae1d5de2942e521ae516345d4744f56c88d" exitCode=0
Feb 02 11:14:45 crc kubenswrapper[4901]: I0202 11:14:45.255531 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2bg7g" event={"ID":"0b9b9199-6de2-454d-90b2-bc75b443ac48","Type":"ContainerDied","Data":"8196aa1a2031541a0329661d41162ae1d5de2942e521ae516345d4744f56c88d"}
Feb 02 11:14:45 crc kubenswrapper[4901]: I0202 11:14:45.255595 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2bg7g" event={"ID":"0b9b9199-6de2-454d-90b2-bc75b443ac48","Type":"ContainerStarted","Data":"8e1a0b12e763a75942c9a17c3766f47b07e2558fafbccadc5f007660302b702c"}
Feb 02 11:14:45 crc kubenswrapper[4901]: I0202 11:14:45.262251 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqpsh" event={"ID":"afd76dbe-3c78-4710-9e8b-ee01a23d7929","Type":"ContainerStarted","Data":"bdf279f462008dc6f39b26fe74f4acb5550073692ca7571f0eacc3a20df3baa6"}
Feb 02 11:14:45 crc kubenswrapper[4901]: I0202 11:14:45.310180 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rqpsh" podStartSLOduration=2.854778124 podStartE2EDuration="5.310151084s" podCreationTimestamp="2026-02-02 11:14:40 +0000 UTC" firstStartedPulling="2026-02-02 11:14:42.19111722 +0000 UTC m=+2169.209457326" lastFinishedPulling="2026-02-02 11:14:44.64649019 +0000 UTC m=+2171.664830286" observedRunningTime="2026-02-02 11:14:45.299738451 +0000 UTC m=+2172.318078557" watchObservedRunningTime="2026-02-02 11:14:45.310151084 +0000 UTC m=+2172.328491190"
Feb 02 11:14:46 crc kubenswrapper[4901]: I0202 11:14:46.274882 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2bg7g" event={"ID":"0b9b9199-6de2-454d-90b2-bc75b443ac48","Type":"ContainerStarted","Data":"a86e04838326a0c5e3d00980347840a974dbca48b2188ab2ea4580544b132e40"}
Feb 02 11:14:48 crc kubenswrapper[4901]: I0202 11:14:48.305196 4901 generic.go:334] "Generic (PLEG): container finished" podID="0b9b9199-6de2-454d-90b2-bc75b443ac48" containerID="a86e04838326a0c5e3d00980347840a974dbca48b2188ab2ea4580544b132e40" exitCode=0
Feb 02 11:14:48 crc kubenswrapper[4901]: I0202 11:14:48.305365 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2bg7g" event={"ID":"0b9b9199-6de2-454d-90b2-bc75b443ac48","Type":"ContainerDied","Data":"a86e04838326a0c5e3d00980347840a974dbca48b2188ab2ea4580544b132e40"}
Feb 02 11:14:49 crc kubenswrapper[4901]: I0202 11:14:49.322858 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2bg7g" event={"ID":"0b9b9199-6de2-454d-90b2-bc75b443ac48","Type":"ContainerStarted","Data":"d0a0316ab4ca789087a1591cf00b10d98a90a67c93630e331a2f57dea94c03d0"}
Feb 02 11:14:49 crc kubenswrapper[4901]: I0202 11:14:49.352450 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2bg7g" podStartSLOduration=2.768168625 podStartE2EDuration="6.352427295s" podCreationTimestamp="2026-02-02 11:14:43 +0000 UTC" firstStartedPulling="2026-02-02 11:14:45.261490472 +0000 UTC m=+2172.279830578" lastFinishedPulling="2026-02-02 11:14:48.845749142 +0000 UTC m=+2175.864089248" observedRunningTime="2026-02-02 11:14:49.349608517 +0000 UTC m=+2176.367948653" watchObservedRunningTime="2026-02-02 11:14:49.352427295 +0000 UTC m=+2176.370767391"
Feb 02 11:14:50 crc kubenswrapper[4901]: I0202 11:14:50.967812 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rqpsh"
Feb 02 11:14:50 crc kubenswrapper[4901]: I0202 11:14:50.968375 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rqpsh"
Feb 02 11:14:51 crc kubenswrapper[4901]: I0202 11:14:51.032391 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rqpsh"
Feb 02 11:14:51 crc kubenswrapper[4901]: I0202 11:14:51.400098 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rqpsh"
Feb 02 11:14:52 crc kubenswrapper[4901]: I0202 11:14:52.574438 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rqpsh"]
Feb 02 11:14:53 crc kubenswrapper[4901]: I0202 11:14:53.367377 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rqpsh" podUID="afd76dbe-3c78-4710-9e8b-ee01a23d7929" containerName="registry-server" containerID="cri-o://bdf279f462008dc6f39b26fe74f4acb5550073692ca7571f0eacc3a20df3baa6" gracePeriod=2
Feb 02 11:14:53 crc kubenswrapper[4901]: I0202 11:14:53.977162 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rqpsh"
Feb 02 11:14:53 crc kubenswrapper[4901]: I0202 11:14:53.984427 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afd76dbe-3c78-4710-9e8b-ee01a23d7929-catalog-content\") pod \"afd76dbe-3c78-4710-9e8b-ee01a23d7929\" (UID: \"afd76dbe-3c78-4710-9e8b-ee01a23d7929\") "
Feb 02 11:14:53 crc kubenswrapper[4901]: I0202 11:14:53.984619 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afd76dbe-3c78-4710-9e8b-ee01a23d7929-utilities\") pod \"afd76dbe-3c78-4710-9e8b-ee01a23d7929\" (UID: \"afd76dbe-3c78-4710-9e8b-ee01a23d7929\") "
Feb 02 11:14:53 crc kubenswrapper[4901]: I0202 11:14:53.984726 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrmf2\" (UniqueName: \"kubernetes.io/projected/afd76dbe-3c78-4710-9e8b-ee01a23d7929-kube-api-access-mrmf2\") pod \"afd76dbe-3c78-4710-9e8b-ee01a23d7929\" (UID: \"afd76dbe-3c78-4710-9e8b-ee01a23d7929\") "
Feb 02 11:14:53 crc kubenswrapper[4901]: I0202 11:14:53.985607 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afd76dbe-3c78-4710-9e8b-ee01a23d7929-utilities" (OuterVolumeSpecName: "utilities") pod "afd76dbe-3c78-4710-9e8b-ee01a23d7929" (UID: "afd76dbe-3c78-4710-9e8b-ee01a23d7929"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 11:14:53 crc kubenswrapper[4901]: I0202 11:14:53.994151 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afd76dbe-3c78-4710-9e8b-ee01a23d7929-kube-api-access-mrmf2" (OuterVolumeSpecName: "kube-api-access-mrmf2") pod "afd76dbe-3c78-4710-9e8b-ee01a23d7929" (UID: "afd76dbe-3c78-4710-9e8b-ee01a23d7929"). InnerVolumeSpecName "kube-api-access-mrmf2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 11:14:54 crc kubenswrapper[4901]: I0202 11:14:54.044618 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afd76dbe-3c78-4710-9e8b-ee01a23d7929-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "afd76dbe-3c78-4710-9e8b-ee01a23d7929" (UID: "afd76dbe-3c78-4710-9e8b-ee01a23d7929"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 11:14:54 crc kubenswrapper[4901]: I0202 11:14:54.086261 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afd76dbe-3c78-4710-9e8b-ee01a23d7929-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 02 11:14:54 crc kubenswrapper[4901]: I0202 11:14:54.086318 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afd76dbe-3c78-4710-9e8b-ee01a23d7929-utilities\") on node \"crc\" DevicePath \"\""
Feb 02 11:14:54 crc kubenswrapper[4901]: I0202 11:14:54.086333 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrmf2\" (UniqueName: \"kubernetes.io/projected/afd76dbe-3c78-4710-9e8b-ee01a23d7929-kube-api-access-mrmf2\") on node \"crc\" DevicePath \"\""
Feb 02 11:14:54 crc kubenswrapper[4901]: I0202 11:14:54.328281 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2bg7g"
Feb 02 11:14:54 crc kubenswrapper[4901]: I0202 11:14:54.328759 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2bg7g"
Feb 02 11:14:54 crc kubenswrapper[4901]: I0202 11:14:54.383177 4901 generic.go:334] "Generic (PLEG): container finished" podID="afd76dbe-3c78-4710-9e8b-ee01a23d7929" containerID="bdf279f462008dc6f39b26fe74f4acb5550073692ca7571f0eacc3a20df3baa6" exitCode=0
Feb 02 11:14:54 crc kubenswrapper[4901]: I0202 11:14:54.383294 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rqpsh"
Feb 02 11:14:54 crc kubenswrapper[4901]: I0202 11:14:54.383434 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqpsh" event={"ID":"afd76dbe-3c78-4710-9e8b-ee01a23d7929","Type":"ContainerDied","Data":"bdf279f462008dc6f39b26fe74f4acb5550073692ca7571f0eacc3a20df3baa6"}
Feb 02 11:14:54 crc kubenswrapper[4901]: I0202 11:14:54.383517 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqpsh" event={"ID":"afd76dbe-3c78-4710-9e8b-ee01a23d7929","Type":"ContainerDied","Data":"b2e17f66ff22c2db3717a515289ffcc347dc593b82a0a2dd9d124ac2a603811e"}
Feb 02 11:14:54 crc kubenswrapper[4901]: I0202 11:14:54.383597 4901 scope.go:117] "RemoveContainer" containerID="bdf279f462008dc6f39b26fe74f4acb5550073692ca7571f0eacc3a20df3baa6"
Feb 02 11:14:54 crc kubenswrapper[4901]: I0202 11:14:54.391044 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2bg7g"
Feb 02 11:14:54 crc kubenswrapper[4901]: I0202 11:14:54.414182 4901 scope.go:117] "RemoveContainer" containerID="3bf2b5b96e9b0c0a637bcbb27145600814cbbe0dbe65038ddba4a1caf3b40c2e"
Feb 02 11:14:54 crc kubenswrapper[4901]: I0202 11:14:54.459325 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rqpsh"]
Feb 02 11:14:54 crc kubenswrapper[4901]: I0202 11:14:54.459484 4901 scope.go:117] "RemoveContainer" containerID="e990d9e28e3ab2b08a5a250179bd262787b78c6d2af9b789d8166dfec58048dd"
Feb 02 11:14:54 crc kubenswrapper[4901]: I0202 11:14:54.469984 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rqpsh"]
Feb 02 11:14:54 crc kubenswrapper[4901]: I0202 11:14:54.490885 4901 scope.go:117] "RemoveContainer" containerID="bdf279f462008dc6f39b26fe74f4acb5550073692ca7571f0eacc3a20df3baa6"
Feb 02 11:14:54 crc kubenswrapper[4901]: E0202 11:14:54.491483 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdf279f462008dc6f39b26fe74f4acb5550073692ca7571f0eacc3a20df3baa6\": container with ID starting with bdf279f462008dc6f39b26fe74f4acb5550073692ca7571f0eacc3a20df3baa6 not found: ID does not exist" containerID="bdf279f462008dc6f39b26fe74f4acb5550073692ca7571f0eacc3a20df3baa6"
Feb 02 11:14:54 crc kubenswrapper[4901]: I0202 11:14:54.491522 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdf279f462008dc6f39b26fe74f4acb5550073692ca7571f0eacc3a20df3baa6"} err="failed to get container status \"bdf279f462008dc6f39b26fe74f4acb5550073692ca7571f0eacc3a20df3baa6\": rpc error: code = NotFound desc = could not find container \"bdf279f462008dc6f39b26fe74f4acb5550073692ca7571f0eacc3a20df3baa6\": container with ID starting with bdf279f462008dc6f39b26fe74f4acb5550073692ca7571f0eacc3a20df3baa6 not found: ID does not exist"
Feb 02 11:14:54 crc kubenswrapper[4901]: I0202 11:14:54.491583 4901 scope.go:117] "RemoveContainer" containerID="3bf2b5b96e9b0c0a637bcbb27145600814cbbe0dbe65038ddba4a1caf3b40c2e"
Feb 02 11:14:54 crc kubenswrapper[4901]: E0202 11:14:54.492152 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bf2b5b96e9b0c0a637bcbb27145600814cbbe0dbe65038ddba4a1caf3b40c2e\": container with ID starting with 3bf2b5b96e9b0c0a637bcbb27145600814cbbe0dbe65038ddba4a1caf3b40c2e not found: ID does not exist" containerID="3bf2b5b96e9b0c0a637bcbb27145600814cbbe0dbe65038ddba4a1caf3b40c2e"
Feb 02 11:14:54 crc kubenswrapper[4901]: I0202 11:14:54.492227 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bf2b5b96e9b0c0a637bcbb27145600814cbbe0dbe65038ddba4a1caf3b40c2e"} err="failed to get container status \"3bf2b5b96e9b0c0a637bcbb27145600814cbbe0dbe65038ddba4a1caf3b40c2e\": rpc error: code = NotFound desc = could not find container \"3bf2b5b96e9b0c0a637bcbb27145600814cbbe0dbe65038ddba4a1caf3b40c2e\": container with ID starting with 3bf2b5b96e9b0c0a637bcbb27145600814cbbe0dbe65038ddba4a1caf3b40c2e not found: ID does not exist"
Feb 02 11:14:54 crc kubenswrapper[4901]: I0202 11:14:54.492259 4901 scope.go:117] "RemoveContainer" containerID="e990d9e28e3ab2b08a5a250179bd262787b78c6d2af9b789d8166dfec58048dd"
Feb 02 11:14:54 crc kubenswrapper[4901]: E0202 11:14:54.492798 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e990d9e28e3ab2b08a5a250179bd262787b78c6d2af9b789d8166dfec58048dd\": container with ID starting with e990d9e28e3ab2b08a5a250179bd262787b78c6d2af9b789d8166dfec58048dd not found: ID does not exist" containerID="e990d9e28e3ab2b08a5a250179bd262787b78c6d2af9b789d8166dfec58048dd"
Feb 02 11:14:54 crc kubenswrapper[4901]: I0202 11:14:54.492858 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e990d9e28e3ab2b08a5a250179bd262787b78c6d2af9b789d8166dfec58048dd"} err="failed to get container status \"e990d9e28e3ab2b08a5a250179bd262787b78c6d2af9b789d8166dfec58048dd\": rpc error: code = NotFound desc = could not find container \"e990d9e28e3ab2b08a5a250179bd262787b78c6d2af9b789d8166dfec58048dd\": container with ID starting with e990d9e28e3ab2b08a5a250179bd262787b78c6d2af9b789d8166dfec58048dd not found: ID does not exist"
Feb 02 11:14:55 crc kubenswrapper[4901]: I0202 11:14:55.471032 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2bg7g"
Feb 02 11:14:55 crc kubenswrapper[4901]: I0202 11:14:55.704967 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afd76dbe-3c78-4710-9e8b-ee01a23d7929" path="/var/lib/kubelet/pods/afd76dbe-3c78-4710-9e8b-ee01a23d7929/volumes"
Feb 02 11:14:56 crc kubenswrapper[4901]: I0202 11:14:56.793074 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2bg7g"]
Feb 02 11:14:57 crc kubenswrapper[4901]: I0202 11:14:57.422412 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2bg7g" podUID="0b9b9199-6de2-454d-90b2-bc75b443ac48" containerName="registry-server" containerID="cri-o://d0a0316ab4ca789087a1591cf00b10d98a90a67c93630e331a2f57dea94c03d0" gracePeriod=2
Feb 02 11:14:57 crc kubenswrapper[4901]: I0202 11:14:57.959498 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2bg7g"
Feb 02 11:14:58 crc kubenswrapper[4901]: I0202 11:14:58.078253 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b9b9199-6de2-454d-90b2-bc75b443ac48-utilities\") pod \"0b9b9199-6de2-454d-90b2-bc75b443ac48\" (UID: \"0b9b9199-6de2-454d-90b2-bc75b443ac48\") "
Feb 02 11:14:58 crc kubenswrapper[4901]: I0202 11:14:58.078425 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b9b9199-6de2-454d-90b2-bc75b443ac48-catalog-content\") pod \"0b9b9199-6de2-454d-90b2-bc75b443ac48\" (UID: \"0b9b9199-6de2-454d-90b2-bc75b443ac48\") "
Feb 02 11:14:58 crc kubenswrapper[4901]: I0202 11:14:58.078505 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4sqz\" (UniqueName: \"kubernetes.io/projected/0b9b9199-6de2-454d-90b2-bc75b443ac48-kube-api-access-z4sqz\") pod \"0b9b9199-6de2-454d-90b2-bc75b443ac48\" (UID: \"0b9b9199-6de2-454d-90b2-bc75b443ac48\") "
Feb 02 11:14:58 crc kubenswrapper[4901]: I0202 11:14:58.080400 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b9b9199-6de2-454d-90b2-bc75b443ac48-utilities" (OuterVolumeSpecName: "utilities") pod "0b9b9199-6de2-454d-90b2-bc75b443ac48" (UID: "0b9b9199-6de2-454d-90b2-bc75b443ac48"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 11:14:58 crc kubenswrapper[4901]: I0202 11:14:58.088459 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b9b9199-6de2-454d-90b2-bc75b443ac48-kube-api-access-z4sqz" (OuterVolumeSpecName: "kube-api-access-z4sqz") pod "0b9b9199-6de2-454d-90b2-bc75b443ac48" (UID: "0b9b9199-6de2-454d-90b2-bc75b443ac48"). InnerVolumeSpecName "kube-api-access-z4sqz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 11:14:58 crc kubenswrapper[4901]: I0202 11:14:58.131735 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b9b9199-6de2-454d-90b2-bc75b443ac48-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0b9b9199-6de2-454d-90b2-bc75b443ac48" (UID: "0b9b9199-6de2-454d-90b2-bc75b443ac48"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 11:14:58 crc kubenswrapper[4901]: I0202 11:14:58.181376 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b9b9199-6de2-454d-90b2-bc75b443ac48-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 02 11:14:58 crc kubenswrapper[4901]: I0202 11:14:58.181412 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4sqz\" (UniqueName: \"kubernetes.io/projected/0b9b9199-6de2-454d-90b2-bc75b443ac48-kube-api-access-z4sqz\") on node \"crc\" DevicePath \"\""
Feb 02 11:14:58 crc kubenswrapper[4901]: I0202 11:14:58.181424 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b9b9199-6de2-454d-90b2-bc75b443ac48-utilities\") on node \"crc\" DevicePath \"\""
Feb 02 11:14:58 crc kubenswrapper[4901]: I0202 11:14:58.437392 4901 generic.go:334] "Generic (PLEG): container finished" podID="0b9b9199-6de2-454d-90b2-bc75b443ac48" containerID="d0a0316ab4ca789087a1591cf00b10d98a90a67c93630e331a2f57dea94c03d0" exitCode=0
Feb 02 11:14:58 crc kubenswrapper[4901]: I0202 11:14:58.437496 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2bg7g"
Feb 02 11:14:58 crc kubenswrapper[4901]: I0202 11:14:58.437532 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2bg7g" event={"ID":"0b9b9199-6de2-454d-90b2-bc75b443ac48","Type":"ContainerDied","Data":"d0a0316ab4ca789087a1591cf00b10d98a90a67c93630e331a2f57dea94c03d0"}
Feb 02 11:14:58 crc kubenswrapper[4901]: I0202 11:14:58.437995 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2bg7g" event={"ID":"0b9b9199-6de2-454d-90b2-bc75b443ac48","Type":"ContainerDied","Data":"8e1a0b12e763a75942c9a17c3766f47b07e2558fafbccadc5f007660302b702c"}
Feb 02 11:14:58 crc kubenswrapper[4901]: I0202 11:14:58.438048 4901 scope.go:117] "RemoveContainer" containerID="d0a0316ab4ca789087a1591cf00b10d98a90a67c93630e331a2f57dea94c03d0"
Feb 02 11:14:58 crc kubenswrapper[4901]: I0202 11:14:58.501644 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2bg7g"]
Feb 02 11:14:58 crc kubenswrapper[4901]: I0202 11:14:58.503675 4901 scope.go:117] "RemoveContainer" containerID="a86e04838326a0c5e3d00980347840a974dbca48b2188ab2ea4580544b132e40"
Feb 02 11:14:58 crc kubenswrapper[4901]: I0202 11:14:58.513984 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2bg7g"]
Feb 02 11:14:58 crc kubenswrapper[4901]: I0202 11:14:58.572090 4901 scope.go:117] "RemoveContainer" containerID="8196aa1a2031541a0329661d41162ae1d5de2942e521ae516345d4744f56c88d"
Feb 02 11:14:58 crc kubenswrapper[4901]: I0202 11:14:58.594656 4901 scope.go:117] "RemoveContainer" containerID="d0a0316ab4ca789087a1591cf00b10d98a90a67c93630e331a2f57dea94c03d0"
Feb 02 11:14:58 crc kubenswrapper[4901]: E0202 11:14:58.595307 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0a0316ab4ca789087a1591cf00b10d98a90a67c93630e331a2f57dea94c03d0\": container with ID starting with d0a0316ab4ca789087a1591cf00b10d98a90a67c93630e331a2f57dea94c03d0 not found: ID does not exist" containerID="d0a0316ab4ca789087a1591cf00b10d98a90a67c93630e331a2f57dea94c03d0"
Feb 02 11:14:58 crc kubenswrapper[4901]: I0202 11:14:58.595341 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0a0316ab4ca789087a1591cf00b10d98a90a67c93630e331a2f57dea94c03d0"} err="failed to get container status \"d0a0316ab4ca789087a1591cf00b10d98a90a67c93630e331a2f57dea94c03d0\": rpc error: code = NotFound desc = could not find container \"d0a0316ab4ca789087a1591cf00b10d98a90a67c93630e331a2f57dea94c03d0\": container with ID starting with d0a0316ab4ca789087a1591cf00b10d98a90a67c93630e331a2f57dea94c03d0 not found: ID does not exist"
Feb 02 11:14:58 crc kubenswrapper[4901]: I0202 11:14:58.595366 4901 scope.go:117] "RemoveContainer" containerID="a86e04838326a0c5e3d00980347840a974dbca48b2188ab2ea4580544b132e40"
Feb 02 11:14:58 crc kubenswrapper[4901]: E0202 11:14:58.595693 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a86e04838326a0c5e3d00980347840a974dbca48b2188ab2ea4580544b132e40\": container with ID starting with a86e04838326a0c5e3d00980347840a974dbca48b2188ab2ea4580544b132e40 not found: ID does not exist" containerID="a86e04838326a0c5e3d00980347840a974dbca48b2188ab2ea4580544b132e40"
Feb 02 11:14:58 crc kubenswrapper[4901]: I0202 11:14:58.595743 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a86e04838326a0c5e3d00980347840a974dbca48b2188ab2ea4580544b132e40"} err="failed to get container status \"a86e04838326a0c5e3d00980347840a974dbca48b2188ab2ea4580544b132e40\": rpc error: code = NotFound desc = could not find container \"a86e04838326a0c5e3d00980347840a974dbca48b2188ab2ea4580544b132e40\": container with ID starting with a86e04838326a0c5e3d00980347840a974dbca48b2188ab2ea4580544b132e40 not found: ID does not exist"
Feb 02 11:14:58 crc kubenswrapper[4901]: I0202 11:14:58.595776 4901 scope.go:117] "RemoveContainer" containerID="8196aa1a2031541a0329661d41162ae1d5de2942e521ae516345d4744f56c88d"
Feb 02 11:14:58 crc kubenswrapper[4901]: E0202 11:14:58.596200 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8196aa1a2031541a0329661d41162ae1d5de2942e521ae516345d4744f56c88d\": container with ID starting with 8196aa1a2031541a0329661d41162ae1d5de2942e521ae516345d4744f56c88d not found: ID does not exist" containerID="8196aa1a2031541a0329661d41162ae1d5de2942e521ae516345d4744f56c88d"
Feb 02 11:14:58 crc kubenswrapper[4901]: I0202 11:14:58.596237 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8196aa1a2031541a0329661d41162ae1d5de2942e521ae516345d4744f56c88d"} err="failed to get container status \"8196aa1a2031541a0329661d41162ae1d5de2942e521ae516345d4744f56c88d\": rpc error: code = NotFound desc = could not find container \"8196aa1a2031541a0329661d41162ae1d5de2942e521ae516345d4744f56c88d\": container with ID starting with 8196aa1a2031541a0329661d41162ae1d5de2942e521ae516345d4744f56c88d not found: ID does not exist"
Feb 02 11:14:59 crc kubenswrapper[4901]: I0202 11:14:59.698102 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b9b9199-6de2-454d-90b2-bc75b443ac48" path="/var/lib/kubelet/pods/0b9b9199-6de2-454d-90b2-bc75b443ac48/volumes"
Feb 02 11:15:00 crc kubenswrapper[4901]: I0202 11:15:00.155439 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500515-848bm"]
Feb 02 11:15:00 crc kubenswrapper[4901]: E0202 11:15:00.160348 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b9b9199-6de2-454d-90b2-bc75b443ac48" containerName="registry-server"
Feb 02 11:15:00 crc kubenswrapper[4901]: I0202 11:15:00.160530 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b9b9199-6de2-454d-90b2-bc75b443ac48" containerName="registry-server"
Feb 02 11:15:00 crc kubenswrapper[4901]: E0202 11:15:00.160632 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afd76dbe-3c78-4710-9e8b-ee01a23d7929" containerName="extract-content"
Feb 02 11:15:00 crc kubenswrapper[4901]: I0202 11:15:00.160729 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="afd76dbe-3c78-4710-9e8b-ee01a23d7929" containerName="extract-content"
Feb 02 11:15:00 crc kubenswrapper[4901]: E0202 11:15:00.160840 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b9b9199-6de2-454d-90b2-bc75b443ac48" containerName="extract-content"
Feb 02 11:15:00 crc kubenswrapper[4901]: I0202 11:15:00.160930 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b9b9199-6de2-454d-90b2-bc75b443ac48" containerName="extract-content"
Feb 02 11:15:00 crc kubenswrapper[4901]: E0202 11:15:00.161017 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b9b9199-6de2-454d-90b2-bc75b443ac48" containerName="extract-utilities"
Feb 02 11:15:00 crc kubenswrapper[4901]: I0202 11:15:00.161089 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b9b9199-6de2-454d-90b2-bc75b443ac48" containerName="extract-utilities"
Feb 02 11:15:00 crc kubenswrapper[4901]: E0202 11:15:00.161167 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afd76dbe-3c78-4710-9e8b-ee01a23d7929" containerName="registry-server"
Feb 02 11:15:00 crc kubenswrapper[4901]: I0202 11:15:00.161247 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="afd76dbe-3c78-4710-9e8b-ee01a23d7929" containerName="registry-server"
Feb 02 11:15:00 crc kubenswrapper[4901]: E0202 11:15:00.161328 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afd76dbe-3c78-4710-9e8b-ee01a23d7929" containerName="extract-utilities"
Feb 02 11:15:00 crc kubenswrapper[4901]: I0202 11:15:00.161416 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="afd76dbe-3c78-4710-9e8b-ee01a23d7929" containerName="extract-utilities"
Feb 02 11:15:00 crc kubenswrapper[4901]: I0202 11:15:00.161790 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="afd76dbe-3c78-4710-9e8b-ee01a23d7929" containerName="registry-server"
Feb 02 11:15:00 crc kubenswrapper[4901]: I0202 11:15:00.162117 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b9b9199-6de2-454d-90b2-bc75b443ac48" containerName="registry-server"
Feb 02 11:15:00 crc kubenswrapper[4901]: I0202 11:15:00.163246 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-848bm"
Feb 02 11:15:00 crc kubenswrapper[4901]: I0202 11:15:00.167357 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 02 11:15:00 crc kubenswrapper[4901]: I0202 11:15:00.167417 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 02 11:15:00 crc kubenswrapper[4901]: I0202 11:15:00.186878 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500515-848bm"]
Feb 02 11:15:00 crc kubenswrapper[4901]: I0202 11:15:00.270947 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ed4e44d1-9a34-4d11-b42e-755a778baf5b-secret-volume\") pod \"collect-profiles-29500515-848bm\" (UID: \"ed4e44d1-9a34-4d11-b42e-755a778baf5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-848bm"
Feb 02 11:15:00 crc kubenswrapper[4901]: I0202 11:15:00.271122 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvq8f\" (UniqueName: \"kubernetes.io/projected/ed4e44d1-9a34-4d11-b42e-755a778baf5b-kube-api-access-lvq8f\") pod \"collect-profiles-29500515-848bm\" (UID: \"ed4e44d1-9a34-4d11-b42e-755a778baf5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-848bm"
Feb 02 11:15:00 crc kubenswrapper[4901]: I0202 11:15:00.271347 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed4e44d1-9a34-4d11-b42e-755a778baf5b-config-volume\") pod \"collect-profiles-29500515-848bm\" (UID: \"ed4e44d1-9a34-4d11-b42e-755a778baf5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-848bm"
Feb 02 11:15:00 crc kubenswrapper[4901]: I0202 11:15:00.373836 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ed4e44d1-9a34-4d11-b42e-755a778baf5b-secret-volume\") pod \"collect-profiles-29500515-848bm\" (UID: \"ed4e44d1-9a34-4d11-b42e-755a778baf5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-848bm"
Feb 02 11:15:00 crc kubenswrapper[4901]: I0202 11:15:00.374186 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvq8f\" (UniqueName: \"kubernetes.io/projected/ed4e44d1-9a34-4d11-b42e-755a778baf5b-kube-api-access-lvq8f\") pod \"collect-profiles-29500515-848bm\" (UID: \"ed4e44d1-9a34-4d11-b42e-755a778baf5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-848bm"
Feb 02 11:15:00 crc kubenswrapper[4901]: I0202 11:15:00.374363 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed4e44d1-9a34-4d11-b42e-755a778baf5b-config-volume\") pod \"collect-profiles-29500515-848bm\" (UID: \"ed4e44d1-9a34-4d11-b42e-755a778baf5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-848bm"
Feb 02 11:15:00 crc kubenswrapper[4901]: I0202 11:15:00.375550 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed4e44d1-9a34-4d11-b42e-755a778baf5b-config-volume\") pod \"collect-profiles-29500515-848bm\" (UID: \"ed4e44d1-9a34-4d11-b42e-755a778baf5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-848bm"
Feb 02 11:15:00 crc kubenswrapper[4901]: I0202 11:15:00.380924 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ed4e44d1-9a34-4d11-b42e-755a778baf5b-secret-volume\") pod \"collect-profiles-29500515-848bm\" (UID: \"ed4e44d1-9a34-4d11-b42e-755a778baf5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-848bm"
Feb 02 11:15:00 crc kubenswrapper[4901]: I0202 11:15:00.398688 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvq8f\" (UniqueName: \"kubernetes.io/projected/ed4e44d1-9a34-4d11-b42e-755a778baf5b-kube-api-access-lvq8f\") pod \"collect-profiles-29500515-848bm\" (UID: \"ed4e44d1-9a34-4d11-b42e-755a778baf5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-848bm"
Feb 02 11:15:00 crc kubenswrapper[4901]: I0202 11:15:00.493129 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-848bm"
Feb 02 11:15:00 crc kubenswrapper[4901]: W0202 11:15:00.991446 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded4e44d1_9a34_4d11_b42e_755a778baf5b.slice/crio-52bcfad24bb0aecfce52bcd6425677bccb5d93648972245c403adbb4a9e43515 WatchSource:0}: Error finding container 52bcfad24bb0aecfce52bcd6425677bccb5d93648972245c403adbb4a9e43515: Status 404 returned error can't find the container with id 52bcfad24bb0aecfce52bcd6425677bccb5d93648972245c403adbb4a9e43515
Feb 02 11:15:00 crc kubenswrapper[4901]: I0202 11:15:00.992306 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500515-848bm"]
Feb 02 11:15:01 crc kubenswrapper[4901]: I0202 11:15:01.476145 4901 generic.go:334] "Generic (PLEG): container finished" podID="ed4e44d1-9a34-4d11-b42e-755a778baf5b" containerID="59a92ba08200fdedc9047b9920bc2d0837d3919d3644b88c7c652cc5758c0b04" exitCode=0
Feb 02 11:15:01 crc kubenswrapper[4901]: I0202 11:15:01.476241 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-848bm" event={"ID":"ed4e44d1-9a34-4d11-b42e-755a778baf5b","Type":"ContainerDied","Data":"59a92ba08200fdedc9047b9920bc2d0837d3919d3644b88c7c652cc5758c0b04"}
Feb 02 11:15:01 crc kubenswrapper[4901]: I0202 11:15:01.476556 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-848bm" event={"ID":"ed4e44d1-9a34-4d11-b42e-755a778baf5b","Type":"ContainerStarted","Data":"52bcfad24bb0aecfce52bcd6425677bccb5d93648972245c403adbb4a9e43515"}
Feb 02 11:15:02 crc kubenswrapper[4901]: I0202 11:15:02.913982 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-848bm"
Feb 02 11:15:03 crc kubenswrapper[4901]: I0202 11:15:03.045219 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ed4e44d1-9a34-4d11-b42e-755a778baf5b-secret-volume\") pod \"ed4e44d1-9a34-4d11-b42e-755a778baf5b\" (UID: \"ed4e44d1-9a34-4d11-b42e-755a778baf5b\") "
Feb 02 11:15:03 crc kubenswrapper[4901]: I0202 11:15:03.045288 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed4e44d1-9a34-4d11-b42e-755a778baf5b-config-volume\") pod \"ed4e44d1-9a34-4d11-b42e-755a778baf5b\" (UID: \"ed4e44d1-9a34-4d11-b42e-755a778baf5b\") "
Feb 02 11:15:03 crc kubenswrapper[4901]: I0202 11:15:03.045356 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvq8f\" (UniqueName: \"kubernetes.io/projected/ed4e44d1-9a34-4d11-b42e-755a778baf5b-kube-api-access-lvq8f\") pod \"ed4e44d1-9a34-4d11-b42e-755a778baf5b\" (UID: \"ed4e44d1-9a34-4d11-b42e-755a778baf5b\") "
Feb 02 11:15:03 crc kubenswrapper[4901]: I0202 11:15:03.046884 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed4e44d1-9a34-4d11-b42e-755a778baf5b-config-volume" (OuterVolumeSpecName: "config-volume") pod "ed4e44d1-9a34-4d11-b42e-755a778baf5b" (UID: "ed4e44d1-9a34-4d11-b42e-755a778baf5b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 11:15:03 crc kubenswrapper[4901]: I0202 11:15:03.053524 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed4e44d1-9a34-4d11-b42e-755a778baf5b-kube-api-access-lvq8f" (OuterVolumeSpecName: "kube-api-access-lvq8f") pod "ed4e44d1-9a34-4d11-b42e-755a778baf5b" (UID: "ed4e44d1-9a34-4d11-b42e-755a778baf5b"). InnerVolumeSpecName "kube-api-access-lvq8f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 11:15:03 crc kubenswrapper[4901]: I0202 11:15:03.053838 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed4e44d1-9a34-4d11-b42e-755a778baf5b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ed4e44d1-9a34-4d11-b42e-755a778baf5b" (UID: "ed4e44d1-9a34-4d11-b42e-755a778baf5b"). InnerVolumeSpecName "secret-volume".
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:15:03 crc kubenswrapper[4901]: I0202 11:15:03.148354 4901 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ed4e44d1-9a34-4d11-b42e-755a778baf5b-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 02 11:15:03 crc kubenswrapper[4901]: I0202 11:15:03.148828 4901 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed4e44d1-9a34-4d11-b42e-755a778baf5b-config-volume\") on node \"crc\" DevicePath \"\"" Feb 02 11:15:03 crc kubenswrapper[4901]: I0202 11:15:03.148841 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvq8f\" (UniqueName: \"kubernetes.io/projected/ed4e44d1-9a34-4d11-b42e-755a778baf5b-kube-api-access-lvq8f\") on node \"crc\" DevicePath \"\"" Feb 02 11:15:03 crc kubenswrapper[4901]: I0202 11:15:03.502429 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-848bm" event={"ID":"ed4e44d1-9a34-4d11-b42e-755a778baf5b","Type":"ContainerDied","Data":"52bcfad24bb0aecfce52bcd6425677bccb5d93648972245c403adbb4a9e43515"} Feb 02 11:15:03 crc kubenswrapper[4901]: I0202 11:15:03.502501 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52bcfad24bb0aecfce52bcd6425677bccb5d93648972245c403adbb4a9e43515" Feb 02 11:15:03 crc kubenswrapper[4901]: I0202 11:15:03.502609 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-848bm" Feb 02 11:15:03 crc kubenswrapper[4901]: I0202 11:15:03.999758 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500470-46wx7"] Feb 02 11:15:04 crc kubenswrapper[4901]: I0202 11:15:04.013823 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500470-46wx7"] Feb 02 11:15:05 crc kubenswrapper[4901]: I0202 11:15:05.695893 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a24390c-720d-4e6b-b3d7-a12eab3d72a6" path="/var/lib/kubelet/pods/8a24390c-720d-4e6b-b3d7-a12eab3d72a6/volumes" Feb 02 11:15:07 crc kubenswrapper[4901]: I0202 11:15:07.837962 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:15:07 crc kubenswrapper[4901]: I0202 11:15:07.838479 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:15:07 crc kubenswrapper[4901]: I0202 11:15:07.838546 4901 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" Feb 02 11:15:07 crc kubenswrapper[4901]: I0202 11:15:07.839781 4901 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d821fb60275f1396819dc073d4089937d058e706c3f293a4c0eb260496800e67"} 
pod="openshift-machine-config-operator/machine-config-daemon-f29d8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 11:15:07 crc kubenswrapper[4901]: I0202 11:15:07.839839 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" containerID="cri-o://d821fb60275f1396819dc073d4089937d058e706c3f293a4c0eb260496800e67" gracePeriod=600 Feb 02 11:15:08 crc kubenswrapper[4901]: I0202 11:15:08.555629 4901 generic.go:334] "Generic (PLEG): container finished" podID="756c113d-5d5e-424e-bdf5-494b7774def6" containerID="d821fb60275f1396819dc073d4089937d058e706c3f293a4c0eb260496800e67" exitCode=0 Feb 02 11:15:08 crc kubenswrapper[4901]: I0202 11:15:08.556042 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" event={"ID":"756c113d-5d5e-424e-bdf5-494b7774def6","Type":"ContainerDied","Data":"d821fb60275f1396819dc073d4089937d058e706c3f293a4c0eb260496800e67"} Feb 02 11:15:08 crc kubenswrapper[4901]: I0202 11:15:08.556151 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" event={"ID":"756c113d-5d5e-424e-bdf5-494b7774def6","Type":"ContainerStarted","Data":"f2318f5b8900b7c5448368eef0323ed7f8ee4fc4ecf7912865ae62c7f0cf9d8b"} Feb 02 11:15:08 crc kubenswrapper[4901]: I0202 11:15:08.556182 4901 scope.go:117] "RemoveContainer" containerID="ffa74db446f03ef95c563afefd62346074442a8b53dd231baa8c9ad890a3defd" Feb 02 11:15:09 crc kubenswrapper[4901]: I0202 11:15:09.496996 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zd76k"] Feb 02 11:15:09 crc kubenswrapper[4901]: E0202 11:15:09.504629 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed4e44d1-9a34-4d11-b42e-755a778baf5b" containerName="collect-profiles" Feb 02 11:15:09 crc kubenswrapper[4901]: I0202 11:15:09.505968 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed4e44d1-9a34-4d11-b42e-755a778baf5b" containerName="collect-profiles" Feb 02 11:15:09 crc kubenswrapper[4901]: I0202 11:15:09.508878 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed4e44d1-9a34-4d11-b42e-755a778baf5b" containerName="collect-profiles" Feb 02 11:15:09 crc kubenswrapper[4901]: I0202 11:15:09.523891 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zd76k" Feb 02 11:15:09 crc kubenswrapper[4901]: I0202 11:15:09.535555 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zd76k"] Feb 02 11:15:09 crc kubenswrapper[4901]: I0202 11:15:09.626659 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfm68\" (UniqueName: \"kubernetes.io/projected/d10153e6-bfc4-4e68-b121-90166d53c259-kube-api-access-zfm68\") pod \"redhat-operators-zd76k\" (UID: \"d10153e6-bfc4-4e68-b121-90166d53c259\") " pod="openshift-marketplace/redhat-operators-zd76k" Feb 02 11:15:09 crc kubenswrapper[4901]: I0202 11:15:09.626752 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d10153e6-bfc4-4e68-b121-90166d53c259-utilities\") pod \"redhat-operators-zd76k\" (UID: \"d10153e6-bfc4-4e68-b121-90166d53c259\") " pod="openshift-marketplace/redhat-operators-zd76k" Feb 02 11:15:09 crc kubenswrapper[4901]: I0202 11:15:09.626951 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d10153e6-bfc4-4e68-b121-90166d53c259-catalog-content\") pod \"redhat-operators-zd76k\" (UID: \"d10153e6-bfc4-4e68-b121-90166d53c259\") " pod="openshift-marketplace/redhat-operators-zd76k" Feb 02 11:15:09 crc kubenswrapper[4901]: I0202 11:15:09.728883 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d10153e6-bfc4-4e68-b121-90166d53c259-catalog-content\") pod \"redhat-operators-zd76k\" (UID: \"d10153e6-bfc4-4e68-b121-90166d53c259\") " pod="openshift-marketplace/redhat-operators-zd76k" Feb 02 11:15:09 crc kubenswrapper[4901]: I0202 11:15:09.729794 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfm68\" (UniqueName: \"kubernetes.io/projected/d10153e6-bfc4-4e68-b121-90166d53c259-kube-api-access-zfm68\") pod \"redhat-operators-zd76k\" (UID: \"d10153e6-bfc4-4e68-b121-90166d53c259\") " pod="openshift-marketplace/redhat-operators-zd76k" Feb 02 11:15:09 crc kubenswrapper[4901]: I0202 11:15:09.729871 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d10153e6-bfc4-4e68-b121-90166d53c259-utilities\") pod \"redhat-operators-zd76k\" (UID: \"d10153e6-bfc4-4e68-b121-90166d53c259\") " pod="openshift-marketplace/redhat-operators-zd76k" Feb 02 11:15:09 crc kubenswrapper[4901]: I0202 11:15:09.730964 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d10153e6-bfc4-4e68-b121-90166d53c259-utilities\") pod \"redhat-operators-zd76k\" (UID: \"d10153e6-bfc4-4e68-b121-90166d53c259\") " pod="openshift-marketplace/redhat-operators-zd76k" Feb 02 11:15:09 crc kubenswrapper[4901]: I0202 11:15:09.731327 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d10153e6-bfc4-4e68-b121-90166d53c259-catalog-content\") pod \"redhat-operators-zd76k\" (UID: \"d10153e6-bfc4-4e68-b121-90166d53c259\") " pod="openshift-marketplace/redhat-operators-zd76k" Feb 02 11:15:09 crc kubenswrapper[4901]: I0202 11:15:09.762456 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-zfm68\" (UniqueName: \"kubernetes.io/projected/d10153e6-bfc4-4e68-b121-90166d53c259-kube-api-access-zfm68\") pod \"redhat-operators-zd76k\" (UID: \"d10153e6-bfc4-4e68-b121-90166d53c259\") " pod="openshift-marketplace/redhat-operators-zd76k" Feb 02 11:15:09 crc kubenswrapper[4901]: I0202 11:15:09.872938 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zd76k" Feb 02 11:15:10 crc kubenswrapper[4901]: I0202 11:15:10.424023 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zd76k"] Feb 02 11:15:10 crc kubenswrapper[4901]: I0202 11:15:10.612610 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zd76k" event={"ID":"d10153e6-bfc4-4e68-b121-90166d53c259","Type":"ContainerStarted","Data":"63c8416e69fe59154c277ad784d8d7374d7a6752aaf75c4b9fdde94292daecf7"} Feb 02 11:15:11 crc kubenswrapper[4901]: I0202 11:15:11.623232 4901 generic.go:334] "Generic (PLEG): container finished" podID="d10153e6-bfc4-4e68-b121-90166d53c259" containerID="6ea90a6e4a827a4d02a1139c4f1c8549c10a43389479d2de0af0538dd0272f80" exitCode=0 Feb 02 11:15:11 crc kubenswrapper[4901]: I0202 11:15:11.623322 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zd76k" event={"ID":"d10153e6-bfc4-4e68-b121-90166d53c259","Type":"ContainerDied","Data":"6ea90a6e4a827a4d02a1139c4f1c8549c10a43389479d2de0af0538dd0272f80"} Feb 02 11:15:13 crc kubenswrapper[4901]: I0202 11:15:13.653550 4901 generic.go:334] "Generic (PLEG): container finished" podID="d10153e6-bfc4-4e68-b121-90166d53c259" containerID="b9ed6406b79920d890e0ade228df9cbe120460a84c3ed898a695bbe53fbb4876" exitCode=0 Feb 02 11:15:13 crc kubenswrapper[4901]: I0202 11:15:13.653720 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zd76k" event={"ID":"d10153e6-bfc4-4e68-b121-90166d53c259","Type":"ContainerDied","Data":"b9ed6406b79920d890e0ade228df9cbe120460a84c3ed898a695bbe53fbb4876"} Feb 02 11:15:14 crc kubenswrapper[4901]: I0202 11:15:14.668096 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zd76k" event={"ID":"d10153e6-bfc4-4e68-b121-90166d53c259","Type":"ContainerStarted","Data":"8a6c40244353e75b2c08cdf91c05aedf47b0c63915b306af1dd794d75e2dfcf9"} Feb 02 11:15:14 crc kubenswrapper[4901]: I0202 11:15:14.698643 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zd76k" podStartSLOduration=3.137291281 podStartE2EDuration="5.698619484s" podCreationTimestamp="2026-02-02 11:15:09 +0000 UTC" firstStartedPulling="2026-02-02 11:15:11.625387962 +0000 UTC m=+2198.643728058" lastFinishedPulling="2026-02-02 11:15:14.186716165 +0000 UTC m=+2201.205056261" observedRunningTime="2026-02-02 11:15:14.692475865 +0000 UTC m=+2201.710815971" watchObservedRunningTime="2026-02-02 11:15:14.698619484 +0000 UTC m=+2201.716959570" Feb 02 11:15:19 crc kubenswrapper[4901]: I0202 11:15:19.874116 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zd76k" Feb 02 11:15:19 crc kubenswrapper[4901]: I0202 11:15:19.875000 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zd76k" Feb 02 11:15:20 crc kubenswrapper[4901]: I0202 11:15:20.928953 4901 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-zd76k" podUID="d10153e6-bfc4-4e68-b121-90166d53c259" containerName="registry-server" probeResult="failure" output=< Feb 02 11:15:20 crc kubenswrapper[4901]: timeout: failed to connect service ":50051" within 1s Feb 02 11:15:20 crc kubenswrapper[4901]: > Feb 02 11:15:29 crc kubenswrapper[4901]: I0202 11:15:29.944011 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zd76k" Feb 02 11:15:30 crc kubenswrapper[4901]: I0202 11:15:30.000120 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zd76k" Feb 02 11:15:30 crc kubenswrapper[4901]: I0202 11:15:30.198876 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zd76k"] Feb 02 11:15:31 crc kubenswrapper[4901]: I0202 11:15:31.857180 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zd76k" podUID="d10153e6-bfc4-4e68-b121-90166d53c259" containerName="registry-server" containerID="cri-o://8a6c40244353e75b2c08cdf91c05aedf47b0c63915b306af1dd794d75e2dfcf9" gracePeriod=2 Feb 02 11:15:32 crc kubenswrapper[4901]: I0202 11:15:32.343895 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zd76k" Feb 02 11:15:32 crc kubenswrapper[4901]: I0202 11:15:32.526040 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d10153e6-bfc4-4e68-b121-90166d53c259-catalog-content\") pod \"d10153e6-bfc4-4e68-b121-90166d53c259\" (UID: \"d10153e6-bfc4-4e68-b121-90166d53c259\") " Feb 02 11:15:32 crc kubenswrapper[4901]: I0202 11:15:32.526168 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d10153e6-bfc4-4e68-b121-90166d53c259-utilities\") pod \"d10153e6-bfc4-4e68-b121-90166d53c259\" (UID: \"d10153e6-bfc4-4e68-b121-90166d53c259\") " Feb 02 11:15:32 crc kubenswrapper[4901]: I0202 11:15:32.526233 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfm68\" (UniqueName: \"kubernetes.io/projected/d10153e6-bfc4-4e68-b121-90166d53c259-kube-api-access-zfm68\") pod \"d10153e6-bfc4-4e68-b121-90166d53c259\" (UID: \"d10153e6-bfc4-4e68-b121-90166d53c259\") " Feb 02 11:15:32 crc kubenswrapper[4901]: I0202 11:15:32.528407 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d10153e6-bfc4-4e68-b121-90166d53c259-utilities" (OuterVolumeSpecName: "utilities") pod "d10153e6-bfc4-4e68-b121-90166d53c259" (UID: "d10153e6-bfc4-4e68-b121-90166d53c259"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:15:32 crc kubenswrapper[4901]: I0202 11:15:32.536392 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d10153e6-bfc4-4e68-b121-90166d53c259-kube-api-access-zfm68" (OuterVolumeSpecName: "kube-api-access-zfm68") pod "d10153e6-bfc4-4e68-b121-90166d53c259" (UID: "d10153e6-bfc4-4e68-b121-90166d53c259"). InnerVolumeSpecName "kube-api-access-zfm68". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:15:32 crc kubenswrapper[4901]: I0202 11:15:32.635805 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d10153e6-bfc4-4e68-b121-90166d53c259-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:15:32 crc kubenswrapper[4901]: I0202 11:15:32.635863 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfm68\" (UniqueName: \"kubernetes.io/projected/d10153e6-bfc4-4e68-b121-90166d53c259-kube-api-access-zfm68\") on node \"crc\" DevicePath \"\"" Feb 02 11:15:32 crc kubenswrapper[4901]: I0202 11:15:32.668265 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d10153e6-bfc4-4e68-b121-90166d53c259-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d10153e6-bfc4-4e68-b121-90166d53c259" (UID: "d10153e6-bfc4-4e68-b121-90166d53c259"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:15:32 crc kubenswrapper[4901]: I0202 11:15:32.737604 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d10153e6-bfc4-4e68-b121-90166d53c259-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:15:32 crc kubenswrapper[4901]: I0202 11:15:32.873126 4901 generic.go:334] "Generic (PLEG): container finished" podID="d10153e6-bfc4-4e68-b121-90166d53c259" containerID="8a6c40244353e75b2c08cdf91c05aedf47b0c63915b306af1dd794d75e2dfcf9" exitCode=0 Feb 02 11:15:32 crc kubenswrapper[4901]: I0202 11:15:32.873190 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zd76k" event={"ID":"d10153e6-bfc4-4e68-b121-90166d53c259","Type":"ContainerDied","Data":"8a6c40244353e75b2c08cdf91c05aedf47b0c63915b306af1dd794d75e2dfcf9"} Feb 02 11:15:32 crc kubenswrapper[4901]: I0202 11:15:32.873223 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zd76k" event={"ID":"d10153e6-bfc4-4e68-b121-90166d53c259","Type":"ContainerDied","Data":"63c8416e69fe59154c277ad784d8d7374d7a6752aaf75c4b9fdde94292daecf7"} Feb 02 11:15:32 crc kubenswrapper[4901]: I0202 11:15:32.873243 4901 scope.go:117] "RemoveContainer" containerID="8a6c40244353e75b2c08cdf91c05aedf47b0c63915b306af1dd794d75e2dfcf9" Feb 02 11:15:32 crc kubenswrapper[4901]: I0202 11:15:32.873404 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zd76k" Feb 02 11:15:32 crc kubenswrapper[4901]: I0202 11:15:32.921369 4901 scope.go:117] "RemoveContainer" containerID="b9ed6406b79920d890e0ade228df9cbe120460a84c3ed898a695bbe53fbb4876" Feb 02 11:15:32 crc kubenswrapper[4901]: I0202 11:15:32.923581 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zd76k"] Feb 02 11:15:32 crc kubenswrapper[4901]: I0202 11:15:32.933540 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zd76k"] Feb 02 11:15:32 crc kubenswrapper[4901]: I0202 11:15:32.956124 4901 scope.go:117] "RemoveContainer" containerID="6ea90a6e4a827a4d02a1139c4f1c8549c10a43389479d2de0af0538dd0272f80" Feb 02 11:15:33 crc kubenswrapper[4901]: I0202 11:15:33.009908 4901 scope.go:117] "RemoveContainer" containerID="8a6c40244353e75b2c08cdf91c05aedf47b0c63915b306af1dd794d75e2dfcf9" Feb 02 11:15:33 crc kubenswrapper[4901]: E0202 11:15:33.010702 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a6c40244353e75b2c08cdf91c05aedf47b0c63915b306af1dd794d75e2dfcf9\": container with ID starting with 8a6c40244353e75b2c08cdf91c05aedf47b0c63915b306af1dd794d75e2dfcf9 not found: ID does not exist" containerID="8a6c40244353e75b2c08cdf91c05aedf47b0c63915b306af1dd794d75e2dfcf9" Feb 02 11:15:33 crc kubenswrapper[4901]: I0202 11:15:33.010758 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a6c40244353e75b2c08cdf91c05aedf47b0c63915b306af1dd794d75e2dfcf9"} err="failed to get container status \"8a6c40244353e75b2c08cdf91c05aedf47b0c63915b306af1dd794d75e2dfcf9\": rpc error: code = NotFound desc = could not find container \"8a6c40244353e75b2c08cdf91c05aedf47b0c63915b306af1dd794d75e2dfcf9\": container with ID starting with 8a6c40244353e75b2c08cdf91c05aedf47b0c63915b306af1dd794d75e2dfcf9 not found: ID does not exist" Feb 02 11:15:33 crc kubenswrapper[4901]: I0202 11:15:33.010797 4901 scope.go:117] "RemoveContainer" containerID="b9ed6406b79920d890e0ade228df9cbe120460a84c3ed898a695bbe53fbb4876" Feb 02 11:15:33 crc kubenswrapper[4901]: E0202 11:15:33.011705 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9ed6406b79920d890e0ade228df9cbe120460a84c3ed898a695bbe53fbb4876\": container with ID starting with b9ed6406b79920d890e0ade228df9cbe120460a84c3ed898a695bbe53fbb4876 not found: ID does not exist" containerID="b9ed6406b79920d890e0ade228df9cbe120460a84c3ed898a695bbe53fbb4876" Feb 02 11:15:33 crc kubenswrapper[4901]: I0202 11:15:33.011809 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9ed6406b79920d890e0ade228df9cbe120460a84c3ed898a695bbe53fbb4876"} err="failed to get container status \"b9ed6406b79920d890e0ade228df9cbe120460a84c3ed898a695bbe53fbb4876\": rpc error: code = NotFound desc = could not find container \"b9ed6406b79920d890e0ade228df9cbe120460a84c3ed898a695bbe53fbb4876\": container with ID starting with b9ed6406b79920d890e0ade228df9cbe120460a84c3ed898a695bbe53fbb4876 not found: ID does not exist" Feb 02 11:15:33 crc kubenswrapper[4901]: I0202 11:15:33.011845 4901 scope.go:117] "RemoveContainer" containerID="6ea90a6e4a827a4d02a1139c4f1c8549c10a43389479d2de0af0538dd0272f80" Feb 02 11:15:33 crc kubenswrapper[4901]: E0202 11:15:33.012208 4901 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"6ea90a6e4a827a4d02a1139c4f1c8549c10a43389479d2de0af0538dd0272f80\": container with ID starting with 6ea90a6e4a827a4d02a1139c4f1c8549c10a43389479d2de0af0538dd0272f80 not found: ID does not exist" containerID="6ea90a6e4a827a4d02a1139c4f1c8549c10a43389479d2de0af0538dd0272f80" Feb 02 11:15:33 crc kubenswrapper[4901]: I0202 11:15:33.012240 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ea90a6e4a827a4d02a1139c4f1c8549c10a43389479d2de0af0538dd0272f80"} err="failed to get container status \"6ea90a6e4a827a4d02a1139c4f1c8549c10a43389479d2de0af0538dd0272f80\": rpc error: code = NotFound desc = could not find container \"6ea90a6e4a827a4d02a1139c4f1c8549c10a43389479d2de0af0538dd0272f80\": container with ID starting with 6ea90a6e4a827a4d02a1139c4f1c8549c10a43389479d2de0af0538dd0272f80 not found: ID does not exist" Feb 02 11:15:33 crc kubenswrapper[4901]: I0202 11:15:33.711959 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d10153e6-bfc4-4e68-b121-90166d53c259" path="/var/lib/kubelet/pods/d10153e6-bfc4-4e68-b121-90166d53c259/volumes" Feb 02 11:15:43 crc kubenswrapper[4901]: I0202 11:15:43.270172 4901 scope.go:117] "RemoveContainer" containerID="2b239683c821bfe191569f242fe16d844f67e4ac9864792d6f6ba4778e203ae7" Feb 02 11:16:00 crc kubenswrapper[4901]: I0202 11:16:00.714058 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fdc2m"] Feb 02 11:16:00 crc kubenswrapper[4901]: E0202 11:16:00.715556 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d10153e6-bfc4-4e68-b121-90166d53c259" containerName="extract-content" Feb 02 11:16:00 crc kubenswrapper[4901]: I0202 11:16:00.715609 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="d10153e6-bfc4-4e68-b121-90166d53c259" containerName="extract-content" Feb 02 11:16:00 crc kubenswrapper[4901]: E0202 11:16:00.715627 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d10153e6-bfc4-4e68-b121-90166d53c259" containerName="extract-utilities" Feb 02 11:16:00 crc kubenswrapper[4901]: I0202 11:16:00.715637 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="d10153e6-bfc4-4e68-b121-90166d53c259" containerName="extract-utilities" Feb 02 11:16:00 crc kubenswrapper[4901]: E0202 11:16:00.715658 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d10153e6-bfc4-4e68-b121-90166d53c259" containerName="registry-server" Feb 02 11:16:00 crc kubenswrapper[4901]: I0202 11:16:00.715667 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="d10153e6-bfc4-4e68-b121-90166d53c259" containerName="registry-server" Feb 02 11:16:00 crc kubenswrapper[4901]: I0202 11:16:00.715949 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="d10153e6-bfc4-4e68-b121-90166d53c259" containerName="registry-server" Feb 02 11:16:00 crc kubenswrapper[4901]: I0202 11:16:00.717846 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fdc2m" Feb 02 11:16:00 crc kubenswrapper[4901]: I0202 11:16:00.729317 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fdc2m"] Feb 02 11:16:00 crc kubenswrapper[4901]: I0202 11:16:00.803450 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e67e19f7-23de-4a89-b745-10d47f0d7495-utilities\") pod \"redhat-marketplace-fdc2m\" (UID: \"e67e19f7-23de-4a89-b745-10d47f0d7495\") " pod="openshift-marketplace/redhat-marketplace-fdc2m" Feb 02 11:16:00 crc kubenswrapper[4901]: I0202 11:16:00.803525 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e67e19f7-23de-4a89-b745-10d47f0d7495-catalog-content\") pod \"redhat-marketplace-fdc2m\" (UID: \"e67e19f7-23de-4a89-b745-10d47f0d7495\") " pod="openshift-marketplace/redhat-marketplace-fdc2m" Feb 02 11:16:00 crc kubenswrapper[4901]: I0202 11:16:00.803719 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm9dw\" (UniqueName: \"kubernetes.io/projected/e67e19f7-23de-4a89-b745-10d47f0d7495-kube-api-access-wm9dw\") pod \"redhat-marketplace-fdc2m\" (UID: \"e67e19f7-23de-4a89-b745-10d47f0d7495\") " pod="openshift-marketplace/redhat-marketplace-fdc2m" Feb 02 11:16:00 crc kubenswrapper[4901]: I0202 11:16:00.905653 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e67e19f7-23de-4a89-b745-10d47f0d7495-utilities\") pod \"redhat-marketplace-fdc2m\" (UID: \"e67e19f7-23de-4a89-b745-10d47f0d7495\") " pod="openshift-marketplace/redhat-marketplace-fdc2m" Feb 02 11:16:00 crc kubenswrapper[4901]: I0202 11:16:00.905723 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e67e19f7-23de-4a89-b745-10d47f0d7495-catalog-content\") pod \"redhat-marketplace-fdc2m\" (UID: \"e67e19f7-23de-4a89-b745-10d47f0d7495\") " pod="openshift-marketplace/redhat-marketplace-fdc2m" Feb 02 11:16:00 crc kubenswrapper[4901]: I0202 11:16:00.905858 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm9dw\" (UniqueName: \"kubernetes.io/projected/e67e19f7-23de-4a89-b745-10d47f0d7495-kube-api-access-wm9dw\") pod \"redhat-marketplace-fdc2m\" (UID: \"e67e19f7-23de-4a89-b745-10d47f0d7495\") " pod="openshift-marketplace/redhat-marketplace-fdc2m" Feb 02 11:16:00 crc kubenswrapper[4901]: I0202 11:16:00.906168 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e67e19f7-23de-4a89-b745-10d47f0d7495-utilities\") pod \"redhat-marketplace-fdc2m\" (UID: \"e67e19f7-23de-4a89-b745-10d47f0d7495\") " pod="openshift-marketplace/redhat-marketplace-fdc2m" Feb 02 11:16:00 crc kubenswrapper[4901]: I0202 11:16:00.906601 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e67e19f7-23de-4a89-b745-10d47f0d7495-catalog-content\") pod \"redhat-marketplace-fdc2m\" (UID: \"e67e19f7-23de-4a89-b745-10d47f0d7495\") " pod="openshift-marketplace/redhat-marketplace-fdc2m" Feb 02 11:16:00 crc kubenswrapper[4901]: I0202 11:16:00.936039 4901 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-wm9dw\" (UniqueName: \"kubernetes.io/projected/e67e19f7-23de-4a89-b745-10d47f0d7495-kube-api-access-wm9dw\") pod \"redhat-marketplace-fdc2m\" (UID: \"e67e19f7-23de-4a89-b745-10d47f0d7495\") " pod="openshift-marketplace/redhat-marketplace-fdc2m" Feb 02 11:16:01 crc kubenswrapper[4901]: I0202 11:16:01.042178 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fdc2m" Feb 02 11:16:01 crc kubenswrapper[4901]: I0202 11:16:01.640496 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fdc2m"] Feb 02 11:16:02 crc kubenswrapper[4901]: I0202 11:16:02.265822 4901 generic.go:334] "Generic (PLEG): container finished" podID="e67e19f7-23de-4a89-b745-10d47f0d7495" containerID="65a59c51d7c38c38f29b12d765050ddb7486824c2424eebb8a933b9aacbb296a" exitCode=0 Feb 02 11:16:02 crc kubenswrapper[4901]: I0202 11:16:02.265925 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fdc2m" event={"ID":"e67e19f7-23de-4a89-b745-10d47f0d7495","Type":"ContainerDied","Data":"65a59c51d7c38c38f29b12d765050ddb7486824c2424eebb8a933b9aacbb296a"} Feb 02 11:16:02 crc kubenswrapper[4901]: I0202 11:16:02.266280 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fdc2m" event={"ID":"e67e19f7-23de-4a89-b745-10d47f0d7495","Type":"ContainerStarted","Data":"6667b7c42006a551502fa5f4cbaf752c783272f129449ef1979d494557e18129"} Feb 02 11:16:03 crc kubenswrapper[4901]: I0202 11:16:03.280142 4901 generic.go:334] "Generic (PLEG): container finished" podID="e67e19f7-23de-4a89-b745-10d47f0d7495" containerID="dd77142de101fc8f6604f5e161f98209a3973296a8669a4358341e52280bfbd6" exitCode=0 Feb 02 11:16:03 crc kubenswrapper[4901]: I0202 11:16:03.280227 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fdc2m" event={"ID":"e67e19f7-23de-4a89-b745-10d47f0d7495","Type":"ContainerDied","Data":"dd77142de101fc8f6604f5e161f98209a3973296a8669a4358341e52280bfbd6"} Feb 02 11:16:04 crc kubenswrapper[4901]: I0202 11:16:04.295068 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fdc2m" event={"ID":"e67e19f7-23de-4a89-b745-10d47f0d7495","Type":"ContainerStarted","Data":"6a3fec7c0a9d3919b2e078655af0c43d7d87e9bc24ac290cde8971f37fd47790"} Feb 02 11:16:04 crc kubenswrapper[4901]: I0202 11:16:04.320003 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fdc2m" podStartSLOduration=2.863459942 podStartE2EDuration="4.319977452s" podCreationTimestamp="2026-02-02 11:16:00 +0000 UTC" firstStartedPulling="2026-02-02 11:16:02.267433119 +0000 UTC m=+2249.285773215" lastFinishedPulling="2026-02-02 11:16:03.723950629 +0000 UTC m=+2250.742290725" observedRunningTime="2026-02-02 11:16:04.319357248 +0000 UTC m=+2251.337697364" watchObservedRunningTime="2026-02-02 11:16:04.319977452 +0000 UTC m=+2251.338317548" Feb 02 11:16:11 crc kubenswrapper[4901]: I0202 11:16:11.042486 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fdc2m" Feb 02 11:16:11 crc kubenswrapper[4901]: I0202 11:16:11.043298 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fdc2m" Feb 02 11:16:11 crc kubenswrapper[4901]: I0202 11:16:11.099243 4901 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fdc2m" Feb 02 11:16:11 crc kubenswrapper[4901]: I0202 11:16:11.440944 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fdc2m" Feb 02 11:16:11 crc kubenswrapper[4901]: I0202 11:16:11.515808 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fdc2m"] Feb 02 11:16:13 crc kubenswrapper[4901]: I0202 11:16:13.393928 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fdc2m" podUID="e67e19f7-23de-4a89-b745-10d47f0d7495" containerName="registry-server" containerID="cri-o://6a3fec7c0a9d3919b2e078655af0c43d7d87e9bc24ac290cde8971f37fd47790" gracePeriod=2 Feb 02 11:16:13 crc kubenswrapper[4901]: I0202 11:16:13.926219 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fdc2m" Feb 02 11:16:14 crc kubenswrapper[4901]: I0202 11:16:14.053795 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e67e19f7-23de-4a89-b745-10d47f0d7495-utilities\") pod \"e67e19f7-23de-4a89-b745-10d47f0d7495\" (UID: \"e67e19f7-23de-4a89-b745-10d47f0d7495\") " Feb 02 11:16:14 crc kubenswrapper[4901]: I0202 11:16:14.054165 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wm9dw\" (UniqueName: \"kubernetes.io/projected/e67e19f7-23de-4a89-b745-10d47f0d7495-kube-api-access-wm9dw\") pod \"e67e19f7-23de-4a89-b745-10d47f0d7495\" (UID: \"e67e19f7-23de-4a89-b745-10d47f0d7495\") " Feb 02 11:16:14 crc kubenswrapper[4901]: I0202 11:16:14.054298 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e67e19f7-23de-4a89-b745-10d47f0d7495-catalog-content\") pod \"e67e19f7-23de-4a89-b745-10d47f0d7495\" (UID: \"e67e19f7-23de-4a89-b745-10d47f0d7495\") " Feb 02 11:16:14 crc kubenswrapper[4901]: I0202 11:16:14.054961 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e67e19f7-23de-4a89-b745-10d47f0d7495-utilities" (OuterVolumeSpecName: "utilities") pod "e67e19f7-23de-4a89-b745-10d47f0d7495" (UID: "e67e19f7-23de-4a89-b745-10d47f0d7495"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:16:14 crc kubenswrapper[4901]: I0202 11:16:14.056660 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e67e19f7-23de-4a89-b745-10d47f0d7495-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:16:14 crc kubenswrapper[4901]: I0202 11:16:14.062074 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e67e19f7-23de-4a89-b745-10d47f0d7495-kube-api-access-wm9dw" (OuterVolumeSpecName: "kube-api-access-wm9dw") pod "e67e19f7-23de-4a89-b745-10d47f0d7495" (UID: "e67e19f7-23de-4a89-b745-10d47f0d7495"). InnerVolumeSpecName "kube-api-access-wm9dw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:16:14 crc kubenswrapper[4901]: I0202 11:16:14.081552 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e67e19f7-23de-4a89-b745-10d47f0d7495-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e67e19f7-23de-4a89-b745-10d47f0d7495" (UID: "e67e19f7-23de-4a89-b745-10d47f0d7495"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:16:14 crc kubenswrapper[4901]: I0202 11:16:14.158925 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e67e19f7-23de-4a89-b745-10d47f0d7495-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:16:14 crc kubenswrapper[4901]: I0202 11:16:14.159185 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wm9dw\" (UniqueName: \"kubernetes.io/projected/e67e19f7-23de-4a89-b745-10d47f0d7495-kube-api-access-wm9dw\") on node \"crc\" DevicePath \"\"" Feb 02 11:16:14 crc kubenswrapper[4901]: I0202 11:16:14.405570 4901 generic.go:334] "Generic (PLEG): container finished" podID="e67e19f7-23de-4a89-b745-10d47f0d7495" containerID="6a3fec7c0a9d3919b2e078655af0c43d7d87e9bc24ac290cde8971f37fd47790" exitCode=0 Feb 02 11:16:14 crc kubenswrapper[4901]: I0202 11:16:14.405639 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fdc2m" Feb 02 11:16:14 crc kubenswrapper[4901]: I0202 11:16:14.405630 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fdc2m" event={"ID":"e67e19f7-23de-4a89-b745-10d47f0d7495","Type":"ContainerDied","Data":"6a3fec7c0a9d3919b2e078655af0c43d7d87e9bc24ac290cde8971f37fd47790"} Feb 02 11:16:14 crc kubenswrapper[4901]: I0202 11:16:14.406749 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fdc2m" event={"ID":"e67e19f7-23de-4a89-b745-10d47f0d7495","Type":"ContainerDied","Data":"6667b7c42006a551502fa5f4cbaf752c783272f129449ef1979d494557e18129"} Feb 02 11:16:14 crc kubenswrapper[4901]: I0202 11:16:14.406775 4901 scope.go:117] "RemoveContainer" containerID="6a3fec7c0a9d3919b2e078655af0c43d7d87e9bc24ac290cde8971f37fd47790" Feb 02 11:16:14 crc kubenswrapper[4901]: I0202 11:16:14.450838 4901 scope.go:117] "RemoveContainer" containerID="dd77142de101fc8f6604f5e161f98209a3973296a8669a4358341e52280bfbd6" Feb 02 11:16:14 crc kubenswrapper[4901]: I0202 11:16:14.455290 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fdc2m"] Feb 02 11:16:14 crc kubenswrapper[4901]: I0202 11:16:14.471199 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fdc2m"] Feb 02 11:16:14 crc kubenswrapper[4901]: I0202 11:16:14.485312 4901 scope.go:117] "RemoveContainer" containerID="65a59c51d7c38c38f29b12d765050ddb7486824c2424eebb8a933b9aacbb296a" Feb 02 11:16:14 crc kubenswrapper[4901]: I0202 11:16:14.524080 4901 scope.go:117] "RemoveContainer" containerID="6a3fec7c0a9d3919b2e078655af0c43d7d87e9bc24ac290cde8971f37fd47790" Feb 02 11:16:14 crc kubenswrapper[4901]: E0202 11:16:14.524824 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a3fec7c0a9d3919b2e078655af0c43d7d87e9bc24ac290cde8971f37fd47790\": container with ID starting with 
6a3fec7c0a9d3919b2e078655af0c43d7d87e9bc24ac290cde8971f37fd47790 not found: ID does not exist" containerID="6a3fec7c0a9d3919b2e078655af0c43d7d87e9bc24ac290cde8971f37fd47790" Feb 02 11:16:14 crc kubenswrapper[4901]: I0202 11:16:14.524887 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a3fec7c0a9d3919b2e078655af0c43d7d87e9bc24ac290cde8971f37fd47790"} err="failed to get container status \"6a3fec7c0a9d3919b2e078655af0c43d7d87e9bc24ac290cde8971f37fd47790\": rpc error: code = NotFound desc = could not find container \"6a3fec7c0a9d3919b2e078655af0c43d7d87e9bc24ac290cde8971f37fd47790\": container with ID starting with 6a3fec7c0a9d3919b2e078655af0c43d7d87e9bc24ac290cde8971f37fd47790 not found: ID does not exist" Feb 02 11:16:14 crc kubenswrapper[4901]: I0202 11:16:14.524926 4901 scope.go:117] "RemoveContainer" containerID="dd77142de101fc8f6604f5e161f98209a3973296a8669a4358341e52280bfbd6" Feb 02 11:16:14 crc kubenswrapper[4901]: E0202 11:16:14.525686 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd77142de101fc8f6604f5e161f98209a3973296a8669a4358341e52280bfbd6\": container with ID starting with dd77142de101fc8f6604f5e161f98209a3973296a8669a4358341e52280bfbd6 not found: ID does not exist" containerID="dd77142de101fc8f6604f5e161f98209a3973296a8669a4358341e52280bfbd6" Feb 02 11:16:14 crc kubenswrapper[4901]: I0202 11:16:14.525728 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd77142de101fc8f6604f5e161f98209a3973296a8669a4358341e52280bfbd6"} err="failed to get container status \"dd77142de101fc8f6604f5e161f98209a3973296a8669a4358341e52280bfbd6\": rpc error: code = NotFound desc = could not find container \"dd77142de101fc8f6604f5e161f98209a3973296a8669a4358341e52280bfbd6\": container with ID starting with dd77142de101fc8f6604f5e161f98209a3973296a8669a4358341e52280bfbd6 not found: ID does not exist" Feb 02 11:16:14 crc kubenswrapper[4901]: I0202 11:16:14.525756 4901 scope.go:117] "RemoveContainer" containerID="65a59c51d7c38c38f29b12d765050ddb7486824c2424eebb8a933b9aacbb296a" Feb 02 11:16:14 crc kubenswrapper[4901]: E0202 11:16:14.526140 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65a59c51d7c38c38f29b12d765050ddb7486824c2424eebb8a933b9aacbb296a\": container with ID starting with 65a59c51d7c38c38f29b12d765050ddb7486824c2424eebb8a933b9aacbb296a not found: ID does not exist" containerID="65a59c51d7c38c38f29b12d765050ddb7486824c2424eebb8a933b9aacbb296a" Feb 02 11:16:14 crc kubenswrapper[4901]: I0202 11:16:14.526168 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65a59c51d7c38c38f29b12d765050ddb7486824c2424eebb8a933b9aacbb296a"} err="failed to get container status \"65a59c51d7c38c38f29b12d765050ddb7486824c2424eebb8a933b9aacbb296a\": rpc error: code = NotFound desc = could not find container \"65a59c51d7c38c38f29b12d765050ddb7486824c2424eebb8a933b9aacbb296a\": container with ID starting with 65a59c51d7c38c38f29b12d765050ddb7486824c2424eebb8a933b9aacbb296a not found: ID does not exist" Feb 02 11:16:15 crc kubenswrapper[4901]: I0202 11:16:15.691702 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e67e19f7-23de-4a89-b745-10d47f0d7495" path="/var/lib/kubelet/pods/e67e19f7-23de-4a89-b745-10d47f0d7495/volumes" Feb 02 11:17:37 crc kubenswrapper[4901]: I0202 11:17:37.837689 
4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:17:37 crc kubenswrapper[4901]: I0202 11:17:37.838421 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:18:07 crc kubenswrapper[4901]: I0202 11:18:07.837038 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:18:07 crc kubenswrapper[4901]: I0202 11:18:07.837843 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:18:19 crc kubenswrapper[4901]: I0202 11:18:19.006041 4901 generic.go:334] "Generic (PLEG): container finished" podID="688ef63a-dd78-45da-84ba-f5bc28c6ae81" containerID="e358dbf22fe68ed83917a35f3c77779aaac829c31772bc2d02bc42e25bf0361b" exitCode=0 Feb 02 11:18:19 crc kubenswrapper[4901]: I0202 11:18:19.006143 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ksr7" event={"ID":"688ef63a-dd78-45da-84ba-f5bc28c6ae81","Type":"ContainerDied","Data":"e358dbf22fe68ed83917a35f3c77779aaac829c31772bc2d02bc42e25bf0361b"} Feb 02 11:18:20 crc kubenswrapper[4901]: I0202 11:18:20.533265 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ksr7" Feb 02 11:18:20 crc kubenswrapper[4901]: I0202 11:18:20.685180 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jtb4\" (UniqueName: \"kubernetes.io/projected/688ef63a-dd78-45da-84ba-f5bc28c6ae81-kube-api-access-9jtb4\") pod \"688ef63a-dd78-45da-84ba-f5bc28c6ae81\" (UID: \"688ef63a-dd78-45da-84ba-f5bc28c6ae81\") " Feb 02 11:18:20 crc kubenswrapper[4901]: I0202 11:18:20.685273 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/688ef63a-dd78-45da-84ba-f5bc28c6ae81-inventory\") pod \"688ef63a-dd78-45da-84ba-f5bc28c6ae81\" (UID: \"688ef63a-dd78-45da-84ba-f5bc28c6ae81\") " Feb 02 11:18:20 crc kubenswrapper[4901]: I0202 11:18:20.685316 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/688ef63a-dd78-45da-84ba-f5bc28c6ae81-libvirt-secret-0\") pod \"688ef63a-dd78-45da-84ba-f5bc28c6ae81\" (UID: \"688ef63a-dd78-45da-84ba-f5bc28c6ae81\") " Feb 02 11:18:20 crc kubenswrapper[4901]: I0202 11:18:20.686443 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/688ef63a-dd78-45da-84ba-f5bc28c6ae81-ssh-key-openstack-edpm-ipam\") pod \"688ef63a-dd78-45da-84ba-f5bc28c6ae81\" (UID: \"688ef63a-dd78-45da-84ba-f5bc28c6ae81\") " Feb 02 11:18:20 crc kubenswrapper[4901]: I0202 11:18:20.686506 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688ef63a-dd78-45da-84ba-f5bc28c6ae81-libvirt-combined-ca-bundle\") pod \"688ef63a-dd78-45da-84ba-f5bc28c6ae81\" (UID: \"688ef63a-dd78-45da-84ba-f5bc28c6ae81\") " Feb 02 11:18:20 crc kubenswrapper[4901]: I0202 11:18:20.695497 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/688ef63a-dd78-45da-84ba-f5bc28c6ae81-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "688ef63a-dd78-45da-84ba-f5bc28c6ae81" (UID: "688ef63a-dd78-45da-84ba-f5bc28c6ae81"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:18:20 crc kubenswrapper[4901]: I0202 11:18:20.695550 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/688ef63a-dd78-45da-84ba-f5bc28c6ae81-kube-api-access-9jtb4" (OuterVolumeSpecName: "kube-api-access-9jtb4") pod "688ef63a-dd78-45da-84ba-f5bc28c6ae81" (UID: "688ef63a-dd78-45da-84ba-f5bc28c6ae81"). InnerVolumeSpecName "kube-api-access-9jtb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:18:20 crc kubenswrapper[4901]: I0202 11:18:20.718398 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/688ef63a-dd78-45da-84ba-f5bc28c6ae81-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "688ef63a-dd78-45da-84ba-f5bc28c6ae81" (UID: "688ef63a-dd78-45da-84ba-f5bc28c6ae81"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:18:20 crc kubenswrapper[4901]: I0202 11:18:20.725693 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/688ef63a-dd78-45da-84ba-f5bc28c6ae81-inventory" (OuterVolumeSpecName: "inventory") pod "688ef63a-dd78-45da-84ba-f5bc28c6ae81" (UID: "688ef63a-dd78-45da-84ba-f5bc28c6ae81"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:18:20 crc kubenswrapper[4901]: I0202 11:18:20.735705 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/688ef63a-dd78-45da-84ba-f5bc28c6ae81-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "688ef63a-dd78-45da-84ba-f5bc28c6ae81" (UID: "688ef63a-dd78-45da-84ba-f5bc28c6ae81"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:18:20 crc kubenswrapper[4901]: I0202 11:18:20.792402 4901 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/688ef63a-dd78-45da-84ba-f5bc28c6ae81-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 02 11:18:20 crc kubenswrapper[4901]: I0202 11:18:20.792845 4901 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688ef63a-dd78-45da-84ba-f5bc28c6ae81-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 11:18:20 crc kubenswrapper[4901]: I0202 11:18:20.792980 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jtb4\" (UniqueName: \"kubernetes.io/projected/688ef63a-dd78-45da-84ba-f5bc28c6ae81-kube-api-access-9jtb4\") on node \"crc\" DevicePath \"\"" Feb 02 11:18:20 crc kubenswrapper[4901]: I0202 11:18:20.793104 4901 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/688ef63a-dd78-45da-84ba-f5bc28c6ae81-inventory\") on node \"crc\" DevicePath \"\"" Feb 02 11:18:20 crc kubenswrapper[4901]: I0202 11:18:20.793215 4901 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/688ef63a-dd78-45da-84ba-f5bc28c6ae81-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.037065 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ksr7" event={"ID":"688ef63a-dd78-45da-84ba-f5bc28c6ae81","Type":"ContainerDied","Data":"9212c48483d6c6e3027dd3b458a9c69dbc8d7a65ac24565dca19cb40f8a38fe0"} Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.037610 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9212c48483d6c6e3027dd3b458a9c69dbc8d7a65ac24565dca19cb40f8a38fe0" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.037190 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ksr7" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.163938 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-g4dt6"] Feb 02 11:18:21 crc kubenswrapper[4901]: E0202 11:18:21.164682 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e67e19f7-23de-4a89-b745-10d47f0d7495" containerName="extract-utilities" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.164705 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="e67e19f7-23de-4a89-b745-10d47f0d7495" containerName="extract-utilities" Feb 02 11:18:21 crc kubenswrapper[4901]: E0202 11:18:21.164721 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e67e19f7-23de-4a89-b745-10d47f0d7495" containerName="registry-server" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.164728 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="e67e19f7-23de-4a89-b745-10d47f0d7495" containerName="registry-server" Feb 02 11:18:21 crc kubenswrapper[4901]: E0202 11:18:21.164737 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="688ef63a-dd78-45da-84ba-f5bc28c6ae81" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.164745 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="688ef63a-dd78-45da-84ba-f5bc28c6ae81" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 02 11:18:21 crc kubenswrapper[4901]: E0202 11:18:21.164788 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e67e19f7-23de-4a89-b745-10d47f0d7495" containerName="extract-content" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.164795 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="e67e19f7-23de-4a89-b745-10d47f0d7495" containerName="extract-content" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.165057 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="688ef63a-dd78-45da-84ba-f5bc28c6ae81" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.165163 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="e67e19f7-23de-4a89-b745-10d47f0d7495" containerName="registry-server" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.166392 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4dt6" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.168888 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kblkj" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.169459 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.169802 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.170015 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.170795 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.171021 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.171421 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.178874 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-g4dt6"] Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.207031 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t8wk\" (UniqueName: \"kubernetes.io/projected/9bca60a0-6b7b-4513-bec4-c2aa578b5607-kube-api-access-8t8wk\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4dt6\" (UID: \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4dt6" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.207113 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9bca60a0-6b7b-4513-bec4-c2aa578b5607-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4dt6\" (UID: \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4dt6" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.207149 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9bca60a0-6b7b-4513-bec4-c2aa578b5607-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4dt6\" (UID: \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4dt6" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.207241 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9bca60a0-6b7b-4513-bec4-c2aa578b5607-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4dt6\" (UID: \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4dt6" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.207268 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/9bca60a0-6b7b-4513-bec4-c2aa578b5607-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4dt6\" (UID: \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4dt6" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.207316 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9bca60a0-6b7b-4513-bec4-c2aa578b5607-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4dt6\" (UID: \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4dt6" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.207456 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9bca60a0-6b7b-4513-bec4-c2aa578b5607-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4dt6\" (UID: \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4dt6" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.207610 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bca60a0-6b7b-4513-bec4-c2aa578b5607-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4dt6\" (UID: \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4dt6" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.207764 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9bca60a0-6b7b-4513-bec4-c2aa578b5607-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4dt6\" (UID: \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4dt6" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.308696 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bca60a0-6b7b-4513-bec4-c2aa578b5607-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4dt6\" (UID: \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4dt6" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.308840 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9bca60a0-6b7b-4513-bec4-c2aa578b5607-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4dt6\" (UID: \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4dt6" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.308927 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8t8wk\" (UniqueName: \"kubernetes.io/projected/9bca60a0-6b7b-4513-bec4-c2aa578b5607-kube-api-access-8t8wk\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4dt6\" (UID: \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4dt6" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.308976 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" 
(UniqueName: \"kubernetes.io/secret/9bca60a0-6b7b-4513-bec4-c2aa578b5607-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4dt6\" (UID: \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4dt6" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.309020 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9bca60a0-6b7b-4513-bec4-c2aa578b5607-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4dt6\" (UID: \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4dt6" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.309125 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9bca60a0-6b7b-4513-bec4-c2aa578b5607-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4dt6\" (UID: \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4dt6" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.309181 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9bca60a0-6b7b-4513-bec4-c2aa578b5607-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4dt6\" (UID: \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4dt6" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.309278 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9bca60a0-6b7b-4513-bec4-c2aa578b5607-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4dt6\" (UID: \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4dt6" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.309340 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9bca60a0-6b7b-4513-bec4-c2aa578b5607-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4dt6\" (UID: \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4dt6" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.309955 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9bca60a0-6b7b-4513-bec4-c2aa578b5607-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4dt6\" (UID: \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4dt6" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.317432 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bca60a0-6b7b-4513-bec4-c2aa578b5607-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4dt6\" (UID: \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4dt6" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.317432 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/9bca60a0-6b7b-4513-bec4-c2aa578b5607-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4dt6\" (UID: \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4dt6" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.318662 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9bca60a0-6b7b-4513-bec4-c2aa578b5607-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4dt6\" (UID: \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4dt6" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.318942 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9bca60a0-6b7b-4513-bec4-c2aa578b5607-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4dt6\" (UID: \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4dt6" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.320368 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9bca60a0-6b7b-4513-bec4-c2aa578b5607-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4dt6\" (UID: \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4dt6" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.322217 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9bca60a0-6b7b-4513-bec4-c2aa578b5607-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4dt6\" (UID: \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4dt6" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.325794 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9bca60a0-6b7b-4513-bec4-c2aa578b5607-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4dt6\" (UID: \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4dt6" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.329014 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t8wk\" (UniqueName: \"kubernetes.io/projected/9bca60a0-6b7b-4513-bec4-c2aa578b5607-kube-api-access-8t8wk\") pod \"nova-edpm-deployment-openstack-edpm-ipam-g4dt6\" (UID: \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4dt6" Feb 02 11:18:21 crc kubenswrapper[4901]: I0202 11:18:21.488057 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4dt6" Feb 02 11:18:22 crc kubenswrapper[4901]: I0202 11:18:22.112534 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-g4dt6"] Feb 02 11:18:23 crc kubenswrapper[4901]: I0202 11:18:23.063894 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4dt6" event={"ID":"9bca60a0-6b7b-4513-bec4-c2aa578b5607","Type":"ContainerStarted","Data":"a673426ee283a0a47e4252587d974229d921057bb58acc227f4764686724b5d9"} Feb 02 11:18:24 crc kubenswrapper[4901]: I0202 11:18:24.075306 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4dt6" event={"ID":"9bca60a0-6b7b-4513-bec4-c2aa578b5607","Type":"ContainerStarted","Data":"3157217286b1aa10c141b51bff70f08bdeb37c4b043d65d783ad78ca35ae431e"} Feb 02 11:18:24 crc kubenswrapper[4901]: I0202 11:18:24.097748 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4dt6" podStartSLOduration=2.251108717 podStartE2EDuration="3.097725987s" podCreationTimestamp="2026-02-02 11:18:21 +0000 UTC" firstStartedPulling="2026-02-02 11:18:22.127977033 +0000 UTC m=+2389.146317139" lastFinishedPulling="2026-02-02 11:18:22.974594303 +0000 UTC m=+2389.992934409" observedRunningTime="2026-02-02 11:18:24.092269454 +0000 UTC m=+2391.110609580" watchObservedRunningTime="2026-02-02 11:18:24.097725987 +0000 UTC m=+2391.116066093" Feb 02 11:18:37 crc kubenswrapper[4901]: I0202 11:18:37.837236 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:18:37 crc kubenswrapper[4901]: I0202 11:18:37.838010 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:18:37 crc kubenswrapper[4901]: I0202 11:18:37.838082 4901 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" Feb 02 11:18:37 crc kubenswrapper[4901]: I0202 11:18:37.838987 4901 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f2318f5b8900b7c5448368eef0323ed7f8ee4fc4ecf7912865ae62c7f0cf9d8b"} pod="openshift-machine-config-operator/machine-config-daemon-f29d8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 11:18:37 crc kubenswrapper[4901]: I0202 11:18:37.839053 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" containerID="cri-o://f2318f5b8900b7c5448368eef0323ed7f8ee4fc4ecf7912865ae62c7f0cf9d8b" gracePeriod=600 Feb 02 11:18:37 crc kubenswrapper[4901]: E0202 11:18:37.983751 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:18:38 crc kubenswrapper[4901]: I0202 11:18:38.225345 4901 generic.go:334] "Generic (PLEG): container finished" podID="756c113d-5d5e-424e-bdf5-494b7774def6" containerID="f2318f5b8900b7c5448368eef0323ed7f8ee4fc4ecf7912865ae62c7f0cf9d8b" exitCode=0 Feb 02 11:18:38 crc kubenswrapper[4901]: I0202 11:18:38.225455 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" event={"ID":"756c113d-5d5e-424e-bdf5-494b7774def6","Type":"ContainerDied","Data":"f2318f5b8900b7c5448368eef0323ed7f8ee4fc4ecf7912865ae62c7f0cf9d8b"} Feb 02 11:18:38 crc kubenswrapper[4901]: I0202 11:18:38.225855 4901 scope.go:117] "RemoveContainer" containerID="d821fb60275f1396819dc073d4089937d058e706c3f293a4c0eb260496800e67" Feb 02 11:18:38 crc kubenswrapper[4901]: I0202 11:18:38.226811 4901 scope.go:117] "RemoveContainer" containerID="f2318f5b8900b7c5448368eef0323ed7f8ee4fc4ecf7912865ae62c7f0cf9d8b" Feb 02 11:18:38 crc kubenswrapper[4901]: E0202 11:18:38.227260 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:18:52 crc kubenswrapper[4901]: I0202 11:18:52.678137 4901 scope.go:117] "RemoveContainer" containerID="f2318f5b8900b7c5448368eef0323ed7f8ee4fc4ecf7912865ae62c7f0cf9d8b" Feb 02 11:18:52 crc kubenswrapper[4901]: E0202 11:18:52.679899 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:19:03 crc kubenswrapper[4901]: I0202 11:19:03.685608 4901 scope.go:117] "RemoveContainer" containerID="f2318f5b8900b7c5448368eef0323ed7f8ee4fc4ecf7912865ae62c7f0cf9d8b" Feb 02 11:19:03 crc kubenswrapper[4901]: E0202 11:19:03.686886 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:19:16 crc kubenswrapper[4901]: I0202 11:19:16.677434 4901 scope.go:117] "RemoveContainer" containerID="f2318f5b8900b7c5448368eef0323ed7f8ee4fc4ecf7912865ae62c7f0cf9d8b" Feb 02 11:19:16 crc kubenswrapper[4901]: E0202 11:19:16.678714 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:19:30 crc kubenswrapper[4901]: I0202 11:19:30.678200 4901 scope.go:117] "RemoveContainer" containerID="f2318f5b8900b7c5448368eef0323ed7f8ee4fc4ecf7912865ae62c7f0cf9d8b" Feb 02 11:19:30 crc kubenswrapper[4901]: E0202 11:19:30.679355 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:19:43 crc kubenswrapper[4901]: I0202 11:19:43.684684 4901 scope.go:117] "RemoveContainer" containerID="f2318f5b8900b7c5448368eef0323ed7f8ee4fc4ecf7912865ae62c7f0cf9d8b" Feb 02 11:19:43 crc kubenswrapper[4901]: E0202 11:19:43.685886 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:19:58 crc kubenswrapper[4901]: I0202 11:19:58.677916 4901 scope.go:117] "RemoveContainer" containerID="f2318f5b8900b7c5448368eef0323ed7f8ee4fc4ecf7912865ae62c7f0cf9d8b" Feb 02 11:19:58 crc kubenswrapper[4901]: E0202 11:19:58.679477 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:20:13 crc kubenswrapper[4901]: I0202 11:20:13.685201 4901 scope.go:117] "RemoveContainer" containerID="f2318f5b8900b7c5448368eef0323ed7f8ee4fc4ecf7912865ae62c7f0cf9d8b" Feb 02 11:20:13 crc kubenswrapper[4901]: E0202 11:20:13.686210 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:20:24 crc kubenswrapper[4901]: I0202 11:20:24.677201 4901 scope.go:117] "RemoveContainer" containerID="f2318f5b8900b7c5448368eef0323ed7f8ee4fc4ecf7912865ae62c7f0cf9d8b" Feb 02 11:20:24 crc kubenswrapper[4901]: E0202 11:20:24.678345 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" 
podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:20:36 crc kubenswrapper[4901]: I0202 11:20:36.044370 4901 generic.go:334] "Generic (PLEG): container finished" podID="9bca60a0-6b7b-4513-bec4-c2aa578b5607" containerID="3157217286b1aa10c141b51bff70f08bdeb37c4b043d65d783ad78ca35ae431e" exitCode=0 Feb 02 11:20:36 crc kubenswrapper[4901]: I0202 11:20:36.044433 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4dt6" event={"ID":"9bca60a0-6b7b-4513-bec4-c2aa578b5607","Type":"ContainerDied","Data":"3157217286b1aa10c141b51bff70f08bdeb37c4b043d65d783ad78ca35ae431e"} Feb 02 11:20:37 crc kubenswrapper[4901]: I0202 11:20:37.512791 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4dt6" Feb 02 11:20:37 crc kubenswrapper[4901]: I0202 11:20:37.690257 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9bca60a0-6b7b-4513-bec4-c2aa578b5607-nova-migration-ssh-key-0\") pod \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\" (UID: \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\") " Feb 02 11:20:37 crc kubenswrapper[4901]: I0202 11:20:37.690305 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9bca60a0-6b7b-4513-bec4-c2aa578b5607-nova-cell1-compute-config-1\") pod \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\" (UID: \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\") " Feb 02 11:20:37 crc kubenswrapper[4901]: I0202 11:20:37.690329 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9bca60a0-6b7b-4513-bec4-c2aa578b5607-inventory\") pod \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\" (UID: \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\") " Feb 02 11:20:37 crc kubenswrapper[4901]: I0202 11:20:37.690389 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8t8wk\" (UniqueName: \"kubernetes.io/projected/9bca60a0-6b7b-4513-bec4-c2aa578b5607-kube-api-access-8t8wk\") pod \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\" (UID: \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\") " Feb 02 11:20:37 crc kubenswrapper[4901]: I0202 11:20:37.690410 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9bca60a0-6b7b-4513-bec4-c2aa578b5607-nova-cell1-compute-config-0\") pod \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\" (UID: \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\") " Feb 02 11:20:37 crc kubenswrapper[4901]: I0202 11:20:37.690472 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9bca60a0-6b7b-4513-bec4-c2aa578b5607-ssh-key-openstack-edpm-ipam\") pod \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\" (UID: \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\") " Feb 02 11:20:37 crc kubenswrapper[4901]: I0202 11:20:37.690508 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9bca60a0-6b7b-4513-bec4-c2aa578b5607-nova-extra-config-0\") pod \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\" (UID: \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\") " Feb 02 11:20:37 crc kubenswrapper[4901]: I0202 11:20:37.690525 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bca60a0-6b7b-4513-bec4-c2aa578b5607-nova-combined-ca-bundle\") pod \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\" (UID: \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\") " Feb 02 11:20:37 crc kubenswrapper[4901]: I0202 11:20:37.690593 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9bca60a0-6b7b-4513-bec4-c2aa578b5607-nova-migration-ssh-key-1\") pod \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\" (UID: \"9bca60a0-6b7b-4513-bec4-c2aa578b5607\") " Feb 02 11:20:37 crc kubenswrapper[4901]: I0202 11:20:37.706509 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bca60a0-6b7b-4513-bec4-c2aa578b5607-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "9bca60a0-6b7b-4513-bec4-c2aa578b5607" (UID: "9bca60a0-6b7b-4513-bec4-c2aa578b5607"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:20:37 crc kubenswrapper[4901]: I0202 11:20:37.709706 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bca60a0-6b7b-4513-bec4-c2aa578b5607-kube-api-access-8t8wk" (OuterVolumeSpecName: "kube-api-access-8t8wk") pod "9bca60a0-6b7b-4513-bec4-c2aa578b5607" (UID: "9bca60a0-6b7b-4513-bec4-c2aa578b5607"). InnerVolumeSpecName "kube-api-access-8t8wk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:20:37 crc kubenswrapper[4901]: I0202 11:20:37.727939 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bca60a0-6b7b-4513-bec4-c2aa578b5607-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "9bca60a0-6b7b-4513-bec4-c2aa578b5607" (UID: "9bca60a0-6b7b-4513-bec4-c2aa578b5607"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:20:37 crc kubenswrapper[4901]: I0202 11:20:37.729414 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bca60a0-6b7b-4513-bec4-c2aa578b5607-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "9bca60a0-6b7b-4513-bec4-c2aa578b5607" (UID: "9bca60a0-6b7b-4513-bec4-c2aa578b5607"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:20:37 crc kubenswrapper[4901]: I0202 11:20:37.730324 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bca60a0-6b7b-4513-bec4-c2aa578b5607-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "9bca60a0-6b7b-4513-bec4-c2aa578b5607" (UID: "9bca60a0-6b7b-4513-bec4-c2aa578b5607"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:20:37 crc kubenswrapper[4901]: I0202 11:20:37.734440 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bca60a0-6b7b-4513-bec4-c2aa578b5607-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "9bca60a0-6b7b-4513-bec4-c2aa578b5607" (UID: "9bca60a0-6b7b-4513-bec4-c2aa578b5607"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:20:37 crc kubenswrapper[4901]: I0202 11:20:37.743384 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bca60a0-6b7b-4513-bec4-c2aa578b5607-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "9bca60a0-6b7b-4513-bec4-c2aa578b5607" (UID: "9bca60a0-6b7b-4513-bec4-c2aa578b5607"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:20:37 crc kubenswrapper[4901]: I0202 11:20:37.744215 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bca60a0-6b7b-4513-bec4-c2aa578b5607-inventory" (OuterVolumeSpecName: "inventory") pod "9bca60a0-6b7b-4513-bec4-c2aa578b5607" (UID: "9bca60a0-6b7b-4513-bec4-c2aa578b5607"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:20:37 crc kubenswrapper[4901]: I0202 11:20:37.745402 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bca60a0-6b7b-4513-bec4-c2aa578b5607-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9bca60a0-6b7b-4513-bec4-c2aa578b5607" (UID: "9bca60a0-6b7b-4513-bec4-c2aa578b5607"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:20:37 crc kubenswrapper[4901]: I0202 11:20:37.793184 4901 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9bca60a0-6b7b-4513-bec4-c2aa578b5607-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Feb 02 11:20:37 crc kubenswrapper[4901]: I0202 11:20:37.793232 4901 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9bca60a0-6b7b-4513-bec4-c2aa578b5607-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Feb 02 11:20:37 crc kubenswrapper[4901]: I0202 11:20:37.793243 4901 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9bca60a0-6b7b-4513-bec4-c2aa578b5607-inventory\") on node \"crc\" DevicePath \"\"" Feb 02 11:20:37 crc kubenswrapper[4901]: I0202 11:20:37.793254 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8t8wk\" (UniqueName: \"kubernetes.io/projected/9bca60a0-6b7b-4513-bec4-c2aa578b5607-kube-api-access-8t8wk\") on node \"crc\" DevicePath \"\"" Feb 02 11:20:37 crc kubenswrapper[4901]: I0202 11:20:37.793265 4901 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9bca60a0-6b7b-4513-bec4-c2aa578b5607-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Feb 02 11:20:37 crc kubenswrapper[4901]: I0202 11:20:37.793274 4901 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9bca60a0-6b7b-4513-bec4-c2aa578b5607-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 02 11:20:37 crc kubenswrapper[4901]: I0202 11:20:37.793284 4901 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bca60a0-6b7b-4513-bec4-c2aa578b5607-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 11:20:37 crc kubenswrapper[4901]: I0202 11:20:37.793294 4901 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/9bca60a0-6b7b-4513-bec4-c2aa578b5607-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Feb 02 11:20:37 crc kubenswrapper[4901]: I0202 11:20:37.793304 4901 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9bca60a0-6b7b-4513-bec4-c2aa578b5607-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Feb 02 11:20:38 crc kubenswrapper[4901]: I0202 11:20:38.067597 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4dt6" event={"ID":"9bca60a0-6b7b-4513-bec4-c2aa578b5607","Type":"ContainerDied","Data":"a673426ee283a0a47e4252587d974229d921057bb58acc227f4764686724b5d9"} Feb 02 11:20:38 crc kubenswrapper[4901]: I0202 11:20:38.067661 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-g4dt6" Feb 02 11:20:38 crc kubenswrapper[4901]: I0202 11:20:38.067666 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a673426ee283a0a47e4252587d974229d921057bb58acc227f4764686724b5d9" Feb 02 11:20:38 crc kubenswrapper[4901]: I0202 11:20:38.179192 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7"] Feb 02 11:20:38 crc kubenswrapper[4901]: E0202 11:20:38.180092 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bca60a0-6b7b-4513-bec4-c2aa578b5607" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 02 11:20:38 crc kubenswrapper[4901]: I0202 11:20:38.180124 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bca60a0-6b7b-4513-bec4-c2aa578b5607" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 02 11:20:38 crc kubenswrapper[4901]: I0202 11:20:38.180412 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bca60a0-6b7b-4513-bec4-c2aa578b5607" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 02 11:20:38 crc kubenswrapper[4901]: I0202 11:20:38.181769 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7" Feb 02 11:20:38 crc kubenswrapper[4901]: I0202 11:20:38.188997 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 02 11:20:38 crc kubenswrapper[4901]: I0202 11:20:38.189284 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Feb 02 11:20:38 crc kubenswrapper[4901]: I0202 11:20:38.197498 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kblkj" Feb 02 11:20:38 crc kubenswrapper[4901]: I0202 11:20:38.197931 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 02 11:20:38 crc kubenswrapper[4901]: I0202 11:20:38.198120 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 02 11:20:38 crc kubenswrapper[4901]: I0202 11:20:38.203329 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7"] Feb 02 11:20:38 crc kubenswrapper[4901]: I0202 11:20:38.302946 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2680d00-6499-42bd-ac33-547a56af2392-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7\" (UID: \"e2680d00-6499-42bd-ac33-547a56af2392\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7" Feb 02 11:20:38 crc kubenswrapper[4901]: I0202 11:20:38.303316 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/e2680d00-6499-42bd-ac33-547a56af2392-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7\" (UID: \"e2680d00-6499-42bd-ac33-547a56af2392\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7" Feb 02 11:20:38 crc kubenswrapper[4901]: I0202 11:20:38.303473 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2680d00-6499-42bd-ac33-547a56af2392-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7\" (UID: \"e2680d00-6499-42bd-ac33-547a56af2392\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7" Feb 02 11:20:38 crc kubenswrapper[4901]: I0202 11:20:38.303621 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/e2680d00-6499-42bd-ac33-547a56af2392-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7\" (UID: \"e2680d00-6499-42bd-ac33-547a56af2392\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7" Feb 02 11:20:38 crc kubenswrapper[4901]: I0202 11:20:38.303743 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e2680d00-6499-42bd-ac33-547a56af2392-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7\" (UID: \"e2680d00-6499-42bd-ac33-547a56af2392\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7" Feb 02 
11:20:38 crc kubenswrapper[4901]: I0202 11:20:38.304319 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/e2680d00-6499-42bd-ac33-547a56af2392-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7\" (UID: \"e2680d00-6499-42bd-ac33-547a56af2392\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7" Feb 02 11:20:38 crc kubenswrapper[4901]: I0202 11:20:38.304434 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g6vv\" (UniqueName: \"kubernetes.io/projected/e2680d00-6499-42bd-ac33-547a56af2392-kube-api-access-2g6vv\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7\" (UID: \"e2680d00-6499-42bd-ac33-547a56af2392\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7" Feb 02 11:20:38 crc kubenswrapper[4901]: I0202 11:20:38.407685 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2680d00-6499-42bd-ac33-547a56af2392-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7\" (UID: \"e2680d00-6499-42bd-ac33-547a56af2392\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7" Feb 02 11:20:38 crc kubenswrapper[4901]: I0202 11:20:38.407770 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/e2680d00-6499-42bd-ac33-547a56af2392-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7\" (UID: \"e2680d00-6499-42bd-ac33-547a56af2392\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7" Feb 02 11:20:38 crc kubenswrapper[4901]: I0202 11:20:38.407806 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2680d00-6499-42bd-ac33-547a56af2392-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7\" (UID: \"e2680d00-6499-42bd-ac33-547a56af2392\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7" Feb 02 11:20:38 crc kubenswrapper[4901]: I0202 11:20:38.407851 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/e2680d00-6499-42bd-ac33-547a56af2392-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7\" (UID: \"e2680d00-6499-42bd-ac33-547a56af2392\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7" Feb 02 11:20:38 crc kubenswrapper[4901]: I0202 11:20:38.407911 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e2680d00-6499-42bd-ac33-547a56af2392-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7\" (UID: \"e2680d00-6499-42bd-ac33-547a56af2392\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7" Feb 02 11:20:38 crc kubenswrapper[4901]: I0202 11:20:38.408010 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/e2680d00-6499-42bd-ac33-547a56af2392-ceilometer-compute-config-data-2\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7\" (UID: \"e2680d00-6499-42bd-ac33-547a56af2392\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7" Feb 02 11:20:38 crc kubenswrapper[4901]: I0202 11:20:38.408039 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g6vv\" (UniqueName: \"kubernetes.io/projected/e2680d00-6499-42bd-ac33-547a56af2392-kube-api-access-2g6vv\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7\" (UID: \"e2680d00-6499-42bd-ac33-547a56af2392\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7" Feb 02 11:20:38 crc kubenswrapper[4901]: I0202 11:20:38.414864 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2680d00-6499-42bd-ac33-547a56af2392-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7\" (UID: \"e2680d00-6499-42bd-ac33-547a56af2392\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7" Feb 02 11:20:38 crc kubenswrapper[4901]: I0202 11:20:38.417868 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/e2680d00-6499-42bd-ac33-547a56af2392-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7\" (UID: \"e2680d00-6499-42bd-ac33-547a56af2392\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7" Feb 02 11:20:38 crc kubenswrapper[4901]: I0202 11:20:38.418958 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e2680d00-6499-42bd-ac33-547a56af2392-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7\" (UID: \"e2680d00-6499-42bd-ac33-547a56af2392\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7" Feb 02 11:20:38 crc kubenswrapper[4901]: I0202 11:20:38.423432 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2680d00-6499-42bd-ac33-547a56af2392-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7\" (UID: \"e2680d00-6499-42bd-ac33-547a56af2392\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7" Feb 02 11:20:38 crc kubenswrapper[4901]: I0202 11:20:38.433631 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g6vv\" (UniqueName: \"kubernetes.io/projected/e2680d00-6499-42bd-ac33-547a56af2392-kube-api-access-2g6vv\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7\" (UID: \"e2680d00-6499-42bd-ac33-547a56af2392\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7" Feb 02 11:20:38 crc kubenswrapper[4901]: I0202 11:20:38.433745 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/e2680d00-6499-42bd-ac33-547a56af2392-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7\" (UID: \"e2680d00-6499-42bd-ac33-547a56af2392\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7" Feb 02 11:20:38 crc kubenswrapper[4901]: I0202 11:20:38.442834 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: 
\"kubernetes.io/secret/e2680d00-6499-42bd-ac33-547a56af2392-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7\" (UID: \"e2680d00-6499-42bd-ac33-547a56af2392\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7" Feb 02 11:20:38 crc kubenswrapper[4901]: I0202 11:20:38.512580 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7" Feb 02 11:20:38 crc kubenswrapper[4901]: I0202 11:20:38.676849 4901 scope.go:117] "RemoveContainer" containerID="f2318f5b8900b7c5448368eef0323ed7f8ee4fc4ecf7912865ae62c7f0cf9d8b" Feb 02 11:20:38 crc kubenswrapper[4901]: E0202 11:20:38.677107 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:20:39 crc kubenswrapper[4901]: I0202 11:20:39.052597 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7"] Feb 02 11:20:39 crc kubenswrapper[4901]: I0202 11:20:39.061443 4901 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 11:20:39 crc kubenswrapper[4901]: I0202 11:20:39.076196 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7" event={"ID":"e2680d00-6499-42bd-ac33-547a56af2392","Type":"ContainerStarted","Data":"959f9233f60b799f602938990a02181414d7a52074eba35070ed2222beaf989c"} Feb 02 11:20:40 crc kubenswrapper[4901]: I0202 11:20:40.108883 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7" event={"ID":"e2680d00-6499-42bd-ac33-547a56af2392","Type":"ContainerStarted","Data":"5a1690710cdff11dde4ba9e54b6c9859b850ae23e103d4a04f299a16d3e5f7c5"} Feb 02 11:20:40 crc kubenswrapper[4901]: I0202 11:20:40.139084 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7" podStartSLOduration=1.689976009 podStartE2EDuration="2.139057763s" podCreationTimestamp="2026-02-02 11:20:38 +0000 UTC" firstStartedPulling="2026-02-02 11:20:39.06124784 +0000 UTC m=+2526.079587936" lastFinishedPulling="2026-02-02 11:20:39.510329594 +0000 UTC m=+2526.528669690" observedRunningTime="2026-02-02 11:20:40.136320646 +0000 UTC m=+2527.154660752" watchObservedRunningTime="2026-02-02 11:20:40.139057763 +0000 UTC m=+2527.157397859" Feb 02 11:20:51 crc kubenswrapper[4901]: I0202 11:20:51.684002 4901 scope.go:117] "RemoveContainer" containerID="f2318f5b8900b7c5448368eef0323ed7f8ee4fc4ecf7912865ae62c7f0cf9d8b" Feb 02 11:20:51 crc kubenswrapper[4901]: E0202 11:20:51.685053 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:21:03 crc kubenswrapper[4901]: I0202 
11:21:03.684867 4901 scope.go:117] "RemoveContainer" containerID="f2318f5b8900b7c5448368eef0323ed7f8ee4fc4ecf7912865ae62c7f0cf9d8b" Feb 02 11:21:03 crc kubenswrapper[4901]: E0202 11:21:03.686194 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:21:15 crc kubenswrapper[4901]: I0202 11:21:15.676711 4901 scope.go:117] "RemoveContainer" containerID="f2318f5b8900b7c5448368eef0323ed7f8ee4fc4ecf7912865ae62c7f0cf9d8b" Feb 02 11:21:15 crc kubenswrapper[4901]: E0202 11:21:15.677804 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:21:28 crc kubenswrapper[4901]: I0202 11:21:28.677369 4901 scope.go:117] "RemoveContainer" containerID="f2318f5b8900b7c5448368eef0323ed7f8ee4fc4ecf7912865ae62c7f0cf9d8b" Feb 02 11:21:28 crc kubenswrapper[4901]: E0202 11:21:28.678289 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:21:42 crc kubenswrapper[4901]: I0202 11:21:42.677529 4901 scope.go:117] "RemoveContainer" containerID="f2318f5b8900b7c5448368eef0323ed7f8ee4fc4ecf7912865ae62c7f0cf9d8b" Feb 02 11:21:42 crc kubenswrapper[4901]: E0202 11:21:42.678874 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:21:55 crc kubenswrapper[4901]: I0202 11:21:55.676984 4901 scope.go:117] "RemoveContainer" containerID="f2318f5b8900b7c5448368eef0323ed7f8ee4fc4ecf7912865ae62c7f0cf9d8b" Feb 02 11:21:55 crc kubenswrapper[4901]: E0202 11:21:55.678170 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:22:06 crc kubenswrapper[4901]: I0202 11:22:06.678145 4901 scope.go:117] "RemoveContainer" containerID="f2318f5b8900b7c5448368eef0323ed7f8ee4fc4ecf7912865ae62c7f0cf9d8b" Feb 02 11:22:06 crc kubenswrapper[4901]: E0202 11:22:06.679293 
4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:22:21 crc kubenswrapper[4901]: I0202 11:22:21.676960 4901 scope.go:117] "RemoveContainer" containerID="f2318f5b8900b7c5448368eef0323ed7f8ee4fc4ecf7912865ae62c7f0cf9d8b" Feb 02 11:22:21 crc kubenswrapper[4901]: E0202 11:22:21.678053 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:22:35 crc kubenswrapper[4901]: I0202 11:22:35.676650 4901 scope.go:117] "RemoveContainer" containerID="f2318f5b8900b7c5448368eef0323ed7f8ee4fc4ecf7912865ae62c7f0cf9d8b" Feb 02 11:22:35 crc kubenswrapper[4901]: E0202 11:22:35.677727 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:22:49 crc kubenswrapper[4901]: I0202 11:22:49.676646 4901 scope.go:117] "RemoveContainer" containerID="f2318f5b8900b7c5448368eef0323ed7f8ee4fc4ecf7912865ae62c7f0cf9d8b" Feb 02 11:22:49 crc kubenswrapper[4901]: E0202 11:22:49.677530 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:22:57 crc kubenswrapper[4901]: I0202 11:22:57.584942 4901 generic.go:334] "Generic (PLEG): container finished" podID="e2680d00-6499-42bd-ac33-547a56af2392" containerID="5a1690710cdff11dde4ba9e54b6c9859b850ae23e103d4a04f299a16d3e5f7c5" exitCode=0 Feb 02 11:22:57 crc kubenswrapper[4901]: I0202 11:22:57.584983 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7" event={"ID":"e2680d00-6499-42bd-ac33-547a56af2392","Type":"ContainerDied","Data":"5a1690710cdff11dde4ba9e54b6c9859b850ae23e103d4a04f299a16d3e5f7c5"} Feb 02 11:22:59 crc kubenswrapper[4901]: I0202 11:22:59.024451 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7" Feb 02 11:22:59 crc kubenswrapper[4901]: I0202 11:22:59.202055 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/e2680d00-6499-42bd-ac33-547a56af2392-ceilometer-compute-config-data-1\") pod \"e2680d00-6499-42bd-ac33-547a56af2392\" (UID: \"e2680d00-6499-42bd-ac33-547a56af2392\") " Feb 02 11:22:59 crc kubenswrapper[4901]: I0202 11:22:59.202527 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2680d00-6499-42bd-ac33-547a56af2392-inventory\") pod \"e2680d00-6499-42bd-ac33-547a56af2392\" (UID: \"e2680d00-6499-42bd-ac33-547a56af2392\") " Feb 02 11:22:59 crc kubenswrapper[4901]: I0202 11:22:59.202666 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/e2680d00-6499-42bd-ac33-547a56af2392-ceilometer-compute-config-data-0\") pod \"e2680d00-6499-42bd-ac33-547a56af2392\" (UID: \"e2680d00-6499-42bd-ac33-547a56af2392\") " Feb 02 11:22:59 crc kubenswrapper[4901]: I0202 11:22:59.202738 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/e2680d00-6499-42bd-ac33-547a56af2392-ceilometer-compute-config-data-2\") pod \"e2680d00-6499-42bd-ac33-547a56af2392\" (UID: \"e2680d00-6499-42bd-ac33-547a56af2392\") " Feb 02 11:22:59 crc kubenswrapper[4901]: I0202 11:22:59.202809 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2680d00-6499-42bd-ac33-547a56af2392-telemetry-combined-ca-bundle\") pod \"e2680d00-6499-42bd-ac33-547a56af2392\" (UID: \"e2680d00-6499-42bd-ac33-547a56af2392\") " Feb 02 11:22:59 crc kubenswrapper[4901]: I0202 11:22:59.203108 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e2680d00-6499-42bd-ac33-547a56af2392-ssh-key-openstack-edpm-ipam\") pod \"e2680d00-6499-42bd-ac33-547a56af2392\" (UID: \"e2680d00-6499-42bd-ac33-547a56af2392\") " Feb 02 11:22:59 crc kubenswrapper[4901]: I0202 11:22:59.203151 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2g6vv\" (UniqueName: \"kubernetes.io/projected/e2680d00-6499-42bd-ac33-547a56af2392-kube-api-access-2g6vv\") pod \"e2680d00-6499-42bd-ac33-547a56af2392\" (UID: \"e2680d00-6499-42bd-ac33-547a56af2392\") " Feb 02 11:22:59 crc kubenswrapper[4901]: I0202 11:22:59.214848 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2680d00-6499-42bd-ac33-547a56af2392-kube-api-access-2g6vv" (OuterVolumeSpecName: "kube-api-access-2g6vv") pod "e2680d00-6499-42bd-ac33-547a56af2392" (UID: "e2680d00-6499-42bd-ac33-547a56af2392"). InnerVolumeSpecName "kube-api-access-2g6vv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:22:59 crc kubenswrapper[4901]: I0202 11:22:59.225348 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2680d00-6499-42bd-ac33-547a56af2392-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "e2680d00-6499-42bd-ac33-547a56af2392" (UID: "e2680d00-6499-42bd-ac33-547a56af2392"). 
InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:22:59 crc kubenswrapper[4901]: I0202 11:22:59.239631 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2680d00-6499-42bd-ac33-547a56af2392-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "e2680d00-6499-42bd-ac33-547a56af2392" (UID: "e2680d00-6499-42bd-ac33-547a56af2392"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:22:59 crc kubenswrapper[4901]: I0202 11:22:59.242192 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2680d00-6499-42bd-ac33-547a56af2392-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "e2680d00-6499-42bd-ac33-547a56af2392" (UID: "e2680d00-6499-42bd-ac33-547a56af2392"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:22:59 crc kubenswrapper[4901]: I0202 11:22:59.243540 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2680d00-6499-42bd-ac33-547a56af2392-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e2680d00-6499-42bd-ac33-547a56af2392" (UID: "e2680d00-6499-42bd-ac33-547a56af2392"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:22:59 crc kubenswrapper[4901]: I0202 11:22:59.245004 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2680d00-6499-42bd-ac33-547a56af2392-inventory" (OuterVolumeSpecName: "inventory") pod "e2680d00-6499-42bd-ac33-547a56af2392" (UID: "e2680d00-6499-42bd-ac33-547a56af2392"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:22:59 crc kubenswrapper[4901]: I0202 11:22:59.253768 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2680d00-6499-42bd-ac33-547a56af2392-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "e2680d00-6499-42bd-ac33-547a56af2392" (UID: "e2680d00-6499-42bd-ac33-547a56af2392"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:22:59 crc kubenswrapper[4901]: I0202 11:22:59.306167 4901 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e2680d00-6499-42bd-ac33-547a56af2392-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 02 11:22:59 crc kubenswrapper[4901]: I0202 11:22:59.306214 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2g6vv\" (UniqueName: \"kubernetes.io/projected/e2680d00-6499-42bd-ac33-547a56af2392-kube-api-access-2g6vv\") on node \"crc\" DevicePath \"\"" Feb 02 11:22:59 crc kubenswrapper[4901]: I0202 11:22:59.306224 4901 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/e2680d00-6499-42bd-ac33-547a56af2392-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Feb 02 11:22:59 crc kubenswrapper[4901]: I0202 11:22:59.306236 4901 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2680d00-6499-42bd-ac33-547a56af2392-inventory\") on node \"crc\" DevicePath \"\"" Feb 02 11:22:59 crc kubenswrapper[4901]: I0202 11:22:59.306246 4901 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/e2680d00-6499-42bd-ac33-547a56af2392-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Feb 02 11:22:59 crc kubenswrapper[4901]: I0202 11:22:59.306260 4901 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/e2680d00-6499-42bd-ac33-547a56af2392-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Feb 02 11:22:59 crc kubenswrapper[4901]: I0202 11:22:59.306269 4901 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2680d00-6499-42bd-ac33-547a56af2392-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 11:22:59 crc kubenswrapper[4901]: I0202 11:22:59.608596 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7" event={"ID":"e2680d00-6499-42bd-ac33-547a56af2392","Type":"ContainerDied","Data":"959f9233f60b799f602938990a02181414d7a52074eba35070ed2222beaf989c"} Feb 02 11:22:59 crc kubenswrapper[4901]: I0202 11:22:59.608662 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="959f9233f60b799f602938990a02181414d7a52074eba35070ed2222beaf989c" Feb 02 11:22:59 crc kubenswrapper[4901]: I0202 11:22:59.608747 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7" Feb 02 11:23:04 crc kubenswrapper[4901]: I0202 11:23:04.677132 4901 scope.go:117] "RemoveContainer" containerID="f2318f5b8900b7c5448368eef0323ed7f8ee4fc4ecf7912865ae62c7f0cf9d8b" Feb 02 11:23:04 crc kubenswrapper[4901]: E0202 11:23:04.678057 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:23:18 crc kubenswrapper[4901]: I0202 11:23:18.677082 4901 scope.go:117] "RemoveContainer" containerID="f2318f5b8900b7c5448368eef0323ed7f8ee4fc4ecf7912865ae62c7f0cf9d8b" Feb 02 11:23:18 crc kubenswrapper[4901]: E0202 11:23:18.678203 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:23:30 crc kubenswrapper[4901]: I0202 11:23:30.677467 4901 scope.go:117] "RemoveContainer" containerID="f2318f5b8900b7c5448368eef0323ed7f8ee4fc4ecf7912865ae62c7f0cf9d8b" Feb 02 11:23:30 crc kubenswrapper[4901]: E0202 11:23:30.678373 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:23:44 crc kubenswrapper[4901]: I0202 11:23:44.676730 4901 scope.go:117] "RemoveContainer" containerID="f2318f5b8900b7c5448368eef0323ed7f8ee4fc4ecf7912865ae62c7f0cf9d8b" Feb 02 11:23:45 crc kubenswrapper[4901]: I0202 11:23:45.098827 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" event={"ID":"756c113d-5d5e-424e-bdf5-494b7774def6","Type":"ContainerStarted","Data":"39f6324e3b109ef91ef62ed5dcfb577f7442c2d5c540e6157611157765dd3e2f"} Feb 02 11:25:56 crc kubenswrapper[4901]: I0202 11:25:56.564464 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-84dbcd4d6-strlk_9c2b3b8f-1088-4696-899e-d0d5c3f2bf2d/manager/0.log" Feb 02 11:25:58 crc kubenswrapper[4901]: I0202 11:25:58.583223 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Feb 02 11:25:58 crc kubenswrapper[4901]: I0202 11:25:58.584030 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="707985af-5416-42c1-9fbf-866955d8d1c4" containerName="openstackclient" containerID="cri-o://d89cb421225e91a6f0382185c78f3723f4c99cb8051531176e1ef9318a5f949b" gracePeriod=2 Feb 02 11:25:58 crc kubenswrapper[4901]: I0202 11:25:58.593996 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Feb 02 11:25:58 crc 
kubenswrapper[4901]: I0202 11:25:58.626061 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 02 11:25:58 crc kubenswrapper[4901]: E0202 11:25:58.627998 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2680d00-6499-42bd-ac33-547a56af2392" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 02 11:25:58 crc kubenswrapper[4901]: I0202 11:25:58.628025 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2680d00-6499-42bd-ac33-547a56af2392" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 02 11:25:58 crc kubenswrapper[4901]: E0202 11:25:58.628051 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="707985af-5416-42c1-9fbf-866955d8d1c4" containerName="openstackclient" Feb 02 11:25:58 crc kubenswrapper[4901]: I0202 11:25:58.628064 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="707985af-5416-42c1-9fbf-866955d8d1c4" containerName="openstackclient" Feb 02 11:25:58 crc kubenswrapper[4901]: I0202 11:25:58.628357 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="707985af-5416-42c1-9fbf-866955d8d1c4" containerName="openstackclient" Feb 02 11:25:58 crc kubenswrapper[4901]: I0202 11:25:58.628408 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2680d00-6499-42bd-ac33-547a56af2392" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 02 11:25:58 crc kubenswrapper[4901]: I0202 11:25:58.629508 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 02 11:25:58 crc kubenswrapper[4901]: I0202 11:25:58.637916 4901 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="707985af-5416-42c1-9fbf-866955d8d1c4" podUID="ab379047-35d7-4cd8-b64c-bf91cf2e25b7" Feb 02 11:25:58 crc kubenswrapper[4901]: I0202 11:25:58.645301 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 02 11:25:58 crc kubenswrapper[4901]: I0202 11:25:58.758858 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ab379047-35d7-4cd8-b64c-bf91cf2e25b7-openstack-config\") pod \"openstackclient\" (UID: \"ab379047-35d7-4cd8-b64c-bf91cf2e25b7\") " pod="openstack/openstackclient" Feb 02 11:25:58 crc kubenswrapper[4901]: I0202 11:25:58.759269 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ab379047-35d7-4cd8-b64c-bf91cf2e25b7-openstack-config-secret\") pod \"openstackclient\" (UID: \"ab379047-35d7-4cd8-b64c-bf91cf2e25b7\") " pod="openstack/openstackclient" Feb 02 11:25:58 crc kubenswrapper[4901]: I0202 11:25:58.759414 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7ctt\" (UniqueName: \"kubernetes.io/projected/ab379047-35d7-4cd8-b64c-bf91cf2e25b7-kube-api-access-h7ctt\") pod \"openstackclient\" (UID: \"ab379047-35d7-4cd8-b64c-bf91cf2e25b7\") " pod="openstack/openstackclient" Feb 02 11:25:58 crc kubenswrapper[4901]: I0202 11:25:58.759962 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab379047-35d7-4cd8-b64c-bf91cf2e25b7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ab379047-35d7-4cd8-b64c-bf91cf2e25b7\") " 
pod="openstack/openstackclient" Feb 02 11:25:58 crc kubenswrapper[4901]: I0202 11:25:58.863298 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ab379047-35d7-4cd8-b64c-bf91cf2e25b7-openstack-config-secret\") pod \"openstackclient\" (UID: \"ab379047-35d7-4cd8-b64c-bf91cf2e25b7\") " pod="openstack/openstackclient" Feb 02 11:25:58 crc kubenswrapper[4901]: I0202 11:25:58.863439 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7ctt\" (UniqueName: \"kubernetes.io/projected/ab379047-35d7-4cd8-b64c-bf91cf2e25b7-kube-api-access-h7ctt\") pod \"openstackclient\" (UID: \"ab379047-35d7-4cd8-b64c-bf91cf2e25b7\") " pod="openstack/openstackclient" Feb 02 11:25:58 crc kubenswrapper[4901]: I0202 11:25:58.863893 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab379047-35d7-4cd8-b64c-bf91cf2e25b7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ab379047-35d7-4cd8-b64c-bf91cf2e25b7\") " pod="openstack/openstackclient" Feb 02 11:25:58 crc kubenswrapper[4901]: I0202 11:25:58.864193 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ab379047-35d7-4cd8-b64c-bf91cf2e25b7-openstack-config\") pod \"openstackclient\" (UID: \"ab379047-35d7-4cd8-b64c-bf91cf2e25b7\") " pod="openstack/openstackclient" Feb 02 11:25:58 crc kubenswrapper[4901]: I0202 11:25:58.866063 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ab379047-35d7-4cd8-b64c-bf91cf2e25b7-openstack-config\") pod \"openstackclient\" (UID: \"ab379047-35d7-4cd8-b64c-bf91cf2e25b7\") " pod="openstack/openstackclient" Feb 02 11:25:58 crc kubenswrapper[4901]: I0202 11:25:58.873682 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ab379047-35d7-4cd8-b64c-bf91cf2e25b7-openstack-config-secret\") pod \"openstackclient\" (UID: \"ab379047-35d7-4cd8-b64c-bf91cf2e25b7\") " pod="openstack/openstackclient" Feb 02 11:25:58 crc kubenswrapper[4901]: I0202 11:25:58.878038 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab379047-35d7-4cd8-b64c-bf91cf2e25b7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ab379047-35d7-4cd8-b64c-bf91cf2e25b7\") " pod="openstack/openstackclient" Feb 02 11:25:58 crc kubenswrapper[4901]: I0202 11:25:58.892733 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7ctt\" (UniqueName: \"kubernetes.io/projected/ab379047-35d7-4cd8-b64c-bf91cf2e25b7-kube-api-access-h7ctt\") pod \"openstackclient\" (UID: \"ab379047-35d7-4cd8-b64c-bf91cf2e25b7\") " pod="openstack/openstackclient" Feb 02 11:25:58 crc kubenswrapper[4901]: I0202 11:25:58.960236 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 02 11:25:59 crc kubenswrapper[4901]: I0202 11:25:59.559968 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 02 11:26:00 crc kubenswrapper[4901]: I0202 11:25:59.993961 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-2dm7v"] Feb 02 11:26:00 crc kubenswrapper[4901]: I0202 11:25:59.997193 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-2dm7v" Feb 02 11:26:00 crc kubenswrapper[4901]: I0202 11:26:00.063958 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-2dm7v"] Feb 02 11:26:00 crc kubenswrapper[4901]: I0202 11:26:00.101861 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c9b075b-311e-4ff0-a0c3-24e65adb3cf7-operator-scripts\") pod \"aodh-db-create-2dm7v\" (UID: \"0c9b075b-311e-4ff0-a0c3-24e65adb3cf7\") " pod="openstack/aodh-db-create-2dm7v" Feb 02 11:26:00 crc kubenswrapper[4901]: I0202 11:26:00.102025 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbdqw\" (UniqueName: \"kubernetes.io/projected/0c9b075b-311e-4ff0-a0c3-24e65adb3cf7-kube-api-access-sbdqw\") pod \"aodh-db-create-2dm7v\" (UID: \"0c9b075b-311e-4ff0-a0c3-24e65adb3cf7\") " pod="openstack/aodh-db-create-2dm7v" Feb 02 11:26:00 crc kubenswrapper[4901]: I0202 11:26:00.149876 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fvlzg"] Feb 02 11:26:00 crc kubenswrapper[4901]: I0202 11:26:00.152268 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fvlzg" Feb 02 11:26:00 crc kubenswrapper[4901]: I0202 11:26:00.181643 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fvlzg"] Feb 02 11:26:00 crc kubenswrapper[4901]: I0202 11:26:00.204117 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbdqw\" (UniqueName: \"kubernetes.io/projected/0c9b075b-311e-4ff0-a0c3-24e65adb3cf7-kube-api-access-sbdqw\") pod \"aodh-db-create-2dm7v\" (UID: \"0c9b075b-311e-4ff0-a0c3-24e65adb3cf7\") " pod="openstack/aodh-db-create-2dm7v" Feb 02 11:26:00 crc kubenswrapper[4901]: I0202 11:26:00.204251 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c9b075b-311e-4ff0-a0c3-24e65adb3cf7-operator-scripts\") pod \"aodh-db-create-2dm7v\" (UID: \"0c9b075b-311e-4ff0-a0c3-24e65adb3cf7\") " pod="openstack/aodh-db-create-2dm7v" Feb 02 11:26:00 crc kubenswrapper[4901]: I0202 11:26:00.205183 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c9b075b-311e-4ff0-a0c3-24e65adb3cf7-operator-scripts\") pod \"aodh-db-create-2dm7v\" (UID: \"0c9b075b-311e-4ff0-a0c3-24e65adb3cf7\") " pod="openstack/aodh-db-create-2dm7v" Feb 02 11:26:00 crc kubenswrapper[4901]: I0202 11:26:00.233491 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbdqw\" (UniqueName: \"kubernetes.io/projected/0c9b075b-311e-4ff0-a0c3-24e65adb3cf7-kube-api-access-sbdqw\") pod \"aodh-db-create-2dm7v\" (UID: \"0c9b075b-311e-4ff0-a0c3-24e65adb3cf7\") " pod="openstack/aodh-db-create-2dm7v" Feb 02 11:26:00 crc kubenswrapper[4901]: I0202 11:26:00.302764 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-5f08-account-create-update-vqqf2"] Feb 02 11:26:00 crc kubenswrapper[4901]: I0202 11:26:00.304745 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-5f08-account-create-update-vqqf2" Feb 02 11:26:00 crc kubenswrapper[4901]: I0202 11:26:00.306682 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea33ab9d-d4b9-4609-9366-56d1b9897a7e-catalog-content\") pod \"certified-operators-fvlzg\" (UID: \"ea33ab9d-d4b9-4609-9366-56d1b9897a7e\") " pod="openshift-marketplace/certified-operators-fvlzg" Feb 02 11:26:00 crc kubenswrapper[4901]: I0202 11:26:00.306803 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea33ab9d-d4b9-4609-9366-56d1b9897a7e-utilities\") pod \"certified-operators-fvlzg\" (UID: \"ea33ab9d-d4b9-4609-9366-56d1b9897a7e\") " pod="openshift-marketplace/certified-operators-fvlzg" Feb 02 11:26:00 crc kubenswrapper[4901]: I0202 11:26:00.306831 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvtnp\" (UniqueName: \"kubernetes.io/projected/ea33ab9d-d4b9-4609-9366-56d1b9897a7e-kube-api-access-wvtnp\") pod \"certified-operators-fvlzg\" (UID: \"ea33ab9d-d4b9-4609-9366-56d1b9897a7e\") " pod="openshift-marketplace/certified-operators-fvlzg" Feb 02 11:26:00 crc kubenswrapper[4901]: I0202 11:26:00.309111 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Feb 02 11:26:00 crc kubenswrapper[4901]: I0202 11:26:00.316084 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-5f08-account-create-update-vqqf2"] Feb 02 11:26:00 crc kubenswrapper[4901]: I0202 11:26:00.342219 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-2dm7v" Feb 02 11:26:00 crc kubenswrapper[4901]: I0202 11:26:00.408634 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v8bs\" (UniqueName: \"kubernetes.io/projected/c8b241db-5cd0-4121-a48e-64875cfcf4f0-kube-api-access-2v8bs\") pod \"aodh-5f08-account-create-update-vqqf2\" (UID: \"c8b241db-5cd0-4121-a48e-64875cfcf4f0\") " pod="openstack/aodh-5f08-account-create-update-vqqf2" Feb 02 11:26:00 crc kubenswrapper[4901]: I0202 11:26:00.408716 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea33ab9d-d4b9-4609-9366-56d1b9897a7e-catalog-content\") pod \"certified-operators-fvlzg\" (UID: \"ea33ab9d-d4b9-4609-9366-56d1b9897a7e\") " pod="openshift-marketplace/certified-operators-fvlzg" Feb 02 11:26:00 crc kubenswrapper[4901]: I0202 11:26:00.408765 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8b241db-5cd0-4121-a48e-64875cfcf4f0-operator-scripts\") pod \"aodh-5f08-account-create-update-vqqf2\" (UID: \"c8b241db-5cd0-4121-a48e-64875cfcf4f0\") " pod="openstack/aodh-5f08-account-create-update-vqqf2" Feb 02 11:26:00 crc kubenswrapper[4901]: I0202 11:26:00.408792 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea33ab9d-d4b9-4609-9366-56d1b9897a7e-utilities\") pod \"certified-operators-fvlzg\" (UID: \"ea33ab9d-d4b9-4609-9366-56d1b9897a7e\") " pod="openshift-marketplace/certified-operators-fvlzg" Feb 02 11:26:00 crc kubenswrapper[4901]: I0202 11:26:00.408813 4901 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvtnp\" (UniqueName: \"kubernetes.io/projected/ea33ab9d-d4b9-4609-9366-56d1b9897a7e-kube-api-access-wvtnp\") pod \"certified-operators-fvlzg\" (UID: \"ea33ab9d-d4b9-4609-9366-56d1b9897a7e\") " pod="openshift-marketplace/certified-operators-fvlzg" Feb 02 11:26:00 crc kubenswrapper[4901]: I0202 11:26:00.409784 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea33ab9d-d4b9-4609-9366-56d1b9897a7e-catalog-content\") pod \"certified-operators-fvlzg\" (UID: \"ea33ab9d-d4b9-4609-9366-56d1b9897a7e\") " pod="openshift-marketplace/certified-operators-fvlzg" Feb 02 11:26:00 crc kubenswrapper[4901]: I0202 11:26:00.410384 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea33ab9d-d4b9-4609-9366-56d1b9897a7e-utilities\") pod \"certified-operators-fvlzg\" (UID: \"ea33ab9d-d4b9-4609-9366-56d1b9897a7e\") " pod="openshift-marketplace/certified-operators-fvlzg" Feb 02 11:26:00 crc kubenswrapper[4901]: I0202 11:26:00.431189 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvtnp\" (UniqueName: \"kubernetes.io/projected/ea33ab9d-d4b9-4609-9366-56d1b9897a7e-kube-api-access-wvtnp\") pod \"certified-operators-fvlzg\" (UID: \"ea33ab9d-d4b9-4609-9366-56d1b9897a7e\") " pod="openshift-marketplace/certified-operators-fvlzg" Feb 02 11:26:00 crc kubenswrapper[4901]: I0202 11:26:00.473853 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fvlzg" Feb 02 11:26:00 crc kubenswrapper[4901]: I0202 11:26:00.512803 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2v8bs\" (UniqueName: \"kubernetes.io/projected/c8b241db-5cd0-4121-a48e-64875cfcf4f0-kube-api-access-2v8bs\") pod \"aodh-5f08-account-create-update-vqqf2\" (UID: \"c8b241db-5cd0-4121-a48e-64875cfcf4f0\") " pod="openstack/aodh-5f08-account-create-update-vqqf2" Feb 02 11:26:00 crc kubenswrapper[4901]: I0202 11:26:00.512936 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8b241db-5cd0-4121-a48e-64875cfcf4f0-operator-scripts\") pod \"aodh-5f08-account-create-update-vqqf2\" (UID: \"c8b241db-5cd0-4121-a48e-64875cfcf4f0\") " pod="openstack/aodh-5f08-account-create-update-vqqf2" Feb 02 11:26:00 crc kubenswrapper[4901]: I0202 11:26:00.517704 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8b241db-5cd0-4121-a48e-64875cfcf4f0-operator-scripts\") pod \"aodh-5f08-account-create-update-vqqf2\" (UID: \"c8b241db-5cd0-4121-a48e-64875cfcf4f0\") " pod="openstack/aodh-5f08-account-create-update-vqqf2" Feb 02 11:26:00 crc kubenswrapper[4901]: I0202 11:26:00.530491 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"ab379047-35d7-4cd8-b64c-bf91cf2e25b7","Type":"ContainerStarted","Data":"6135bc1c149fd6383bc8d12c9bd90d74123cb7c94090f3308816dc4b52447782"} Feb 02 11:26:00 crc kubenswrapper[4901]: I0202 11:26:00.530636 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"ab379047-35d7-4cd8-b64c-bf91cf2e25b7","Type":"ContainerStarted","Data":"a1dbd00f9c6a6f4f86c42ba4fb5f5072fcc6d2a03075e4f72feb1c2c5a65c86c"} Feb 02 
11:26:00 crc kubenswrapper[4901]: I0202 11:26:00.544316 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v8bs\" (UniqueName: \"kubernetes.io/projected/c8b241db-5cd0-4121-a48e-64875cfcf4f0-kube-api-access-2v8bs\") pod \"aodh-5f08-account-create-update-vqqf2\" (UID: \"c8b241db-5cd0-4121-a48e-64875cfcf4f0\") " pod="openstack/aodh-5f08-account-create-update-vqqf2" Feb 02 11:26:00 crc kubenswrapper[4901]: I0202 11:26:00.563754 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.563724755 podStartE2EDuration="2.563724755s" podCreationTimestamp="2026-02-02 11:25:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 11:26:00.556438238 +0000 UTC m=+2847.574778334" watchObservedRunningTime="2026-02-02 11:26:00.563724755 +0000 UTC m=+2847.582064851" Feb 02 11:26:00 crc kubenswrapper[4901]: I0202 11:26:00.636244 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-5f08-account-create-update-vqqf2" Feb 02 11:26:00 crc kubenswrapper[4901]: I0202 11:26:00.884221 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-2dm7v"] Feb 02 11:26:01 crc kubenswrapper[4901]: I0202 11:26:01.050692 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fvlzg"] Feb 02 11:26:01 crc kubenswrapper[4901]: I0202 11:26:01.206117 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 02 11:26:01 crc kubenswrapper[4901]: I0202 11:26:01.328351 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-5f08-account-create-update-vqqf2"] Feb 02 11:26:01 crc kubenswrapper[4901]: I0202 11:26:01.338960 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tkx2\" (UniqueName: \"kubernetes.io/projected/707985af-5416-42c1-9fbf-866955d8d1c4-kube-api-access-4tkx2\") pod \"707985af-5416-42c1-9fbf-866955d8d1c4\" (UID: \"707985af-5416-42c1-9fbf-866955d8d1c4\") " Feb 02 11:26:01 crc kubenswrapper[4901]: I0202 11:26:01.340408 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/707985af-5416-42c1-9fbf-866955d8d1c4-openstack-config-secret\") pod \"707985af-5416-42c1-9fbf-866955d8d1c4\" (UID: \"707985af-5416-42c1-9fbf-866955d8d1c4\") " Feb 02 11:26:01 crc kubenswrapper[4901]: I0202 11:26:01.340650 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707985af-5416-42c1-9fbf-866955d8d1c4-combined-ca-bundle\") pod \"707985af-5416-42c1-9fbf-866955d8d1c4\" (UID: \"707985af-5416-42c1-9fbf-866955d8d1c4\") " Feb 02 11:26:01 crc kubenswrapper[4901]: I0202 11:26:01.340895 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/707985af-5416-42c1-9fbf-866955d8d1c4-openstack-config\") pod \"707985af-5416-42c1-9fbf-866955d8d1c4\" (UID: \"707985af-5416-42c1-9fbf-866955d8d1c4\") " Feb 02 11:26:01 crc kubenswrapper[4901]: I0202 11:26:01.352402 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/707985af-5416-42c1-9fbf-866955d8d1c4-kube-api-access-4tkx2" (OuterVolumeSpecName: 
"kube-api-access-4tkx2") pod "707985af-5416-42c1-9fbf-866955d8d1c4" (UID: "707985af-5416-42c1-9fbf-866955d8d1c4"). InnerVolumeSpecName "kube-api-access-4tkx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:26:01 crc kubenswrapper[4901]: I0202 11:26:01.381551 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/707985af-5416-42c1-9fbf-866955d8d1c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "707985af-5416-42c1-9fbf-866955d8d1c4" (UID: "707985af-5416-42c1-9fbf-866955d8d1c4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:26:01 crc kubenswrapper[4901]: I0202 11:26:01.384266 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/707985af-5416-42c1-9fbf-866955d8d1c4-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "707985af-5416-42c1-9fbf-866955d8d1c4" (UID: "707985af-5416-42c1-9fbf-866955d8d1c4"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:26:01 crc kubenswrapper[4901]: I0202 11:26:01.422279 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/707985af-5416-42c1-9fbf-866955d8d1c4-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "707985af-5416-42c1-9fbf-866955d8d1c4" (UID: "707985af-5416-42c1-9fbf-866955d8d1c4"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:26:01 crc kubenswrapper[4901]: I0202 11:26:01.443394 4901 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/707985af-5416-42c1-9fbf-866955d8d1c4-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 02 11:26:01 crc kubenswrapper[4901]: I0202 11:26:01.443435 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tkx2\" (UniqueName: \"kubernetes.io/projected/707985af-5416-42c1-9fbf-866955d8d1c4-kube-api-access-4tkx2\") on node \"crc\" DevicePath \"\"" Feb 02 11:26:01 crc kubenswrapper[4901]: I0202 11:26:01.443448 4901 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/707985af-5416-42c1-9fbf-866955d8d1c4-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 02 11:26:01 crc kubenswrapper[4901]: I0202 11:26:01.443457 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/707985af-5416-42c1-9fbf-866955d8d1c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 11:26:01 crc kubenswrapper[4901]: I0202 11:26:01.541619 4901 generic.go:334] "Generic (PLEG): container finished" podID="707985af-5416-42c1-9fbf-866955d8d1c4" containerID="d89cb421225e91a6f0382185c78f3723f4c99cb8051531176e1ef9318a5f949b" exitCode=137 Feb 02 11:26:01 crc kubenswrapper[4901]: I0202 11:26:01.541738 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 02 11:26:01 crc kubenswrapper[4901]: I0202 11:26:01.541770 4901 scope.go:117] "RemoveContainer" containerID="d89cb421225e91a6f0382185c78f3723f4c99cb8051531176e1ef9318a5f949b" Feb 02 11:26:01 crc kubenswrapper[4901]: I0202 11:26:01.543558 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-2dm7v" event={"ID":"0c9b075b-311e-4ff0-a0c3-24e65adb3cf7","Type":"ContainerStarted","Data":"7b735cb965c2c2a302d6cffa9926c02680df87dd790adce766a7bb7b60391d36"} Feb 02 11:26:01 crc kubenswrapper[4901]: I0202 11:26:01.544297 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-2dm7v" event={"ID":"0c9b075b-311e-4ff0-a0c3-24e65adb3cf7","Type":"ContainerStarted","Data":"7b83183712f60556ddb4b20c0fca66a2614a6d303d5f5660bbb64cb752a038a0"} Feb 02 11:26:01 crc kubenswrapper[4901]: I0202 11:26:01.546671 4901 generic.go:334] "Generic (PLEG): container finished" podID="ea33ab9d-d4b9-4609-9366-56d1b9897a7e" containerID="256e412fdb238a128dc29c3b273f23801c523d35c2aaf9137610692bd23d7714" exitCode=0 Feb 02 11:26:01 crc kubenswrapper[4901]: I0202 11:26:01.546770 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fvlzg" event={"ID":"ea33ab9d-d4b9-4609-9366-56d1b9897a7e","Type":"ContainerDied","Data":"256e412fdb238a128dc29c3b273f23801c523d35c2aaf9137610692bd23d7714"} Feb 02 11:26:01 crc kubenswrapper[4901]: I0202 11:26:01.546822 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fvlzg" event={"ID":"ea33ab9d-d4b9-4609-9366-56d1b9897a7e","Type":"ContainerStarted","Data":"2d9cb02dce4e9a75bdba38386d8d57cd142bce3f7edff258a69142a52f99f3af"} Feb 02 11:26:01 crc kubenswrapper[4901]: I0202 11:26:01.548662 4901 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 11:26:01 crc kubenswrapper[4901]: I0202 11:26:01.553664 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-5f08-account-create-update-vqqf2" event={"ID":"c8b241db-5cd0-4121-a48e-64875cfcf4f0","Type":"ContainerStarted","Data":"bd38684a8c305c2603d3131b34fade3c0bd8d99da3d1fd12b1f7ff625590af40"} Feb 02 11:26:01 crc kubenswrapper[4901]: I0202 11:26:01.566907 4901 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="707985af-5416-42c1-9fbf-866955d8d1c4" podUID="ab379047-35d7-4cd8-b64c-bf91cf2e25b7" Feb 02 11:26:01 crc kubenswrapper[4901]: I0202 11:26:01.568412 4901 scope.go:117] "RemoveContainer" containerID="d89cb421225e91a6f0382185c78f3723f4c99cb8051531176e1ef9318a5f949b" Feb 02 11:26:01 crc kubenswrapper[4901]: E0202 11:26:01.569457 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d89cb421225e91a6f0382185c78f3723f4c99cb8051531176e1ef9318a5f949b\": container with ID starting with d89cb421225e91a6f0382185c78f3723f4c99cb8051531176e1ef9318a5f949b not found: ID does not exist" containerID="d89cb421225e91a6f0382185c78f3723f4c99cb8051531176e1ef9318a5f949b" Feb 02 11:26:01 crc kubenswrapper[4901]: I0202 11:26:01.569510 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d89cb421225e91a6f0382185c78f3723f4c99cb8051531176e1ef9318a5f949b"} err="failed to get container status \"d89cb421225e91a6f0382185c78f3723f4c99cb8051531176e1ef9318a5f949b\": rpc error: code = NotFound desc = could not find 
container \"d89cb421225e91a6f0382185c78f3723f4c99cb8051531176e1ef9318a5f949b\": container with ID starting with d89cb421225e91a6f0382185c78f3723f4c99cb8051531176e1ef9318a5f949b not found: ID does not exist" Feb 02 11:26:01 crc kubenswrapper[4901]: I0202 11:26:01.570766 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-create-2dm7v" podStartSLOduration=2.570743525 podStartE2EDuration="2.570743525s" podCreationTimestamp="2026-02-02 11:25:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 11:26:01.564078804 +0000 UTC m=+2848.582418890" watchObservedRunningTime="2026-02-02 11:26:01.570743525 +0000 UTC m=+2848.589083621" Feb 02 11:26:01 crc kubenswrapper[4901]: I0202 11:26:01.694086 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="707985af-5416-42c1-9fbf-866955d8d1c4" path="/var/lib/kubelet/pods/707985af-5416-42c1-9fbf-866955d8d1c4/volumes" Feb 02 11:26:02 crc kubenswrapper[4901]: I0202 11:26:02.568620 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fvlzg" event={"ID":"ea33ab9d-d4b9-4609-9366-56d1b9897a7e","Type":"ContainerStarted","Data":"0ef77cd426b08788203da4e2c89a633937694a55ae4750af42ce57b92bf2906c"} Feb 02 11:26:02 crc kubenswrapper[4901]: I0202 11:26:02.579049 4901 generic.go:334] "Generic (PLEG): container finished" podID="c8b241db-5cd0-4121-a48e-64875cfcf4f0" containerID="31b5639577ae218f63ce0e432b979a4f8045d9ecbc0f7cb5c873fc7114c15bcb" exitCode=0 Feb 02 11:26:02 crc kubenswrapper[4901]: I0202 11:26:02.579221 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-5f08-account-create-update-vqqf2" event={"ID":"c8b241db-5cd0-4121-a48e-64875cfcf4f0","Type":"ContainerDied","Data":"31b5639577ae218f63ce0e432b979a4f8045d9ecbc0f7cb5c873fc7114c15bcb"} Feb 02 11:26:02 crc kubenswrapper[4901]: I0202 11:26:02.594387 4901 generic.go:334] "Generic (PLEG): container finished" podID="0c9b075b-311e-4ff0-a0c3-24e65adb3cf7" containerID="7b735cb965c2c2a302d6cffa9926c02680df87dd790adce766a7bb7b60391d36" exitCode=0 Feb 02 11:26:02 crc kubenswrapper[4901]: I0202 11:26:02.594489 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-2dm7v" event={"ID":"0c9b075b-311e-4ff0-a0c3-24e65adb3cf7","Type":"ContainerDied","Data":"7b735cb965c2c2a302d6cffa9926c02680df87dd790adce766a7bb7b60391d36"} Feb 02 11:26:03 crc kubenswrapper[4901]: I0202 11:26:03.604756 4901 generic.go:334] "Generic (PLEG): container finished" podID="ea33ab9d-d4b9-4609-9366-56d1b9897a7e" containerID="0ef77cd426b08788203da4e2c89a633937694a55ae4750af42ce57b92bf2906c" exitCode=0 Feb 02 11:26:03 crc kubenswrapper[4901]: I0202 11:26:03.604881 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fvlzg" event={"ID":"ea33ab9d-d4b9-4609-9366-56d1b9897a7e","Type":"ContainerDied","Data":"0ef77cd426b08788203da4e2c89a633937694a55ae4750af42ce57b92bf2906c"} Feb 02 11:26:04 crc kubenswrapper[4901]: I0202 11:26:04.018824 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-5f08-account-create-update-vqqf2" Feb 02 11:26:04 crc kubenswrapper[4901]: I0202 11:26:04.026154 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-2dm7v" Feb 02 11:26:04 crc kubenswrapper[4901]: I0202 11:26:04.114121 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c9b075b-311e-4ff0-a0c3-24e65adb3cf7-operator-scripts\") pod \"0c9b075b-311e-4ff0-a0c3-24e65adb3cf7\" (UID: \"0c9b075b-311e-4ff0-a0c3-24e65adb3cf7\") " Feb 02 11:26:04 crc kubenswrapper[4901]: I0202 11:26:04.114383 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8b241db-5cd0-4121-a48e-64875cfcf4f0-operator-scripts\") pod \"c8b241db-5cd0-4121-a48e-64875cfcf4f0\" (UID: \"c8b241db-5cd0-4121-a48e-64875cfcf4f0\") " Feb 02 11:26:04 crc kubenswrapper[4901]: I0202 11:26:04.114594 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbdqw\" (UniqueName: \"kubernetes.io/projected/0c9b075b-311e-4ff0-a0c3-24e65adb3cf7-kube-api-access-sbdqw\") pod \"0c9b075b-311e-4ff0-a0c3-24e65adb3cf7\" (UID: \"0c9b075b-311e-4ff0-a0c3-24e65adb3cf7\") " Feb 02 11:26:04 crc kubenswrapper[4901]: I0202 11:26:04.114865 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2v8bs\" (UniqueName: \"kubernetes.io/projected/c8b241db-5cd0-4121-a48e-64875cfcf4f0-kube-api-access-2v8bs\") pod \"c8b241db-5cd0-4121-a48e-64875cfcf4f0\" (UID: \"c8b241db-5cd0-4121-a48e-64875cfcf4f0\") " Feb 02 11:26:04 crc kubenswrapper[4901]: I0202 11:26:04.115415 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c9b075b-311e-4ff0-a0c3-24e65adb3cf7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0c9b075b-311e-4ff0-a0c3-24e65adb3cf7" (UID: "0c9b075b-311e-4ff0-a0c3-24e65adb3cf7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:26:04 crc kubenswrapper[4901]: I0202 11:26:04.115671 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8b241db-5cd0-4121-a48e-64875cfcf4f0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c8b241db-5cd0-4121-a48e-64875cfcf4f0" (UID: "c8b241db-5cd0-4121-a48e-64875cfcf4f0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:26:04 crc kubenswrapper[4901]: I0202 11:26:04.124130 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8b241db-5cd0-4121-a48e-64875cfcf4f0-kube-api-access-2v8bs" (OuterVolumeSpecName: "kube-api-access-2v8bs") pod "c8b241db-5cd0-4121-a48e-64875cfcf4f0" (UID: "c8b241db-5cd0-4121-a48e-64875cfcf4f0"). InnerVolumeSpecName "kube-api-access-2v8bs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:26:04 crc kubenswrapper[4901]: I0202 11:26:04.124360 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c9b075b-311e-4ff0-a0c3-24e65adb3cf7-kube-api-access-sbdqw" (OuterVolumeSpecName: "kube-api-access-sbdqw") pod "0c9b075b-311e-4ff0-a0c3-24e65adb3cf7" (UID: "0c9b075b-311e-4ff0-a0c3-24e65adb3cf7"). InnerVolumeSpecName "kube-api-access-sbdqw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:26:04 crc kubenswrapper[4901]: I0202 11:26:04.217462 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbdqw\" (UniqueName: \"kubernetes.io/projected/0c9b075b-311e-4ff0-a0c3-24e65adb3cf7-kube-api-access-sbdqw\") on node \"crc\" DevicePath \"\"" Feb 02 11:26:04 crc kubenswrapper[4901]: I0202 11:26:04.217513 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2v8bs\" (UniqueName: \"kubernetes.io/projected/c8b241db-5cd0-4121-a48e-64875cfcf4f0-kube-api-access-2v8bs\") on node \"crc\" DevicePath \"\"" Feb 02 11:26:04 crc kubenswrapper[4901]: I0202 11:26:04.217531 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c9b075b-311e-4ff0-a0c3-24e65adb3cf7-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 11:26:04 crc kubenswrapper[4901]: I0202 11:26:04.217544 4901 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8b241db-5cd0-4121-a48e-64875cfcf4f0-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 11:26:04 crc kubenswrapper[4901]: I0202 11:26:04.616911 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-2dm7v" Feb 02 11:26:04 crc kubenswrapper[4901]: I0202 11:26:04.617408 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-2dm7v" event={"ID":"0c9b075b-311e-4ff0-a0c3-24e65adb3cf7","Type":"ContainerDied","Data":"7b83183712f60556ddb4b20c0fca66a2614a6d303d5f5660bbb64cb752a038a0"} Feb 02 11:26:04 crc kubenswrapper[4901]: I0202 11:26:04.617468 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b83183712f60556ddb4b20c0fca66a2614a6d303d5f5660bbb64cb752a038a0" Feb 02 11:26:04 crc kubenswrapper[4901]: I0202 11:26:04.621615 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fvlzg" event={"ID":"ea33ab9d-d4b9-4609-9366-56d1b9897a7e","Type":"ContainerStarted","Data":"6b9401ac7b008b04062d149c998b33baa64396959808fe7fa8f17ef78617af12"} Feb 02 11:26:04 crc kubenswrapper[4901]: I0202 11:26:04.626481 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-5f08-account-create-update-vqqf2" event={"ID":"c8b241db-5cd0-4121-a48e-64875cfcf4f0","Type":"ContainerDied","Data":"bd38684a8c305c2603d3131b34fade3c0bd8d99da3d1fd12b1f7ff625590af40"} Feb 02 11:26:04 crc kubenswrapper[4901]: I0202 11:26:04.626531 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd38684a8c305c2603d3131b34fade3c0bd8d99da3d1fd12b1f7ff625590af40" Feb 02 11:26:04 crc kubenswrapper[4901]: I0202 11:26:04.626619 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-5f08-account-create-update-vqqf2" Feb 02 11:26:04 crc kubenswrapper[4901]: I0202 11:26:04.650158 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fvlzg" podStartSLOduration=2.145304104 podStartE2EDuration="4.65013301s" podCreationTimestamp="2026-02-02 11:26:00 +0000 UTC" firstStartedPulling="2026-02-02 11:26:01.548324071 +0000 UTC m=+2848.566664167" lastFinishedPulling="2026-02-02 11:26:04.053152977 +0000 UTC m=+2851.071493073" observedRunningTime="2026-02-02 11:26:04.643846747 +0000 UTC m=+2851.662186843" watchObservedRunningTime="2026-02-02 11:26:04.65013301 +0000 UTC m=+2851.668473106" Feb 02 11:26:07 crc kubenswrapper[4901]: I0202 11:26:07.838096 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:26:07 crc kubenswrapper[4901]: I0202 11:26:07.838691 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:26:10 crc kubenswrapper[4901]: I0202 11:26:10.475104 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fvlzg" Feb 02 11:26:10 crc kubenswrapper[4901]: I0202 11:26:10.475831 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fvlzg" Feb 02 11:26:10 crc kubenswrapper[4901]: I0202 11:26:10.527214 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fvlzg" Feb 02 11:26:10 crc kubenswrapper[4901]: I0202 11:26:10.746475 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fvlzg" Feb 02 11:26:10 crc kubenswrapper[4901]: I0202 11:26:10.806210 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fvlzg"] Feb 02 11:26:12 crc kubenswrapper[4901]: I0202 11:26:12.714370 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fvlzg" podUID="ea33ab9d-d4b9-4609-9366-56d1b9897a7e" containerName="registry-server" containerID="cri-o://6b9401ac7b008b04062d149c998b33baa64396959808fe7fa8f17ef78617af12" gracePeriod=2 Feb 02 11:26:13 crc kubenswrapper[4901]: I0202 11:26:13.182254 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fvlzg" Feb 02 11:26:13 crc kubenswrapper[4901]: I0202 11:26:13.335510 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea33ab9d-d4b9-4609-9366-56d1b9897a7e-catalog-content\") pod \"ea33ab9d-d4b9-4609-9366-56d1b9897a7e\" (UID: \"ea33ab9d-d4b9-4609-9366-56d1b9897a7e\") " Feb 02 11:26:13 crc kubenswrapper[4901]: I0202 11:26:13.335667 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvtnp\" (UniqueName: \"kubernetes.io/projected/ea33ab9d-d4b9-4609-9366-56d1b9897a7e-kube-api-access-wvtnp\") pod \"ea33ab9d-d4b9-4609-9366-56d1b9897a7e\" (UID: \"ea33ab9d-d4b9-4609-9366-56d1b9897a7e\") " Feb 02 11:26:13 crc kubenswrapper[4901]: I0202 11:26:13.335809 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea33ab9d-d4b9-4609-9366-56d1b9897a7e-utilities\") pod \"ea33ab9d-d4b9-4609-9366-56d1b9897a7e\" (UID: \"ea33ab9d-d4b9-4609-9366-56d1b9897a7e\") " Feb 02 11:26:13 crc kubenswrapper[4901]: I0202 11:26:13.336842 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea33ab9d-d4b9-4609-9366-56d1b9897a7e-utilities" (OuterVolumeSpecName: "utilities") pod "ea33ab9d-d4b9-4609-9366-56d1b9897a7e" (UID: "ea33ab9d-d4b9-4609-9366-56d1b9897a7e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:26:13 crc kubenswrapper[4901]: I0202 11:26:13.344874 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea33ab9d-d4b9-4609-9366-56d1b9897a7e-kube-api-access-wvtnp" (OuterVolumeSpecName: "kube-api-access-wvtnp") pod "ea33ab9d-d4b9-4609-9366-56d1b9897a7e" (UID: "ea33ab9d-d4b9-4609-9366-56d1b9897a7e"). InnerVolumeSpecName "kube-api-access-wvtnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:26:13 crc kubenswrapper[4901]: I0202 11:26:13.439053 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvtnp\" (UniqueName: \"kubernetes.io/projected/ea33ab9d-d4b9-4609-9366-56d1b9897a7e-kube-api-access-wvtnp\") on node \"crc\" DevicePath \"\"" Feb 02 11:26:13 crc kubenswrapper[4901]: I0202 11:26:13.439124 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea33ab9d-d4b9-4609-9366-56d1b9897a7e-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:26:13 crc kubenswrapper[4901]: I0202 11:26:13.457664 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea33ab9d-d4b9-4609-9366-56d1b9897a7e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ea33ab9d-d4b9-4609-9366-56d1b9897a7e" (UID: "ea33ab9d-d4b9-4609-9366-56d1b9897a7e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:26:13 crc kubenswrapper[4901]: I0202 11:26:13.541022 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea33ab9d-d4b9-4609-9366-56d1b9897a7e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:26:13 crc kubenswrapper[4901]: I0202 11:26:13.726653 4901 generic.go:334] "Generic (PLEG): container finished" podID="ea33ab9d-d4b9-4609-9366-56d1b9897a7e" containerID="6b9401ac7b008b04062d149c998b33baa64396959808fe7fa8f17ef78617af12" exitCode=0 Feb 02 11:26:13 crc kubenswrapper[4901]: I0202 11:26:13.726716 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fvlzg" event={"ID":"ea33ab9d-d4b9-4609-9366-56d1b9897a7e","Type":"ContainerDied","Data":"6b9401ac7b008b04062d149c998b33baa64396959808fe7fa8f17ef78617af12"} Feb 02 11:26:13 crc kubenswrapper[4901]: I0202 11:26:13.726733 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fvlzg" Feb 02 11:26:13 crc kubenswrapper[4901]: I0202 11:26:13.726765 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fvlzg" event={"ID":"ea33ab9d-d4b9-4609-9366-56d1b9897a7e","Type":"ContainerDied","Data":"2d9cb02dce4e9a75bdba38386d8d57cd142bce3f7edff258a69142a52f99f3af"} Feb 02 11:26:13 crc kubenswrapper[4901]: I0202 11:26:13.726788 4901 scope.go:117] "RemoveContainer" containerID="6b9401ac7b008b04062d149c998b33baa64396959808fe7fa8f17ef78617af12" Feb 02 11:26:13 crc kubenswrapper[4901]: I0202 11:26:13.754431 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fvlzg"] Feb 02 11:26:13 crc kubenswrapper[4901]: I0202 11:26:13.756121 4901 scope.go:117] "RemoveContainer" containerID="0ef77cd426b08788203da4e2c89a633937694a55ae4750af42ce57b92bf2906c" Feb 02 11:26:13 crc kubenswrapper[4901]: I0202 11:26:13.763995 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fvlzg"] Feb 02 11:26:13 crc kubenswrapper[4901]: I0202 11:26:13.777447 4901 scope.go:117] "RemoveContainer" containerID="256e412fdb238a128dc29c3b273f23801c523d35c2aaf9137610692bd23d7714" Feb 02 11:26:13 crc kubenswrapper[4901]: I0202 11:26:13.818886 4901 scope.go:117] "RemoveContainer" containerID="6b9401ac7b008b04062d149c998b33baa64396959808fe7fa8f17ef78617af12" Feb 02 11:26:13 crc kubenswrapper[4901]: E0202 11:26:13.819588 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b9401ac7b008b04062d149c998b33baa64396959808fe7fa8f17ef78617af12\": container with ID starting with 6b9401ac7b008b04062d149c998b33baa64396959808fe7fa8f17ef78617af12 not found: ID does not exist" containerID="6b9401ac7b008b04062d149c998b33baa64396959808fe7fa8f17ef78617af12" Feb 02 11:26:13 crc kubenswrapper[4901]: I0202 11:26:13.819631 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b9401ac7b008b04062d149c998b33baa64396959808fe7fa8f17ef78617af12"} err="failed to get container status \"6b9401ac7b008b04062d149c998b33baa64396959808fe7fa8f17ef78617af12\": rpc error: code = NotFound desc = could not find container \"6b9401ac7b008b04062d149c998b33baa64396959808fe7fa8f17ef78617af12\": container with ID starting with 6b9401ac7b008b04062d149c998b33baa64396959808fe7fa8f17ef78617af12 not found: ID does not exist" Feb 02 
11:26:13 crc kubenswrapper[4901]: I0202 11:26:13.819658 4901 scope.go:117] "RemoveContainer" containerID="0ef77cd426b08788203da4e2c89a633937694a55ae4750af42ce57b92bf2906c" Feb 02 11:26:13 crc kubenswrapper[4901]: E0202 11:26:13.820212 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ef77cd426b08788203da4e2c89a633937694a55ae4750af42ce57b92bf2906c\": container with ID starting with 0ef77cd426b08788203da4e2c89a633937694a55ae4750af42ce57b92bf2906c not found: ID does not exist" containerID="0ef77cd426b08788203da4e2c89a633937694a55ae4750af42ce57b92bf2906c" Feb 02 11:26:13 crc kubenswrapper[4901]: I0202 11:26:13.820252 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ef77cd426b08788203da4e2c89a633937694a55ae4750af42ce57b92bf2906c"} err="failed to get container status \"0ef77cd426b08788203da4e2c89a633937694a55ae4750af42ce57b92bf2906c\": rpc error: code = NotFound desc = could not find container \"0ef77cd426b08788203da4e2c89a633937694a55ae4750af42ce57b92bf2906c\": container with ID starting with 0ef77cd426b08788203da4e2c89a633937694a55ae4750af42ce57b92bf2906c not found: ID does not exist" Feb 02 11:26:13 crc kubenswrapper[4901]: I0202 11:26:13.820279 4901 scope.go:117] "RemoveContainer" containerID="256e412fdb238a128dc29c3b273f23801c523d35c2aaf9137610692bd23d7714" Feb 02 11:26:13 crc kubenswrapper[4901]: E0202 11:26:13.820747 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"256e412fdb238a128dc29c3b273f23801c523d35c2aaf9137610692bd23d7714\": container with ID starting with 256e412fdb238a128dc29c3b273f23801c523d35c2aaf9137610692bd23d7714 not found: ID does not exist" containerID="256e412fdb238a128dc29c3b273f23801c523d35c2aaf9137610692bd23d7714" Feb 02 11:26:13 crc kubenswrapper[4901]: I0202 11:26:13.820792 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"256e412fdb238a128dc29c3b273f23801c523d35c2aaf9137610692bd23d7714"} err="failed to get container status \"256e412fdb238a128dc29c3b273f23801c523d35c2aaf9137610692bd23d7714\": rpc error: code = NotFound desc = could not find container \"256e412fdb238a128dc29c3b273f23801c523d35c2aaf9137610692bd23d7714\": container with ID starting with 256e412fdb238a128dc29c3b273f23801c523d35c2aaf9137610692bd23d7714 not found: ID does not exist" Feb 02 11:26:15 crc kubenswrapper[4901]: I0202 11:26:15.688358 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea33ab9d-d4b9-4609-9366-56d1b9897a7e" path="/var/lib/kubelet/pods/ea33ab9d-d4b9-4609-9366-56d1b9897a7e/volumes" Feb 02 11:26:21 crc kubenswrapper[4901]: I0202 11:26:21.751981 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pc4bw"] Feb 02 11:26:21 crc kubenswrapper[4901]: E0202 11:26:21.753177 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c9b075b-311e-4ff0-a0c3-24e65adb3cf7" containerName="mariadb-database-create" Feb 02 11:26:21 crc kubenswrapper[4901]: I0202 11:26:21.753195 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c9b075b-311e-4ff0-a0c3-24e65adb3cf7" containerName="mariadb-database-create" Feb 02 11:26:21 crc kubenswrapper[4901]: E0202 11:26:21.753241 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8b241db-5cd0-4121-a48e-64875cfcf4f0" containerName="mariadb-account-create-update" Feb 02 11:26:21 crc 
kubenswrapper[4901]: I0202 11:26:21.753249 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8b241db-5cd0-4121-a48e-64875cfcf4f0" containerName="mariadb-account-create-update" Feb 02 11:26:21 crc kubenswrapper[4901]: E0202 11:26:21.753264 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea33ab9d-d4b9-4609-9366-56d1b9897a7e" containerName="extract-utilities" Feb 02 11:26:21 crc kubenswrapper[4901]: I0202 11:26:21.753271 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea33ab9d-d4b9-4609-9366-56d1b9897a7e" containerName="extract-utilities" Feb 02 11:26:21 crc kubenswrapper[4901]: E0202 11:26:21.753280 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea33ab9d-d4b9-4609-9366-56d1b9897a7e" containerName="extract-content" Feb 02 11:26:21 crc kubenswrapper[4901]: I0202 11:26:21.753286 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea33ab9d-d4b9-4609-9366-56d1b9897a7e" containerName="extract-content" Feb 02 11:26:21 crc kubenswrapper[4901]: E0202 11:26:21.753304 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea33ab9d-d4b9-4609-9366-56d1b9897a7e" containerName="registry-server" Feb 02 11:26:21 crc kubenswrapper[4901]: I0202 11:26:21.753310 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea33ab9d-d4b9-4609-9366-56d1b9897a7e" containerName="registry-server" Feb 02 11:26:21 crc kubenswrapper[4901]: I0202 11:26:21.753596 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea33ab9d-d4b9-4609-9366-56d1b9897a7e" containerName="registry-server" Feb 02 11:26:21 crc kubenswrapper[4901]: I0202 11:26:21.753620 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8b241db-5cd0-4121-a48e-64875cfcf4f0" containerName="mariadb-account-create-update" Feb 02 11:26:21 crc kubenswrapper[4901]: I0202 11:26:21.753632 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c9b075b-311e-4ff0-a0c3-24e65adb3cf7" containerName="mariadb-database-create" Feb 02 11:26:21 crc kubenswrapper[4901]: I0202 11:26:21.755350 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pc4bw" Feb 02 11:26:21 crc kubenswrapper[4901]: I0202 11:26:21.764533 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pc4bw"] Feb 02 11:26:21 crc kubenswrapper[4901]: I0202 11:26:21.837463 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7-catalog-content\") pod \"redhat-marketplace-pc4bw\" (UID: \"2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7\") " pod="openshift-marketplace/redhat-marketplace-pc4bw" Feb 02 11:26:21 crc kubenswrapper[4901]: I0202 11:26:21.837617 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7-utilities\") pod \"redhat-marketplace-pc4bw\" (UID: \"2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7\") " pod="openshift-marketplace/redhat-marketplace-pc4bw" Feb 02 11:26:21 crc kubenswrapper[4901]: I0202 11:26:21.837687 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txglj\" (UniqueName: \"kubernetes.io/projected/2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7-kube-api-access-txglj\") pod \"redhat-marketplace-pc4bw\" (UID: \"2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7\") " pod="openshift-marketplace/redhat-marketplace-pc4bw" Feb 02 11:26:21 crc kubenswrapper[4901]: I0202 11:26:21.940297 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txglj\" (UniqueName: \"kubernetes.io/projected/2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7-kube-api-access-txglj\") pod \"redhat-marketplace-pc4bw\" (UID: \"2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7\") " pod="openshift-marketplace/redhat-marketplace-pc4bw" Feb 02 11:26:21 crc kubenswrapper[4901]: I0202 11:26:21.940595 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7-catalog-content\") pod \"redhat-marketplace-pc4bw\" (UID: \"2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7\") " pod="openshift-marketplace/redhat-marketplace-pc4bw" Feb 02 11:26:21 crc kubenswrapper[4901]: I0202 11:26:21.940631 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7-utilities\") pod \"redhat-marketplace-pc4bw\" (UID: \"2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7\") " pod="openshift-marketplace/redhat-marketplace-pc4bw" Feb 02 11:26:21 crc kubenswrapper[4901]: I0202 11:26:21.941204 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7-catalog-content\") pod \"redhat-marketplace-pc4bw\" (UID: \"2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7\") " pod="openshift-marketplace/redhat-marketplace-pc4bw" Feb 02 11:26:21 crc kubenswrapper[4901]: I0202 11:26:21.941242 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7-utilities\") pod \"redhat-marketplace-pc4bw\" (UID: \"2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7\") " pod="openshift-marketplace/redhat-marketplace-pc4bw" Feb 02 11:26:21 crc kubenswrapper[4901]: I0202 11:26:21.961657 4901 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-txglj\" (UniqueName: \"kubernetes.io/projected/2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7-kube-api-access-txglj\") pod \"redhat-marketplace-pc4bw\" (UID: \"2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7\") " pod="openshift-marketplace/redhat-marketplace-pc4bw" Feb 02 11:26:22 crc kubenswrapper[4901]: I0202 11:26:22.114462 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pc4bw" Feb 02 11:26:22 crc kubenswrapper[4901]: I0202 11:26:22.599308 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pc4bw"] Feb 02 11:26:22 crc kubenswrapper[4901]: I0202 11:26:22.837941 4901 generic.go:334] "Generic (PLEG): container finished" podID="2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7" containerID="896d1876ad8facc81ab50320820e8e11084a15d4b82165cc1758dba89fe906d0" exitCode=0 Feb 02 11:26:22 crc kubenswrapper[4901]: I0202 11:26:22.838451 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pc4bw" event={"ID":"2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7","Type":"ContainerDied","Data":"896d1876ad8facc81ab50320820e8e11084a15d4b82165cc1758dba89fe906d0"} Feb 02 11:26:22 crc kubenswrapper[4901]: I0202 11:26:22.838487 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pc4bw" event={"ID":"2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7","Type":"ContainerStarted","Data":"0ef49ef86444a23f695cd6e61958f26609e007bc3685ffa9149ecf04eacd397f"} Feb 02 11:26:23 crc kubenswrapper[4901]: I0202 11:26:23.850370 4901 generic.go:334] "Generic (PLEG): container finished" podID="2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7" containerID="464136da44adbdf01fb4a1e10966e6e84cd4ec652f92894e608e4ba0f68fec95" exitCode=0 Feb 02 11:26:23 crc kubenswrapper[4901]: I0202 11:26:23.850506 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pc4bw" event={"ID":"2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7","Type":"ContainerDied","Data":"464136da44adbdf01fb4a1e10966e6e84cd4ec652f92894e608e4ba0f68fec95"} Feb 02 11:26:24 crc kubenswrapper[4901]: I0202 11:26:24.863241 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pc4bw" event={"ID":"2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7","Type":"ContainerStarted","Data":"4e8348f16526cf6fb370e3dac8787eac0e2317c0a00308c1d884a88353b036d7"} Feb 02 11:26:24 crc kubenswrapper[4901]: I0202 11:26:24.883485 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pc4bw" podStartSLOduration=2.484296466 podStartE2EDuration="3.883457969s" podCreationTimestamp="2026-02-02 11:26:21 +0000 UTC" firstStartedPulling="2026-02-02 11:26:22.840260312 +0000 UTC m=+2869.858600408" lastFinishedPulling="2026-02-02 11:26:24.239421815 +0000 UTC m=+2871.257761911" observedRunningTime="2026-02-02 11:26:24.881632564 +0000 UTC m=+2871.899972660" watchObservedRunningTime="2026-02-02 11:26:24.883457969 +0000 UTC m=+2871.901798065" Feb 02 11:26:32 crc kubenswrapper[4901]: I0202 11:26:32.115279 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pc4bw" Feb 02 11:26:32 crc kubenswrapper[4901]: I0202 11:26:32.116054 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pc4bw" Feb 02 11:26:32 crc kubenswrapper[4901]: I0202 11:26:32.158198 4901 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pc4bw" Feb 02 11:26:33 crc kubenswrapper[4901]: I0202 11:26:33.001455 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pc4bw" Feb 02 11:26:33 crc kubenswrapper[4901]: I0202 11:26:33.064285 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pc4bw"] Feb 02 11:26:34 crc kubenswrapper[4901]: I0202 11:26:34.974526 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pc4bw" podUID="2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7" containerName="registry-server" containerID="cri-o://4e8348f16526cf6fb370e3dac8787eac0e2317c0a00308c1d884a88353b036d7" gracePeriod=2 Feb 02 11:26:35 crc kubenswrapper[4901]: I0202 11:26:35.425880 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pc4bw" Feb 02 11:26:35 crc kubenswrapper[4901]: I0202 11:26:35.537418 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7-utilities\") pod \"2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7\" (UID: \"2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7\") " Feb 02 11:26:35 crc kubenswrapper[4901]: I0202 11:26:35.537736 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txglj\" (UniqueName: \"kubernetes.io/projected/2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7-kube-api-access-txglj\") pod \"2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7\" (UID: \"2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7\") " Feb 02 11:26:35 crc kubenswrapper[4901]: I0202 11:26:35.538771 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7-catalog-content\") pod \"2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7\" (UID: \"2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7\") " Feb 02 11:26:35 crc kubenswrapper[4901]: I0202 11:26:35.538836 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7-utilities" (OuterVolumeSpecName: "utilities") pod "2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7" (UID: "2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:26:35 crc kubenswrapper[4901]: I0202 11:26:35.540475 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:26:35 crc kubenswrapper[4901]: I0202 11:26:35.545100 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7-kube-api-access-txglj" (OuterVolumeSpecName: "kube-api-access-txglj") pod "2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7" (UID: "2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7"). InnerVolumeSpecName "kube-api-access-txglj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:26:35 crc kubenswrapper[4901]: I0202 11:26:35.572483 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7" (UID: "2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:26:35 crc kubenswrapper[4901]: I0202 11:26:35.642237 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txglj\" (UniqueName: \"kubernetes.io/projected/2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7-kube-api-access-txglj\") on node \"crc\" DevicePath \"\"" Feb 02 11:26:35 crc kubenswrapper[4901]: I0202 11:26:35.642372 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:26:35 crc kubenswrapper[4901]: I0202 11:26:35.989170 4901 generic.go:334] "Generic (PLEG): container finished" podID="2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7" containerID="4e8348f16526cf6fb370e3dac8787eac0e2317c0a00308c1d884a88353b036d7" exitCode=0 Feb 02 11:26:35 crc kubenswrapper[4901]: I0202 11:26:35.989235 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pc4bw" event={"ID":"2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7","Type":"ContainerDied","Data":"4e8348f16526cf6fb370e3dac8787eac0e2317c0a00308c1d884a88353b036d7"} Feb 02 11:26:35 crc kubenswrapper[4901]: I0202 11:26:35.989307 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pc4bw" event={"ID":"2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7","Type":"ContainerDied","Data":"0ef49ef86444a23f695cd6e61958f26609e007bc3685ffa9149ecf04eacd397f"} Feb 02 11:26:35 crc kubenswrapper[4901]: I0202 11:26:35.989311 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pc4bw" Feb 02 11:26:35 crc kubenswrapper[4901]: I0202 11:26:35.989337 4901 scope.go:117] "RemoveContainer" containerID="4e8348f16526cf6fb370e3dac8787eac0e2317c0a00308c1d884a88353b036d7" Feb 02 11:26:36 crc kubenswrapper[4901]: I0202 11:26:36.020612 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pc4bw"] Feb 02 11:26:36 crc kubenswrapper[4901]: I0202 11:26:36.026583 4901 scope.go:117] "RemoveContainer" containerID="464136da44adbdf01fb4a1e10966e6e84cd4ec652f92894e608e4ba0f68fec95" Feb 02 11:26:36 crc kubenswrapper[4901]: I0202 11:26:36.031401 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pc4bw"] Feb 02 11:26:36 crc kubenswrapper[4901]: I0202 11:26:36.050621 4901 scope.go:117] "RemoveContainer" containerID="896d1876ad8facc81ab50320820e8e11084a15d4b82165cc1758dba89fe906d0" Feb 02 11:26:36 crc kubenswrapper[4901]: I0202 11:26:36.097250 4901 scope.go:117] "RemoveContainer" containerID="4e8348f16526cf6fb370e3dac8787eac0e2317c0a00308c1d884a88353b036d7" Feb 02 11:26:36 crc kubenswrapper[4901]: E0202 11:26:36.098032 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e8348f16526cf6fb370e3dac8787eac0e2317c0a00308c1d884a88353b036d7\": container with ID starting with 4e8348f16526cf6fb370e3dac8787eac0e2317c0a00308c1d884a88353b036d7 not found: ID does not exist" containerID="4e8348f16526cf6fb370e3dac8787eac0e2317c0a00308c1d884a88353b036d7" Feb 02 11:26:36 crc kubenswrapper[4901]: I0202 11:26:36.098110 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e8348f16526cf6fb370e3dac8787eac0e2317c0a00308c1d884a88353b036d7"} err="failed to get container status \"4e8348f16526cf6fb370e3dac8787eac0e2317c0a00308c1d884a88353b036d7\": rpc error: code = NotFound desc = could not find container \"4e8348f16526cf6fb370e3dac8787eac0e2317c0a00308c1d884a88353b036d7\": container with ID starting with 4e8348f16526cf6fb370e3dac8787eac0e2317c0a00308c1d884a88353b036d7 not found: ID does not exist" Feb 02 11:26:36 crc kubenswrapper[4901]: I0202 11:26:36.098142 4901 scope.go:117] "RemoveContainer" containerID="464136da44adbdf01fb4a1e10966e6e84cd4ec652f92894e608e4ba0f68fec95" Feb 02 11:26:36 crc kubenswrapper[4901]: E0202 11:26:36.098597 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"464136da44adbdf01fb4a1e10966e6e84cd4ec652f92894e608e4ba0f68fec95\": container with ID starting with 464136da44adbdf01fb4a1e10966e6e84cd4ec652f92894e608e4ba0f68fec95 not found: ID does not exist" containerID="464136da44adbdf01fb4a1e10966e6e84cd4ec652f92894e608e4ba0f68fec95" Feb 02 11:26:36 crc kubenswrapper[4901]: I0202 11:26:36.098661 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"464136da44adbdf01fb4a1e10966e6e84cd4ec652f92894e608e4ba0f68fec95"} err="failed to get container status \"464136da44adbdf01fb4a1e10966e6e84cd4ec652f92894e608e4ba0f68fec95\": rpc error: code = NotFound desc = could not find container \"464136da44adbdf01fb4a1e10966e6e84cd4ec652f92894e608e4ba0f68fec95\": container with ID starting with 464136da44adbdf01fb4a1e10966e6e84cd4ec652f92894e608e4ba0f68fec95 not found: ID does not exist" Feb 02 11:26:36 crc kubenswrapper[4901]: I0202 11:26:36.098693 4901 scope.go:117] "RemoveContainer" 
containerID="896d1876ad8facc81ab50320820e8e11084a15d4b82165cc1758dba89fe906d0" Feb 02 11:26:36 crc kubenswrapper[4901]: E0202 11:26:36.099257 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"896d1876ad8facc81ab50320820e8e11084a15d4b82165cc1758dba89fe906d0\": container with ID starting with 896d1876ad8facc81ab50320820e8e11084a15d4b82165cc1758dba89fe906d0 not found: ID does not exist" containerID="896d1876ad8facc81ab50320820e8e11084a15d4b82165cc1758dba89fe906d0" Feb 02 11:26:36 crc kubenswrapper[4901]: I0202 11:26:36.099298 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"896d1876ad8facc81ab50320820e8e11084a15d4b82165cc1758dba89fe906d0"} err="failed to get container status \"896d1876ad8facc81ab50320820e8e11084a15d4b82165cc1758dba89fe906d0\": rpc error: code = NotFound desc = could not find container \"896d1876ad8facc81ab50320820e8e11084a15d4b82165cc1758dba89fe906d0\": container with ID starting with 896d1876ad8facc81ab50320820e8e11084a15d4b82165cc1758dba89fe906d0 not found: ID does not exist" Feb 02 11:26:37 crc kubenswrapper[4901]: I0202 11:26:37.690312 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7" path="/var/lib/kubelet/pods/2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7/volumes" Feb 02 11:26:37 crc kubenswrapper[4901]: I0202 11:26:37.837665 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:26:37 crc kubenswrapper[4901]: I0202 11:26:37.837740 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:27:07 crc kubenswrapper[4901]: I0202 11:27:07.837212 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:27:07 crc kubenswrapper[4901]: I0202 11:27:07.838200 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:27:07 crc kubenswrapper[4901]: I0202 11:27:07.838272 4901 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" Feb 02 11:27:07 crc kubenswrapper[4901]: I0202 11:27:07.839265 4901 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"39f6324e3b109ef91ef62ed5dcfb577f7442c2d5c540e6157611157765dd3e2f"} pod="openshift-machine-config-operator/machine-config-daemon-f29d8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 11:27:07 crc 
kubenswrapper[4901]: I0202 11:27:07.839341 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" containerID="cri-o://39f6324e3b109ef91ef62ed5dcfb577f7442c2d5c540e6157611157765dd3e2f" gracePeriod=600 Feb 02 11:27:08 crc kubenswrapper[4901]: I0202 11:27:08.319045 4901 generic.go:334] "Generic (PLEG): container finished" podID="756c113d-5d5e-424e-bdf5-494b7774def6" containerID="39f6324e3b109ef91ef62ed5dcfb577f7442c2d5c540e6157611157765dd3e2f" exitCode=0 Feb 02 11:27:08 crc kubenswrapper[4901]: I0202 11:27:08.319155 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" event={"ID":"756c113d-5d5e-424e-bdf5-494b7774def6","Type":"ContainerDied","Data":"39f6324e3b109ef91ef62ed5dcfb577f7442c2d5c540e6157611157765dd3e2f"} Feb 02 11:27:08 crc kubenswrapper[4901]: I0202 11:27:08.319459 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" event={"ID":"756c113d-5d5e-424e-bdf5-494b7774def6","Type":"ContainerStarted","Data":"57fa2da00216833a4d6895d268a4adace1de13c875f7256b0e6997572d9f7c2a"} Feb 02 11:27:08 crc kubenswrapper[4901]: I0202 11:27:08.319487 4901 scope.go:117] "RemoveContainer" containerID="f2318f5b8900b7c5448368eef0323ed7f8ee4fc4ecf7912865ae62c7f0cf9d8b" Feb 02 11:29:37 crc kubenswrapper[4901]: I0202 11:29:37.837523 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:29:37 crc kubenswrapper[4901]: I0202 11:29:37.838401 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:29:44 crc kubenswrapper[4901]: I0202 11:29:44.074994 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n6dwk"] Feb 02 11:29:44 crc kubenswrapper[4901]: E0202 11:29:44.076373 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7" containerName="extract-utilities" Feb 02 11:29:44 crc kubenswrapper[4901]: I0202 11:29:44.076389 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7" containerName="extract-utilities" Feb 02 11:29:44 crc kubenswrapper[4901]: E0202 11:29:44.076423 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7" containerName="extract-content" Feb 02 11:29:44 crc kubenswrapper[4901]: I0202 11:29:44.076429 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7" containerName="extract-content" Feb 02 11:29:44 crc kubenswrapper[4901]: E0202 11:29:44.076443 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7" containerName="registry-server" Feb 02 11:29:44 crc kubenswrapper[4901]: I0202 11:29:44.076450 4901 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7" containerName="registry-server" Feb 02 11:29:44 crc kubenswrapper[4901]: I0202 11:29:44.076688 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b80ac6c-2d97-44e3-ac07-0db6ba65b8c7" containerName="registry-server" Feb 02 11:29:44 crc kubenswrapper[4901]: I0202 11:29:44.078361 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n6dwk" Feb 02 11:29:44 crc kubenswrapper[4901]: I0202 11:29:44.083641 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n6dwk"] Feb 02 11:29:44 crc kubenswrapper[4901]: I0202 11:29:44.092866 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdx8b\" (UniqueName: \"kubernetes.io/projected/370f499b-b683-43a8-8ee1-668d04b02cba-kube-api-access-qdx8b\") pod \"community-operators-n6dwk\" (UID: \"370f499b-b683-43a8-8ee1-668d04b02cba\") " pod="openshift-marketplace/community-operators-n6dwk" Feb 02 11:29:44 crc kubenswrapper[4901]: I0202 11:29:44.092939 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/370f499b-b683-43a8-8ee1-668d04b02cba-catalog-content\") pod \"community-operators-n6dwk\" (UID: \"370f499b-b683-43a8-8ee1-668d04b02cba\") " pod="openshift-marketplace/community-operators-n6dwk" Feb 02 11:29:44 crc kubenswrapper[4901]: I0202 11:29:44.093214 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/370f499b-b683-43a8-8ee1-668d04b02cba-utilities\") pod \"community-operators-n6dwk\" (UID: \"370f499b-b683-43a8-8ee1-668d04b02cba\") " pod="openshift-marketplace/community-operators-n6dwk" Feb 02 11:29:44 crc kubenswrapper[4901]: I0202 11:29:44.195413 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/370f499b-b683-43a8-8ee1-668d04b02cba-utilities\") pod \"community-operators-n6dwk\" (UID: \"370f499b-b683-43a8-8ee1-668d04b02cba\") " pod="openshift-marketplace/community-operators-n6dwk" Feb 02 11:29:44 crc kubenswrapper[4901]: I0202 11:29:44.195535 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdx8b\" (UniqueName: \"kubernetes.io/projected/370f499b-b683-43a8-8ee1-668d04b02cba-kube-api-access-qdx8b\") pod \"community-operators-n6dwk\" (UID: \"370f499b-b683-43a8-8ee1-668d04b02cba\") " pod="openshift-marketplace/community-operators-n6dwk" Feb 02 11:29:44 crc kubenswrapper[4901]: I0202 11:29:44.195621 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/370f499b-b683-43a8-8ee1-668d04b02cba-catalog-content\") pod \"community-operators-n6dwk\" (UID: \"370f499b-b683-43a8-8ee1-668d04b02cba\") " pod="openshift-marketplace/community-operators-n6dwk" Feb 02 11:29:44 crc kubenswrapper[4901]: I0202 11:29:44.195984 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/370f499b-b683-43a8-8ee1-668d04b02cba-utilities\") pod \"community-operators-n6dwk\" (UID: \"370f499b-b683-43a8-8ee1-668d04b02cba\") " pod="openshift-marketplace/community-operators-n6dwk" Feb 02 11:29:44 crc kubenswrapper[4901]: I0202 11:29:44.196108 4901 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/370f499b-b683-43a8-8ee1-668d04b02cba-catalog-content\") pod \"community-operators-n6dwk\" (UID: \"370f499b-b683-43a8-8ee1-668d04b02cba\") " pod="openshift-marketplace/community-operators-n6dwk" Feb 02 11:29:44 crc kubenswrapper[4901]: I0202 11:29:44.227576 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdx8b\" (UniqueName: \"kubernetes.io/projected/370f499b-b683-43a8-8ee1-668d04b02cba-kube-api-access-qdx8b\") pod \"community-operators-n6dwk\" (UID: \"370f499b-b683-43a8-8ee1-668d04b02cba\") " pod="openshift-marketplace/community-operators-n6dwk" Feb 02 11:29:44 crc kubenswrapper[4901]: I0202 11:29:44.267607 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tctnh"] Feb 02 11:29:44 crc kubenswrapper[4901]: I0202 11:29:44.270833 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tctnh" Feb 02 11:29:44 crc kubenswrapper[4901]: I0202 11:29:44.285552 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tctnh"] Feb 02 11:29:44 crc kubenswrapper[4901]: I0202 11:29:44.296392 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dffdac6-bc8b-4bb4-ac19-9492e68cc878-catalog-content\") pod \"redhat-operators-tctnh\" (UID: \"5dffdac6-bc8b-4bb4-ac19-9492e68cc878\") " pod="openshift-marketplace/redhat-operators-tctnh" Feb 02 11:29:44 crc kubenswrapper[4901]: I0202 11:29:44.296505 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7n2w\" (UniqueName: \"kubernetes.io/projected/5dffdac6-bc8b-4bb4-ac19-9492e68cc878-kube-api-access-c7n2w\") pod \"redhat-operators-tctnh\" (UID: \"5dffdac6-bc8b-4bb4-ac19-9492e68cc878\") " pod="openshift-marketplace/redhat-operators-tctnh" Feb 02 11:29:44 crc kubenswrapper[4901]: I0202 11:29:44.296617 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dffdac6-bc8b-4bb4-ac19-9492e68cc878-utilities\") pod \"redhat-operators-tctnh\" (UID: \"5dffdac6-bc8b-4bb4-ac19-9492e68cc878\") " pod="openshift-marketplace/redhat-operators-tctnh" Feb 02 11:29:44 crc kubenswrapper[4901]: I0202 11:29:44.398688 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7n2w\" (UniqueName: \"kubernetes.io/projected/5dffdac6-bc8b-4bb4-ac19-9492e68cc878-kube-api-access-c7n2w\") pod \"redhat-operators-tctnh\" (UID: \"5dffdac6-bc8b-4bb4-ac19-9492e68cc878\") " pod="openshift-marketplace/redhat-operators-tctnh" Feb 02 11:29:44 crc kubenswrapper[4901]: I0202 11:29:44.398836 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dffdac6-bc8b-4bb4-ac19-9492e68cc878-utilities\") pod \"redhat-operators-tctnh\" (UID: \"5dffdac6-bc8b-4bb4-ac19-9492e68cc878\") " pod="openshift-marketplace/redhat-operators-tctnh" Feb 02 11:29:44 crc kubenswrapper[4901]: I0202 11:29:44.398982 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dffdac6-bc8b-4bb4-ac19-9492e68cc878-catalog-content\") pod \"redhat-operators-tctnh\" 
(UID: \"5dffdac6-bc8b-4bb4-ac19-9492e68cc878\") " pod="openshift-marketplace/redhat-operators-tctnh" Feb 02 11:29:44 crc kubenswrapper[4901]: I0202 11:29:44.399480 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dffdac6-bc8b-4bb4-ac19-9492e68cc878-utilities\") pod \"redhat-operators-tctnh\" (UID: \"5dffdac6-bc8b-4bb4-ac19-9492e68cc878\") " pod="openshift-marketplace/redhat-operators-tctnh" Feb 02 11:29:44 crc kubenswrapper[4901]: I0202 11:29:44.399749 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dffdac6-bc8b-4bb4-ac19-9492e68cc878-catalog-content\") pod \"redhat-operators-tctnh\" (UID: \"5dffdac6-bc8b-4bb4-ac19-9492e68cc878\") " pod="openshift-marketplace/redhat-operators-tctnh" Feb 02 11:29:44 crc kubenswrapper[4901]: I0202 11:29:44.409637 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n6dwk" Feb 02 11:29:44 crc kubenswrapper[4901]: I0202 11:29:44.425065 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7n2w\" (UniqueName: \"kubernetes.io/projected/5dffdac6-bc8b-4bb4-ac19-9492e68cc878-kube-api-access-c7n2w\") pod \"redhat-operators-tctnh\" (UID: \"5dffdac6-bc8b-4bb4-ac19-9492e68cc878\") " pod="openshift-marketplace/redhat-operators-tctnh" Feb 02 11:29:44 crc kubenswrapper[4901]: I0202 11:29:44.594470 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tctnh" Feb 02 11:29:45 crc kubenswrapper[4901]: I0202 11:29:45.032024 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n6dwk"] Feb 02 11:29:45 crc kubenswrapper[4901]: I0202 11:29:45.299775 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tctnh"] Feb 02 11:29:45 crc kubenswrapper[4901]: I0202 11:29:45.984331 4901 generic.go:334] "Generic (PLEG): container finished" podID="370f499b-b683-43a8-8ee1-668d04b02cba" containerID="b4292a2d1c6c1d7b89322f7c8a3ca0ea942a355bf4029d59642b27e145cd5ec7" exitCode=0 Feb 02 11:29:45 crc kubenswrapper[4901]: I0202 11:29:45.984442 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n6dwk" event={"ID":"370f499b-b683-43a8-8ee1-668d04b02cba","Type":"ContainerDied","Data":"b4292a2d1c6c1d7b89322f7c8a3ca0ea942a355bf4029d59642b27e145cd5ec7"} Feb 02 11:29:45 crc kubenswrapper[4901]: I0202 11:29:45.984479 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n6dwk" event={"ID":"370f499b-b683-43a8-8ee1-668d04b02cba","Type":"ContainerStarted","Data":"0173b113c42b1ede167cbd823f6496da79f39d82a37d7d507794907e3eb861ff"} Feb 02 11:29:45 crc kubenswrapper[4901]: I0202 11:29:45.988193 4901 generic.go:334] "Generic (PLEG): container finished" podID="5dffdac6-bc8b-4bb4-ac19-9492e68cc878" containerID="93320bb141f0af928365475d02fa4002af08f052cc316f7f0e0e6781df5b3298" exitCode=0 Feb 02 11:29:45 crc kubenswrapper[4901]: I0202 11:29:45.988231 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tctnh" event={"ID":"5dffdac6-bc8b-4bb4-ac19-9492e68cc878","Type":"ContainerDied","Data":"93320bb141f0af928365475d02fa4002af08f052cc316f7f0e0e6781df5b3298"} Feb 02 11:29:45 crc kubenswrapper[4901]: I0202 11:29:45.988253 4901 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-operators-tctnh" event={"ID":"5dffdac6-bc8b-4bb4-ac19-9492e68cc878","Type":"ContainerStarted","Data":"de58d7a74a7df906cdaddfa0d87c27bf717f13b13e2e2b37ec1f18a18ffdc775"} Feb 02 11:29:47 crc kubenswrapper[4901]: I0202 11:29:47.000724 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n6dwk" event={"ID":"370f499b-b683-43a8-8ee1-668d04b02cba","Type":"ContainerStarted","Data":"77e402633432c32eb24b3fd25731da7a040637732dfe9238bd6b1faf798523a9"} Feb 02 11:29:48 crc kubenswrapper[4901]: I0202 11:29:48.016765 4901 generic.go:334] "Generic (PLEG): container finished" podID="370f499b-b683-43a8-8ee1-668d04b02cba" containerID="77e402633432c32eb24b3fd25731da7a040637732dfe9238bd6b1faf798523a9" exitCode=0 Feb 02 11:29:48 crc kubenswrapper[4901]: I0202 11:29:48.016892 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n6dwk" event={"ID":"370f499b-b683-43a8-8ee1-668d04b02cba","Type":"ContainerDied","Data":"77e402633432c32eb24b3fd25731da7a040637732dfe9238bd6b1faf798523a9"} Feb 02 11:29:48 crc kubenswrapper[4901]: I0202 11:29:48.022218 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tctnh" event={"ID":"5dffdac6-bc8b-4bb4-ac19-9492e68cc878","Type":"ContainerStarted","Data":"a23344013a7052ceeca85e488837ddbf2695c6d1c053751df7bce5343e982aac"} Feb 02 11:29:49 crc kubenswrapper[4901]: I0202 11:29:49.036733 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n6dwk" event={"ID":"370f499b-b683-43a8-8ee1-668d04b02cba","Type":"ContainerStarted","Data":"9f896963c9225bc94cbd6f010666d21ee786cac9cf9bc583f2dee17835e2a04a"} Feb 02 11:29:49 crc kubenswrapper[4901]: I0202 11:29:49.063775 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n6dwk" podStartSLOduration=2.616265284 podStartE2EDuration="5.063745653s" podCreationTimestamp="2026-02-02 11:29:44 +0000 UTC" firstStartedPulling="2026-02-02 11:29:45.986281811 +0000 UTC m=+3073.004621907" lastFinishedPulling="2026-02-02 11:29:48.43376219 +0000 UTC m=+3075.452102276" observedRunningTime="2026-02-02 11:29:49.058058366 +0000 UTC m=+3076.076398462" watchObservedRunningTime="2026-02-02 11:29:49.063745653 +0000 UTC m=+3076.082085749" Feb 02 11:29:50 crc kubenswrapper[4901]: I0202 11:29:50.047700 4901 generic.go:334] "Generic (PLEG): container finished" podID="5dffdac6-bc8b-4bb4-ac19-9492e68cc878" containerID="a23344013a7052ceeca85e488837ddbf2695c6d1c053751df7bce5343e982aac" exitCode=0 Feb 02 11:29:50 crc kubenswrapper[4901]: I0202 11:29:50.047768 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tctnh" event={"ID":"5dffdac6-bc8b-4bb4-ac19-9492e68cc878","Type":"ContainerDied","Data":"a23344013a7052ceeca85e488837ddbf2695c6d1c053751df7bce5343e982aac"} Feb 02 11:29:51 crc kubenswrapper[4901]: I0202 11:29:51.061427 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tctnh" event={"ID":"5dffdac6-bc8b-4bb4-ac19-9492e68cc878","Type":"ContainerStarted","Data":"9a8c8537c3ce4cae0d5ab4666474f2ed3d6e6e3c611f918957cd8550c930c6e6"} Feb 02 11:29:51 crc kubenswrapper[4901]: I0202 11:29:51.113422 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tctnh" podStartSLOduration=2.42238122 podStartE2EDuration="7.11339864s" 
podCreationTimestamp="2026-02-02 11:29:44 +0000 UTC" firstStartedPulling="2026-02-02 11:29:45.989660983 +0000 UTC m=+3073.008001069" lastFinishedPulling="2026-02-02 11:29:50.680678393 +0000 UTC m=+3077.699018489" observedRunningTime="2026-02-02 11:29:51.08817778 +0000 UTC m=+3078.106517896" watchObservedRunningTime="2026-02-02 11:29:51.11339864 +0000 UTC m=+3078.131738736" Feb 02 11:29:54 crc kubenswrapper[4901]: I0202 11:29:54.410900 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n6dwk" Feb 02 11:29:54 crc kubenswrapper[4901]: I0202 11:29:54.411242 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-n6dwk" Feb 02 11:29:54 crc kubenswrapper[4901]: I0202 11:29:54.468366 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n6dwk" Feb 02 11:29:54 crc kubenswrapper[4901]: I0202 11:29:54.595052 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tctnh" Feb 02 11:29:54 crc kubenswrapper[4901]: I0202 11:29:54.595746 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tctnh" Feb 02 11:29:55 crc kubenswrapper[4901]: I0202 11:29:55.156924 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-n6dwk" Feb 02 11:29:55 crc kubenswrapper[4901]: I0202 11:29:55.642824 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tctnh" podUID="5dffdac6-bc8b-4bb4-ac19-9492e68cc878" containerName="registry-server" probeResult="failure" output=< Feb 02 11:29:55 crc kubenswrapper[4901]: timeout: failed to connect service ":50051" within 1s Feb 02 11:29:55 crc kubenswrapper[4901]: > Feb 02 11:29:55 crc kubenswrapper[4901]: I0202 11:29:55.862787 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n6dwk"] Feb 02 11:29:57 crc kubenswrapper[4901]: I0202 11:29:57.117423 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-n6dwk" podUID="370f499b-b683-43a8-8ee1-668d04b02cba" containerName="registry-server" containerID="cri-o://9f896963c9225bc94cbd6f010666d21ee786cac9cf9bc583f2dee17835e2a04a" gracePeriod=2 Feb 02 11:29:57 crc kubenswrapper[4901]: I0202 11:29:57.563193 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n6dwk" Feb 02 11:29:57 crc kubenswrapper[4901]: I0202 11:29:57.704121 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/370f499b-b683-43a8-8ee1-668d04b02cba-utilities\") pod \"370f499b-b683-43a8-8ee1-668d04b02cba\" (UID: \"370f499b-b683-43a8-8ee1-668d04b02cba\") " Feb 02 11:29:57 crc kubenswrapper[4901]: I0202 11:29:57.704334 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/370f499b-b683-43a8-8ee1-668d04b02cba-catalog-content\") pod \"370f499b-b683-43a8-8ee1-668d04b02cba\" (UID: \"370f499b-b683-43a8-8ee1-668d04b02cba\") " Feb 02 11:29:57 crc kubenswrapper[4901]: I0202 11:29:57.704401 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdx8b\" (UniqueName: \"kubernetes.io/projected/370f499b-b683-43a8-8ee1-668d04b02cba-kube-api-access-qdx8b\") pod \"370f499b-b683-43a8-8ee1-668d04b02cba\" (UID: \"370f499b-b683-43a8-8ee1-668d04b02cba\") " Feb 02 11:29:57 crc kubenswrapper[4901]: I0202 11:29:57.705543 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/370f499b-b683-43a8-8ee1-668d04b02cba-utilities" (OuterVolumeSpecName: "utilities") pod "370f499b-b683-43a8-8ee1-668d04b02cba" (UID: "370f499b-b683-43a8-8ee1-668d04b02cba"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:29:57 crc kubenswrapper[4901]: I0202 11:29:57.712244 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/370f499b-b683-43a8-8ee1-668d04b02cba-kube-api-access-qdx8b" (OuterVolumeSpecName: "kube-api-access-qdx8b") pod "370f499b-b683-43a8-8ee1-668d04b02cba" (UID: "370f499b-b683-43a8-8ee1-668d04b02cba"). InnerVolumeSpecName "kube-api-access-qdx8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:29:57 crc kubenswrapper[4901]: I0202 11:29:57.768622 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/370f499b-b683-43a8-8ee1-668d04b02cba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "370f499b-b683-43a8-8ee1-668d04b02cba" (UID: "370f499b-b683-43a8-8ee1-668d04b02cba"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:29:57 crc kubenswrapper[4901]: I0202 11:29:57.808993 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/370f499b-b683-43a8-8ee1-668d04b02cba-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:29:57 crc kubenswrapper[4901]: I0202 11:29:57.809259 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/370f499b-b683-43a8-8ee1-668d04b02cba-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:29:57 crc kubenswrapper[4901]: I0202 11:29:57.809273 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdx8b\" (UniqueName: \"kubernetes.io/projected/370f499b-b683-43a8-8ee1-668d04b02cba-kube-api-access-qdx8b\") on node \"crc\" DevicePath \"\"" Feb 02 11:29:58 crc kubenswrapper[4901]: I0202 11:29:58.128404 4901 generic.go:334] "Generic (PLEG): container finished" podID="370f499b-b683-43a8-8ee1-668d04b02cba" containerID="9f896963c9225bc94cbd6f010666d21ee786cac9cf9bc583f2dee17835e2a04a" exitCode=0 Feb 02 11:29:58 crc kubenswrapper[4901]: I0202 11:29:58.128464 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n6dwk" event={"ID":"370f499b-b683-43a8-8ee1-668d04b02cba","Type":"ContainerDied","Data":"9f896963c9225bc94cbd6f010666d21ee786cac9cf9bc583f2dee17835e2a04a"} Feb 02 11:29:58 crc kubenswrapper[4901]: I0202 11:29:58.128498 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n6dwk" event={"ID":"370f499b-b683-43a8-8ee1-668d04b02cba","Type":"ContainerDied","Data":"0173b113c42b1ede167cbd823f6496da79f39d82a37d7d507794907e3eb861ff"} Feb 02 11:29:58 crc kubenswrapper[4901]: I0202 11:29:58.128521 4901 scope.go:117] "RemoveContainer" containerID="9f896963c9225bc94cbd6f010666d21ee786cac9cf9bc583f2dee17835e2a04a" Feb 02 11:29:58 crc kubenswrapper[4901]: I0202 11:29:58.128528 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n6dwk" Feb 02 11:29:58 crc kubenswrapper[4901]: I0202 11:29:58.150117 4901 scope.go:117] "RemoveContainer" containerID="77e402633432c32eb24b3fd25731da7a040637732dfe9238bd6b1faf798523a9" Feb 02 11:29:58 crc kubenswrapper[4901]: I0202 11:29:58.172032 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n6dwk"] Feb 02 11:29:58 crc kubenswrapper[4901]: I0202 11:29:58.181044 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-n6dwk"] Feb 02 11:29:58 crc kubenswrapper[4901]: I0202 11:29:58.204340 4901 scope.go:117] "RemoveContainer" containerID="b4292a2d1c6c1d7b89322f7c8a3ca0ea942a355bf4029d59642b27e145cd5ec7" Feb 02 11:29:58 crc kubenswrapper[4901]: I0202 11:29:58.243605 4901 scope.go:117] "RemoveContainer" containerID="9f896963c9225bc94cbd6f010666d21ee786cac9cf9bc583f2dee17835e2a04a" Feb 02 11:29:58 crc kubenswrapper[4901]: E0202 11:29:58.244406 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f896963c9225bc94cbd6f010666d21ee786cac9cf9bc583f2dee17835e2a04a\": container with ID starting with 9f896963c9225bc94cbd6f010666d21ee786cac9cf9bc583f2dee17835e2a04a not found: ID does not exist" containerID="9f896963c9225bc94cbd6f010666d21ee786cac9cf9bc583f2dee17835e2a04a" Feb 02 11:29:58 crc kubenswrapper[4901]: I0202 11:29:58.244478 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f896963c9225bc94cbd6f010666d21ee786cac9cf9bc583f2dee17835e2a04a"} err="failed to get container status \"9f896963c9225bc94cbd6f010666d21ee786cac9cf9bc583f2dee17835e2a04a\": rpc error: code = NotFound desc = could not find container \"9f896963c9225bc94cbd6f010666d21ee786cac9cf9bc583f2dee17835e2a04a\": container with ID starting with 9f896963c9225bc94cbd6f010666d21ee786cac9cf9bc583f2dee17835e2a04a not found: ID does not exist" Feb 02 11:29:58 crc kubenswrapper[4901]: I0202 11:29:58.244524 4901 scope.go:117] "RemoveContainer" containerID="77e402633432c32eb24b3fd25731da7a040637732dfe9238bd6b1faf798523a9" Feb 02 11:29:58 crc kubenswrapper[4901]: E0202 11:29:58.245331 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77e402633432c32eb24b3fd25731da7a040637732dfe9238bd6b1faf798523a9\": container with ID starting with 77e402633432c32eb24b3fd25731da7a040637732dfe9238bd6b1faf798523a9 not found: ID does not exist" containerID="77e402633432c32eb24b3fd25731da7a040637732dfe9238bd6b1faf798523a9" Feb 02 11:29:58 crc kubenswrapper[4901]: I0202 11:29:58.245395 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77e402633432c32eb24b3fd25731da7a040637732dfe9238bd6b1faf798523a9"} err="failed to get container status \"77e402633432c32eb24b3fd25731da7a040637732dfe9238bd6b1faf798523a9\": rpc error: code = NotFound desc = could not find container \"77e402633432c32eb24b3fd25731da7a040637732dfe9238bd6b1faf798523a9\": container with ID starting with 77e402633432c32eb24b3fd25731da7a040637732dfe9238bd6b1faf798523a9 not found: ID does not exist" Feb 02 11:29:58 crc kubenswrapper[4901]: I0202 11:29:58.245442 4901 scope.go:117] "RemoveContainer" containerID="b4292a2d1c6c1d7b89322f7c8a3ca0ea942a355bf4029d59642b27e145cd5ec7" Feb 02 11:29:58 crc kubenswrapper[4901]: E0202 11:29:58.245954 4901 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"b4292a2d1c6c1d7b89322f7c8a3ca0ea942a355bf4029d59642b27e145cd5ec7\": container with ID starting with b4292a2d1c6c1d7b89322f7c8a3ca0ea942a355bf4029d59642b27e145cd5ec7 not found: ID does not exist" containerID="b4292a2d1c6c1d7b89322f7c8a3ca0ea942a355bf4029d59642b27e145cd5ec7" Feb 02 11:29:58 crc kubenswrapper[4901]: I0202 11:29:58.245995 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4292a2d1c6c1d7b89322f7c8a3ca0ea942a355bf4029d59642b27e145cd5ec7"} err="failed to get container status \"b4292a2d1c6c1d7b89322f7c8a3ca0ea942a355bf4029d59642b27e145cd5ec7\": rpc error: code = NotFound desc = could not find container \"b4292a2d1c6c1d7b89322f7c8a3ca0ea942a355bf4029d59642b27e145cd5ec7\": container with ID starting with b4292a2d1c6c1d7b89322f7c8a3ca0ea942a355bf4029d59642b27e145cd5ec7 not found: ID does not exist" Feb 02 11:29:59 crc kubenswrapper[4901]: I0202 11:29:59.686836 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="370f499b-b683-43a8-8ee1-668d04b02cba" path="/var/lib/kubelet/pods/370f499b-b683-43a8-8ee1-668d04b02cba/volumes" Feb 02 11:30:00 crc kubenswrapper[4901]: I0202 11:30:00.171781 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500530-bhtst"] Feb 02 11:30:00 crc kubenswrapper[4901]: E0202 11:30:00.172254 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="370f499b-b683-43a8-8ee1-668d04b02cba" containerName="extract-content" Feb 02 11:30:00 crc kubenswrapper[4901]: I0202 11:30:00.172280 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="370f499b-b683-43a8-8ee1-668d04b02cba" containerName="extract-content" Feb 02 11:30:00 crc kubenswrapper[4901]: E0202 11:30:00.172305 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="370f499b-b683-43a8-8ee1-668d04b02cba" containerName="extract-utilities" Feb 02 11:30:00 crc kubenswrapper[4901]: I0202 11:30:00.172312 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="370f499b-b683-43a8-8ee1-668d04b02cba" containerName="extract-utilities" Feb 02 11:30:00 crc kubenswrapper[4901]: E0202 11:30:00.172332 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="370f499b-b683-43a8-8ee1-668d04b02cba" containerName="registry-server" Feb 02 11:30:00 crc kubenswrapper[4901]: I0202 11:30:00.172338 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="370f499b-b683-43a8-8ee1-668d04b02cba" containerName="registry-server" Feb 02 11:30:00 crc kubenswrapper[4901]: I0202 11:30:00.172526 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="370f499b-b683-43a8-8ee1-668d04b02cba" containerName="registry-server" Feb 02 11:30:00 crc kubenswrapper[4901]: I0202 11:30:00.173384 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-bhtst" Feb 02 11:30:00 crc kubenswrapper[4901]: I0202 11:30:00.176774 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 02 11:30:00 crc kubenswrapper[4901]: I0202 11:30:00.176961 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 02 11:30:00 crc kubenswrapper[4901]: I0202 11:30:00.192305 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500530-bhtst"] Feb 02 11:30:00 crc kubenswrapper[4901]: I0202 11:30:00.263556 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6d516d48-507b-419d-943c-0b0b66b7a461-config-volume\") pod \"collect-profiles-29500530-bhtst\" (UID: \"6d516d48-507b-419d-943c-0b0b66b7a461\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-bhtst" Feb 02 11:30:00 crc kubenswrapper[4901]: I0202 11:30:00.263993 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78b9r\" (UniqueName: \"kubernetes.io/projected/6d516d48-507b-419d-943c-0b0b66b7a461-kube-api-access-78b9r\") pod \"collect-profiles-29500530-bhtst\" (UID: \"6d516d48-507b-419d-943c-0b0b66b7a461\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-bhtst" Feb 02 11:30:00 crc kubenswrapper[4901]: I0202 11:30:00.264097 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6d516d48-507b-419d-943c-0b0b66b7a461-secret-volume\") pod \"collect-profiles-29500530-bhtst\" (UID: \"6d516d48-507b-419d-943c-0b0b66b7a461\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-bhtst" Feb 02 11:30:00 crc kubenswrapper[4901]: I0202 11:30:00.335412 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-84dbcd4d6-strlk_9c2b3b8f-1088-4696-899e-d0d5c3f2bf2d/manager/0.log" Feb 02 11:30:00 crc kubenswrapper[4901]: I0202 11:30:00.367143 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6d516d48-507b-419d-943c-0b0b66b7a461-config-volume\") pod \"collect-profiles-29500530-bhtst\" (UID: \"6d516d48-507b-419d-943c-0b0b66b7a461\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-bhtst" Feb 02 11:30:00 crc kubenswrapper[4901]: I0202 11:30:00.367297 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78b9r\" (UniqueName: \"kubernetes.io/projected/6d516d48-507b-419d-943c-0b0b66b7a461-kube-api-access-78b9r\") pod \"collect-profiles-29500530-bhtst\" (UID: \"6d516d48-507b-419d-943c-0b0b66b7a461\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-bhtst" Feb 02 11:30:00 crc kubenswrapper[4901]: I0202 11:30:00.367348 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6d516d48-507b-419d-943c-0b0b66b7a461-secret-volume\") pod \"collect-profiles-29500530-bhtst\" (UID: \"6d516d48-507b-419d-943c-0b0b66b7a461\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-bhtst" 
Feb 02 11:30:00 crc kubenswrapper[4901]: I0202 11:30:00.368253 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6d516d48-507b-419d-943c-0b0b66b7a461-config-volume\") pod \"collect-profiles-29500530-bhtst\" (UID: \"6d516d48-507b-419d-943c-0b0b66b7a461\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-bhtst" Feb 02 11:30:00 crc kubenswrapper[4901]: I0202 11:30:00.377783 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6d516d48-507b-419d-943c-0b0b66b7a461-secret-volume\") pod \"collect-profiles-29500530-bhtst\" (UID: \"6d516d48-507b-419d-943c-0b0b66b7a461\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-bhtst" Feb 02 11:30:00 crc kubenswrapper[4901]: I0202 11:30:00.392654 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78b9r\" (UniqueName: \"kubernetes.io/projected/6d516d48-507b-419d-943c-0b0b66b7a461-kube-api-access-78b9r\") pod \"collect-profiles-29500530-bhtst\" (UID: \"6d516d48-507b-419d-943c-0b0b66b7a461\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-bhtst" Feb 02 11:30:00 crc kubenswrapper[4901]: I0202 11:30:00.501329 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-bhtst" Feb 02 11:30:00 crc kubenswrapper[4901]: I0202 11:30:00.967261 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500530-bhtst"] Feb 02 11:30:01 crc kubenswrapper[4901]: I0202 11:30:01.169095 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-bhtst" event={"ID":"6d516d48-507b-419d-943c-0b0b66b7a461","Type":"ContainerStarted","Data":"f70225bf725c1e4750c28cb4e3dffe852e7d11bd2f348bf32359f4aba1f3443c"} Feb 02 11:30:01 crc kubenswrapper[4901]: I0202 11:30:01.169624 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-bhtst" event={"ID":"6d516d48-507b-419d-943c-0b0b66b7a461","Type":"ContainerStarted","Data":"35a312030b391246ca033695bb4d0f55ee3812397dcde87b8a614ab03383f983"} Feb 02 11:30:01 crc kubenswrapper[4901]: I0202 11:30:01.194978 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-bhtst" podStartSLOduration=1.194948876 podStartE2EDuration="1.194948876s" podCreationTimestamp="2026-02-02 11:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 11:30:01.183173861 +0000 UTC m=+3088.201513957" watchObservedRunningTime="2026-02-02 11:30:01.194948876 +0000 UTC m=+3088.213288972" Feb 02 11:30:02 crc kubenswrapper[4901]: I0202 11:30:02.180244 4901 generic.go:334] "Generic (PLEG): container finished" podID="6d516d48-507b-419d-943c-0b0b66b7a461" containerID="f70225bf725c1e4750c28cb4e3dffe852e7d11bd2f348bf32359f4aba1f3443c" exitCode=0 Feb 02 11:30:02 crc kubenswrapper[4901]: I0202 11:30:02.180332 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-bhtst" event={"ID":"6d516d48-507b-419d-943c-0b0b66b7a461","Type":"ContainerDied","Data":"f70225bf725c1e4750c28cb4e3dffe852e7d11bd2f348bf32359f4aba1f3443c"} Feb 02 
11:30:03 crc kubenswrapper[4901]: I0202 11:30:03.541855 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-bhtst" Feb 02 11:30:03 crc kubenswrapper[4901]: I0202 11:30:03.636794 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6d516d48-507b-419d-943c-0b0b66b7a461-secret-volume\") pod \"6d516d48-507b-419d-943c-0b0b66b7a461\" (UID: \"6d516d48-507b-419d-943c-0b0b66b7a461\") " Feb 02 11:30:03 crc kubenswrapper[4901]: I0202 11:30:03.636864 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6d516d48-507b-419d-943c-0b0b66b7a461-config-volume\") pod \"6d516d48-507b-419d-943c-0b0b66b7a461\" (UID: \"6d516d48-507b-419d-943c-0b0b66b7a461\") " Feb 02 11:30:03 crc kubenswrapper[4901]: I0202 11:30:03.636910 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78b9r\" (UniqueName: \"kubernetes.io/projected/6d516d48-507b-419d-943c-0b0b66b7a461-kube-api-access-78b9r\") pod \"6d516d48-507b-419d-943c-0b0b66b7a461\" (UID: \"6d516d48-507b-419d-943c-0b0b66b7a461\") " Feb 02 11:30:03 crc kubenswrapper[4901]: I0202 11:30:03.637942 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d516d48-507b-419d-943c-0b0b66b7a461-config-volume" (OuterVolumeSpecName: "config-volume") pod "6d516d48-507b-419d-943c-0b0b66b7a461" (UID: "6d516d48-507b-419d-943c-0b0b66b7a461"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:30:03 crc kubenswrapper[4901]: I0202 11:30:03.638668 4901 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6d516d48-507b-419d-943c-0b0b66b7a461-config-volume\") on node \"crc\" DevicePath \"\"" Feb 02 11:30:03 crc kubenswrapper[4901]: I0202 11:30:03.643879 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d516d48-507b-419d-943c-0b0b66b7a461-kube-api-access-78b9r" (OuterVolumeSpecName: "kube-api-access-78b9r") pod "6d516d48-507b-419d-943c-0b0b66b7a461" (UID: "6d516d48-507b-419d-943c-0b0b66b7a461"). InnerVolumeSpecName "kube-api-access-78b9r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:30:03 crc kubenswrapper[4901]: I0202 11:30:03.644787 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d516d48-507b-419d-943c-0b0b66b7a461-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6d516d48-507b-419d-943c-0b0b66b7a461" (UID: "6d516d48-507b-419d-943c-0b0b66b7a461"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:30:03 crc kubenswrapper[4901]: I0202 11:30:03.740970 4901 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6d516d48-507b-419d-943c-0b0b66b7a461-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 02 11:30:03 crc kubenswrapper[4901]: I0202 11:30:03.741222 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78b9r\" (UniqueName: \"kubernetes.io/projected/6d516d48-507b-419d-943c-0b0b66b7a461-kube-api-access-78b9r\") on node \"crc\" DevicePath \"\"" Feb 02 11:30:04 crc kubenswrapper[4901]: I0202 11:30:04.201030 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-bhtst" event={"ID":"6d516d48-507b-419d-943c-0b0b66b7a461","Type":"ContainerDied","Data":"35a312030b391246ca033695bb4d0f55ee3812397dcde87b8a614ab03383f983"} Feb 02 11:30:04 crc kubenswrapper[4901]: I0202 11:30:04.201661 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35a312030b391246ca033695bb4d0f55ee3812397dcde87b8a614ab03383f983" Feb 02 11:30:04 crc kubenswrapper[4901]: I0202 11:30:04.201304 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-bhtst" Feb 02 11:30:04 crc kubenswrapper[4901]: I0202 11:30:04.266876 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500485-4vf9s"] Feb 02 11:30:04 crc kubenswrapper[4901]: I0202 11:30:04.275960 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500485-4vf9s"] Feb 02 11:30:04 crc kubenswrapper[4901]: I0202 11:30:04.655476 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tctnh" Feb 02 11:30:04 crc kubenswrapper[4901]: I0202 11:30:04.723153 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tctnh" Feb 02 11:30:04 crc kubenswrapper[4901]: I0202 11:30:04.906285 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tctnh"] Feb 02 11:30:05 crc kubenswrapper[4901]: I0202 11:30:05.691319 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36a3c890-d6dc-4c8f-bcca-e0cc0461ea18" path="/var/lib/kubelet/pods/36a3c890-d6dc-4c8f-bcca-e0cc0461ea18/volumes" Feb 02 11:30:06 crc kubenswrapper[4901]: I0202 11:30:06.224343 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tctnh" podUID="5dffdac6-bc8b-4bb4-ac19-9492e68cc878" containerName="registry-server" containerID="cri-o://9a8c8537c3ce4cae0d5ab4666474f2ed3d6e6e3c611f918957cd8550c930c6e6" gracePeriod=2 Feb 02 11:30:06 crc kubenswrapper[4901]: I0202 11:30:06.709888 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tctnh" Feb 02 11:30:06 crc kubenswrapper[4901]: I0202 11:30:06.805949 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dffdac6-bc8b-4bb4-ac19-9492e68cc878-utilities\") pod \"5dffdac6-bc8b-4bb4-ac19-9492e68cc878\" (UID: \"5dffdac6-bc8b-4bb4-ac19-9492e68cc878\") " Feb 02 11:30:06 crc kubenswrapper[4901]: I0202 11:30:06.806818 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dffdac6-bc8b-4bb4-ac19-9492e68cc878-catalog-content\") pod \"5dffdac6-bc8b-4bb4-ac19-9492e68cc878\" (UID: \"5dffdac6-bc8b-4bb4-ac19-9492e68cc878\") " Feb 02 11:30:06 crc kubenswrapper[4901]: I0202 11:30:06.806854 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dffdac6-bc8b-4bb4-ac19-9492e68cc878-utilities" (OuterVolumeSpecName: "utilities") pod "5dffdac6-bc8b-4bb4-ac19-9492e68cc878" (UID: "5dffdac6-bc8b-4bb4-ac19-9492e68cc878"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:30:06 crc kubenswrapper[4901]: I0202 11:30:06.807155 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7n2w\" (UniqueName: \"kubernetes.io/projected/5dffdac6-bc8b-4bb4-ac19-9492e68cc878-kube-api-access-c7n2w\") pod \"5dffdac6-bc8b-4bb4-ac19-9492e68cc878\" (UID: \"5dffdac6-bc8b-4bb4-ac19-9492e68cc878\") " Feb 02 11:30:06 crc kubenswrapper[4901]: I0202 11:30:06.808381 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dffdac6-bc8b-4bb4-ac19-9492e68cc878-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:30:06 crc kubenswrapper[4901]: I0202 11:30:06.815114 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dffdac6-bc8b-4bb4-ac19-9492e68cc878-kube-api-access-c7n2w" (OuterVolumeSpecName: "kube-api-access-c7n2w") pod "5dffdac6-bc8b-4bb4-ac19-9492e68cc878" (UID: "5dffdac6-bc8b-4bb4-ac19-9492e68cc878"). InnerVolumeSpecName "kube-api-access-c7n2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:30:06 crc kubenswrapper[4901]: I0202 11:30:06.910664 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7n2w\" (UniqueName: \"kubernetes.io/projected/5dffdac6-bc8b-4bb4-ac19-9492e68cc878-kube-api-access-c7n2w\") on node \"crc\" DevicePath \"\"" Feb 02 11:30:06 crc kubenswrapper[4901]: I0202 11:30:06.964001 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dffdac6-bc8b-4bb4-ac19-9492e68cc878-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5dffdac6-bc8b-4bb4-ac19-9492e68cc878" (UID: "5dffdac6-bc8b-4bb4-ac19-9492e68cc878"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:30:07 crc kubenswrapper[4901]: I0202 11:30:07.013050 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dffdac6-bc8b-4bb4-ac19-9492e68cc878-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:30:07 crc kubenswrapper[4901]: I0202 11:30:07.236075 4901 generic.go:334] "Generic (PLEG): container finished" podID="5dffdac6-bc8b-4bb4-ac19-9492e68cc878" containerID="9a8c8537c3ce4cae0d5ab4666474f2ed3d6e6e3c611f918957cd8550c930c6e6" exitCode=0 Feb 02 11:30:07 crc kubenswrapper[4901]: I0202 11:30:07.236134 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tctnh" event={"ID":"5dffdac6-bc8b-4bb4-ac19-9492e68cc878","Type":"ContainerDied","Data":"9a8c8537c3ce4cae0d5ab4666474f2ed3d6e6e3c611f918957cd8550c930c6e6"} Feb 02 11:30:07 crc kubenswrapper[4901]: I0202 11:30:07.236178 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tctnh" event={"ID":"5dffdac6-bc8b-4bb4-ac19-9492e68cc878","Type":"ContainerDied","Data":"de58d7a74a7df906cdaddfa0d87c27bf717f13b13e2e2b37ec1f18a18ffdc775"} Feb 02 11:30:07 crc kubenswrapper[4901]: I0202 11:30:07.236215 4901 scope.go:117] "RemoveContainer" containerID="9a8c8537c3ce4cae0d5ab4666474f2ed3d6e6e3c611f918957cd8550c930c6e6" Feb 02 11:30:07 crc kubenswrapper[4901]: I0202 11:30:07.236277 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tctnh" Feb 02 11:30:07 crc kubenswrapper[4901]: I0202 11:30:07.268864 4901 scope.go:117] "RemoveContainer" containerID="a23344013a7052ceeca85e488837ddbf2695c6d1c053751df7bce5343e982aac" Feb 02 11:30:07 crc kubenswrapper[4901]: I0202 11:30:07.299581 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tctnh"] Feb 02 11:30:07 crc kubenswrapper[4901]: I0202 11:30:07.307071 4901 scope.go:117] "RemoveContainer" containerID="93320bb141f0af928365475d02fa4002af08f052cc316f7f0e0e6781df5b3298" Feb 02 11:30:07 crc kubenswrapper[4901]: I0202 11:30:07.311346 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tctnh"] Feb 02 11:30:07 crc kubenswrapper[4901]: I0202 11:30:07.358507 4901 scope.go:117] "RemoveContainer" containerID="9a8c8537c3ce4cae0d5ab4666474f2ed3d6e6e3c611f918957cd8550c930c6e6" Feb 02 11:30:07 crc kubenswrapper[4901]: E0202 11:30:07.359437 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a8c8537c3ce4cae0d5ab4666474f2ed3d6e6e3c611f918957cd8550c930c6e6\": container with ID starting with 9a8c8537c3ce4cae0d5ab4666474f2ed3d6e6e3c611f918957cd8550c930c6e6 not found: ID does not exist" containerID="9a8c8537c3ce4cae0d5ab4666474f2ed3d6e6e3c611f918957cd8550c930c6e6" Feb 02 11:30:07 crc kubenswrapper[4901]: I0202 11:30:07.359473 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a8c8537c3ce4cae0d5ab4666474f2ed3d6e6e3c611f918957cd8550c930c6e6"} err="failed to get container status \"9a8c8537c3ce4cae0d5ab4666474f2ed3d6e6e3c611f918957cd8550c930c6e6\": rpc error: code = NotFound desc = could not find container \"9a8c8537c3ce4cae0d5ab4666474f2ed3d6e6e3c611f918957cd8550c930c6e6\": container with ID starting with 9a8c8537c3ce4cae0d5ab4666474f2ed3d6e6e3c611f918957cd8550c930c6e6 not found: ID does not exist" Feb 02 11:30:07 crc 
kubenswrapper[4901]: I0202 11:30:07.359495 4901 scope.go:117] "RemoveContainer" containerID="a23344013a7052ceeca85e488837ddbf2695c6d1c053751df7bce5343e982aac" Feb 02 11:30:07 crc kubenswrapper[4901]: E0202 11:30:07.359752 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a23344013a7052ceeca85e488837ddbf2695c6d1c053751df7bce5343e982aac\": container with ID starting with a23344013a7052ceeca85e488837ddbf2695c6d1c053751df7bce5343e982aac not found: ID does not exist" containerID="a23344013a7052ceeca85e488837ddbf2695c6d1c053751df7bce5343e982aac" Feb 02 11:30:07 crc kubenswrapper[4901]: I0202 11:30:07.359791 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a23344013a7052ceeca85e488837ddbf2695c6d1c053751df7bce5343e982aac"} err="failed to get container status \"a23344013a7052ceeca85e488837ddbf2695c6d1c053751df7bce5343e982aac\": rpc error: code = NotFound desc = could not find container \"a23344013a7052ceeca85e488837ddbf2695c6d1c053751df7bce5343e982aac\": container with ID starting with a23344013a7052ceeca85e488837ddbf2695c6d1c053751df7bce5343e982aac not found: ID does not exist" Feb 02 11:30:07 crc kubenswrapper[4901]: I0202 11:30:07.359811 4901 scope.go:117] "RemoveContainer" containerID="93320bb141f0af928365475d02fa4002af08f052cc316f7f0e0e6781df5b3298" Feb 02 11:30:07 crc kubenswrapper[4901]: E0202 11:30:07.360756 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93320bb141f0af928365475d02fa4002af08f052cc316f7f0e0e6781df5b3298\": container with ID starting with 93320bb141f0af928365475d02fa4002af08f052cc316f7f0e0e6781df5b3298 not found: ID does not exist" containerID="93320bb141f0af928365475d02fa4002af08f052cc316f7f0e0e6781df5b3298" Feb 02 11:30:07 crc kubenswrapper[4901]: I0202 11:30:07.360803 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93320bb141f0af928365475d02fa4002af08f052cc316f7f0e0e6781df5b3298"} err="failed to get container status \"93320bb141f0af928365475d02fa4002af08f052cc316f7f0e0e6781df5b3298\": rpc error: code = NotFound desc = could not find container \"93320bb141f0af928365475d02fa4002af08f052cc316f7f0e0e6781df5b3298\": container with ID starting with 93320bb141f0af928365475d02fa4002af08f052cc316f7f0e0e6781df5b3298 not found: ID does not exist" Feb 02 11:30:07 crc kubenswrapper[4901]: I0202 11:30:07.692413 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dffdac6-bc8b-4bb4-ac19-9492e68cc878" path="/var/lib/kubelet/pods/5dffdac6-bc8b-4bb4-ac19-9492e68cc878/volumes" Feb 02 11:30:07 crc kubenswrapper[4901]: I0202 11:30:07.837268 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:30:07 crc kubenswrapper[4901]: I0202 11:30:07.837374 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:30:14 crc kubenswrapper[4901]: I0202 11:30:14.159676 4901 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rbjw8"] Feb 02 11:30:14 crc kubenswrapper[4901]: E0202 11:30:14.164412 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d516d48-507b-419d-943c-0b0b66b7a461" containerName="collect-profiles" Feb 02 11:30:14 crc kubenswrapper[4901]: I0202 11:30:14.164460 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d516d48-507b-419d-943c-0b0b66b7a461" containerName="collect-profiles" Feb 02 11:30:14 crc kubenswrapper[4901]: E0202 11:30:14.164485 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dffdac6-bc8b-4bb4-ac19-9492e68cc878" containerName="registry-server" Feb 02 11:30:14 crc kubenswrapper[4901]: I0202 11:30:14.164498 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dffdac6-bc8b-4bb4-ac19-9492e68cc878" containerName="registry-server" Feb 02 11:30:14 crc kubenswrapper[4901]: E0202 11:30:14.164524 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dffdac6-bc8b-4bb4-ac19-9492e68cc878" containerName="extract-utilities" Feb 02 11:30:14 crc kubenswrapper[4901]: I0202 11:30:14.164532 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dffdac6-bc8b-4bb4-ac19-9492e68cc878" containerName="extract-utilities" Feb 02 11:30:14 crc kubenswrapper[4901]: E0202 11:30:14.164553 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dffdac6-bc8b-4bb4-ac19-9492e68cc878" containerName="extract-content" Feb 02 11:30:14 crc kubenswrapper[4901]: I0202 11:30:14.164574 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dffdac6-bc8b-4bb4-ac19-9492e68cc878" containerName="extract-content" Feb 02 11:30:14 crc kubenswrapper[4901]: I0202 11:30:14.164816 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d516d48-507b-419d-943c-0b0b66b7a461" containerName="collect-profiles" Feb 02 11:30:14 crc kubenswrapper[4901]: I0202 11:30:14.164834 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dffdac6-bc8b-4bb4-ac19-9492e68cc878" containerName="registry-server" Feb 02 11:30:14 crc kubenswrapper[4901]: I0202 11:30:14.166651 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rbjw8" Feb 02 11:30:14 crc kubenswrapper[4901]: I0202 11:30:14.169325 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 02 11:30:14 crc kubenswrapper[4901]: I0202 11:30:14.173488 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rbjw8"] Feb 02 11:30:14 crc kubenswrapper[4901]: I0202 11:30:14.269032 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/01f21b27-c12d-47f6-bf3c-9976e9ef968e-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rbjw8\" (UID: \"01f21b27-c12d-47f6-bf3c-9976e9ef968e\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rbjw8" Feb 02 11:30:14 crc kubenswrapper[4901]: I0202 11:30:14.269141 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxcxh\" (UniqueName: \"kubernetes.io/projected/01f21b27-c12d-47f6-bf3c-9976e9ef968e-kube-api-access-hxcxh\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rbjw8\" (UID: \"01f21b27-c12d-47f6-bf3c-9976e9ef968e\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rbjw8" Feb 02 11:30:14 crc kubenswrapper[4901]: I0202 11:30:14.269196 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/01f21b27-c12d-47f6-bf3c-9976e9ef968e-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rbjw8\" (UID: \"01f21b27-c12d-47f6-bf3c-9976e9ef968e\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rbjw8" Feb 02 11:30:14 crc kubenswrapper[4901]: I0202 11:30:14.371739 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxcxh\" (UniqueName: \"kubernetes.io/projected/01f21b27-c12d-47f6-bf3c-9976e9ef968e-kube-api-access-hxcxh\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rbjw8\" (UID: \"01f21b27-c12d-47f6-bf3c-9976e9ef968e\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rbjw8" Feb 02 11:30:14 crc kubenswrapper[4901]: I0202 11:30:14.371890 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/01f21b27-c12d-47f6-bf3c-9976e9ef968e-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rbjw8\" (UID: \"01f21b27-c12d-47f6-bf3c-9976e9ef968e\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rbjw8" Feb 02 11:30:14 crc kubenswrapper[4901]: I0202 11:30:14.372045 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/01f21b27-c12d-47f6-bf3c-9976e9ef968e-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rbjw8\" (UID: \"01f21b27-c12d-47f6-bf3c-9976e9ef968e\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rbjw8" Feb 02 11:30:14 crc kubenswrapper[4901]: I0202 11:30:14.372943 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/01f21b27-c12d-47f6-bf3c-9976e9ef968e-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rbjw8\" (UID: \"01f21b27-c12d-47f6-bf3c-9976e9ef968e\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rbjw8" Feb 02 11:30:14 crc kubenswrapper[4901]: I0202 11:30:14.372948 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/01f21b27-c12d-47f6-bf3c-9976e9ef968e-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rbjw8\" (UID: \"01f21b27-c12d-47f6-bf3c-9976e9ef968e\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rbjw8" Feb 02 11:30:14 crc kubenswrapper[4901]: I0202 11:30:14.396176 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxcxh\" (UniqueName: \"kubernetes.io/projected/01f21b27-c12d-47f6-bf3c-9976e9ef968e-kube-api-access-hxcxh\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rbjw8\" (UID: \"01f21b27-c12d-47f6-bf3c-9976e9ef968e\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rbjw8" Feb 02 11:30:14 crc kubenswrapper[4901]: I0202 11:30:14.501451 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rbjw8" Feb 02 11:30:15 crc kubenswrapper[4901]: I0202 11:30:15.015027 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rbjw8"] Feb 02 11:30:15 crc kubenswrapper[4901]: I0202 11:30:15.326708 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rbjw8" event={"ID":"01f21b27-c12d-47f6-bf3c-9976e9ef968e","Type":"ContainerStarted","Data":"6dc5a57134dd85ad9106b6260b394c1e9898b23dbcb2e7b3f5e95d0fa560d390"} Feb 02 11:30:15 crc kubenswrapper[4901]: I0202 11:30:15.327222 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rbjw8" event={"ID":"01f21b27-c12d-47f6-bf3c-9976e9ef968e","Type":"ContainerStarted","Data":"b8926b4cfd3564209e845566ca0c6e46e8ba78cfefedbdd045ba8efda663f7fa"} Feb 02 11:30:16 crc kubenswrapper[4901]: I0202 11:30:16.337684 4901 generic.go:334] "Generic (PLEG): container finished" podID="01f21b27-c12d-47f6-bf3c-9976e9ef968e" containerID="6dc5a57134dd85ad9106b6260b394c1e9898b23dbcb2e7b3f5e95d0fa560d390" exitCode=0 Feb 02 11:30:16 crc kubenswrapper[4901]: I0202 11:30:16.337761 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rbjw8" event={"ID":"01f21b27-c12d-47f6-bf3c-9976e9ef968e","Type":"ContainerDied","Data":"6dc5a57134dd85ad9106b6260b394c1e9898b23dbcb2e7b3f5e95d0fa560d390"} Feb 02 11:30:18 crc kubenswrapper[4901]: I0202 11:30:18.359876 4901 generic.go:334] "Generic (PLEG): container finished" podID="01f21b27-c12d-47f6-bf3c-9976e9ef968e" containerID="cf27b7356217d7efb5e4a861d8a2e28d670a0b9e83e6fb472d1276fab6426ef9" exitCode=0 Feb 02 11:30:18 crc kubenswrapper[4901]: I0202 11:30:18.359962 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rbjw8" 
event={"ID":"01f21b27-c12d-47f6-bf3c-9976e9ef968e","Type":"ContainerDied","Data":"cf27b7356217d7efb5e4a861d8a2e28d670a0b9e83e6fb472d1276fab6426ef9"} Feb 02 11:30:19 crc kubenswrapper[4901]: I0202 11:30:19.373379 4901 generic.go:334] "Generic (PLEG): container finished" podID="01f21b27-c12d-47f6-bf3c-9976e9ef968e" containerID="7e0610c4598f847c0135c30696c85a1fedfeca9af14ff9bdb0b980500eefecd0" exitCode=0 Feb 02 11:30:19 crc kubenswrapper[4901]: I0202 11:30:19.373527 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rbjw8" event={"ID":"01f21b27-c12d-47f6-bf3c-9976e9ef968e","Type":"ContainerDied","Data":"7e0610c4598f847c0135c30696c85a1fedfeca9af14ff9bdb0b980500eefecd0"} Feb 02 11:30:20 crc kubenswrapper[4901]: I0202 11:30:20.725531 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rbjw8" Feb 02 11:30:20 crc kubenswrapper[4901]: I0202 11:30:20.815806 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/01f21b27-c12d-47f6-bf3c-9976e9ef968e-util\") pod \"01f21b27-c12d-47f6-bf3c-9976e9ef968e\" (UID: \"01f21b27-c12d-47f6-bf3c-9976e9ef968e\") " Feb 02 11:30:20 crc kubenswrapper[4901]: I0202 11:30:20.816030 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxcxh\" (UniqueName: \"kubernetes.io/projected/01f21b27-c12d-47f6-bf3c-9976e9ef968e-kube-api-access-hxcxh\") pod \"01f21b27-c12d-47f6-bf3c-9976e9ef968e\" (UID: \"01f21b27-c12d-47f6-bf3c-9976e9ef968e\") " Feb 02 11:30:20 crc kubenswrapper[4901]: I0202 11:30:20.816083 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/01f21b27-c12d-47f6-bf3c-9976e9ef968e-bundle\") pod \"01f21b27-c12d-47f6-bf3c-9976e9ef968e\" (UID: \"01f21b27-c12d-47f6-bf3c-9976e9ef968e\") " Feb 02 11:30:20 crc kubenswrapper[4901]: I0202 11:30:20.819667 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01f21b27-c12d-47f6-bf3c-9976e9ef968e-bundle" (OuterVolumeSpecName: "bundle") pod "01f21b27-c12d-47f6-bf3c-9976e9ef968e" (UID: "01f21b27-c12d-47f6-bf3c-9976e9ef968e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:30:20 crc kubenswrapper[4901]: I0202 11:30:20.825642 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01f21b27-c12d-47f6-bf3c-9976e9ef968e-kube-api-access-hxcxh" (OuterVolumeSpecName: "kube-api-access-hxcxh") pod "01f21b27-c12d-47f6-bf3c-9976e9ef968e" (UID: "01f21b27-c12d-47f6-bf3c-9976e9ef968e"). InnerVolumeSpecName "kube-api-access-hxcxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:30:20 crc kubenswrapper[4901]: I0202 11:30:20.832370 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01f21b27-c12d-47f6-bf3c-9976e9ef968e-util" (OuterVolumeSpecName: "util") pod "01f21b27-c12d-47f6-bf3c-9976e9ef968e" (UID: "01f21b27-c12d-47f6-bf3c-9976e9ef968e"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:30:20 crc kubenswrapper[4901]: I0202 11:30:20.918882 4901 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/01f21b27-c12d-47f6-bf3c-9976e9ef968e-util\") on node \"crc\" DevicePath \"\"" Feb 02 11:30:20 crc kubenswrapper[4901]: I0202 11:30:20.918926 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxcxh\" (UniqueName: \"kubernetes.io/projected/01f21b27-c12d-47f6-bf3c-9976e9ef968e-kube-api-access-hxcxh\") on node \"crc\" DevicePath \"\"" Feb 02 11:30:20 crc kubenswrapper[4901]: I0202 11:30:20.918956 4901 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/01f21b27-c12d-47f6-bf3c-9976e9ef968e-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 11:30:21 crc kubenswrapper[4901]: I0202 11:30:21.396213 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rbjw8" event={"ID":"01f21b27-c12d-47f6-bf3c-9976e9ef968e","Type":"ContainerDied","Data":"b8926b4cfd3564209e845566ca0c6e46e8ba78cfefedbdd045ba8efda663f7fa"} Feb 02 11:30:21 crc kubenswrapper[4901]: I0202 11:30:21.396264 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8926b4cfd3564209e845566ca0c6e46e8ba78cfefedbdd045ba8efda663f7fa" Feb 02 11:30:21 crc kubenswrapper[4901]: I0202 11:30:21.396344 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rbjw8" Feb 02 11:30:31 crc kubenswrapper[4901]: I0202 11:30:31.764978 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-58vm8"] Feb 02 11:30:31 crc kubenswrapper[4901]: E0202 11:30:31.766157 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01f21b27-c12d-47f6-bf3c-9976e9ef968e" containerName="pull" Feb 02 11:30:31 crc kubenswrapper[4901]: I0202 11:30:31.766172 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="01f21b27-c12d-47f6-bf3c-9976e9ef968e" containerName="pull" Feb 02 11:30:31 crc kubenswrapper[4901]: E0202 11:30:31.766183 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01f21b27-c12d-47f6-bf3c-9976e9ef968e" containerName="extract" Feb 02 11:30:31 crc kubenswrapper[4901]: I0202 11:30:31.766188 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="01f21b27-c12d-47f6-bf3c-9976e9ef968e" containerName="extract" Feb 02 11:30:31 crc kubenswrapper[4901]: E0202 11:30:31.766210 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01f21b27-c12d-47f6-bf3c-9976e9ef968e" containerName="util" Feb 02 11:30:31 crc kubenswrapper[4901]: I0202 11:30:31.766217 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="01f21b27-c12d-47f6-bf3c-9976e9ef968e" containerName="util" Feb 02 11:30:31 crc kubenswrapper[4901]: I0202 11:30:31.766399 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="01f21b27-c12d-47f6-bf3c-9976e9ef968e" containerName="extract" Feb 02 11:30:31 crc kubenswrapper[4901]: I0202 11:30:31.771720 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-58vm8" Feb 02 11:30:31 crc kubenswrapper[4901]: I0202 11:30:31.777305 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-4jw4f" Feb 02 11:30:31 crc kubenswrapper[4901]: I0202 11:30:31.777337 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Feb 02 11:30:31 crc kubenswrapper[4901]: I0202 11:30:31.777640 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Feb 02 11:30:31 crc kubenswrapper[4901]: I0202 11:30:31.822670 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-58vm8"] Feb 02 11:30:31 crc kubenswrapper[4901]: I0202 11:30:31.891128 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q56qx\" (UniqueName: \"kubernetes.io/projected/624ec5ba-9a1f-4192-a537-c0cc6c8d5c24-kube-api-access-q56qx\") pod \"obo-prometheus-operator-68bc856cb9-58vm8\" (UID: \"624ec5ba-9a1f-4192-a537-c0cc6c8d5c24\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-58vm8" Feb 02 11:30:31 crc kubenswrapper[4901]: I0202 11:30:31.928232 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-57f656956-dt5d8"] Feb 02 11:30:31 crc kubenswrapper[4901]: I0202 11:30:31.930122 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57f656956-dt5d8" Feb 02 11:30:31 crc kubenswrapper[4901]: I0202 11:30:31.935378 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Feb 02 11:30:31 crc kubenswrapper[4901]: I0202 11:30:31.936922 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-lkffv" Feb 02 11:30:31 crc kubenswrapper[4901]: I0202 11:30:31.952631 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-57f656956-fhbnb"] Feb 02 11:30:31 crc kubenswrapper[4901]: I0202 11:30:31.953986 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57f656956-fhbnb" Feb 02 11:30:31 crc kubenswrapper[4901]: I0202 11:30:31.973021 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-57f656956-dt5d8"] Feb 02 11:30:31 crc kubenswrapper[4901]: I0202 11:30:31.993965 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/78d5a33d-6a61-4e38-8b5c-9a8bb8436628-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-57f656956-dt5d8\" (UID: \"78d5a33d-6a61-4e38-8b5c-9a8bb8436628\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57f656956-dt5d8" Feb 02 11:30:31 crc kubenswrapper[4901]: I0202 11:30:31.994062 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/650a29c1-9b38-4b85-9104-d56f42d0d2d9-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-57f656956-fhbnb\" (UID: \"650a29c1-9b38-4b85-9104-d56f42d0d2d9\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57f656956-fhbnb" Feb 02 11:30:31 crc kubenswrapper[4901]: I0202 11:30:31.994182 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/78d5a33d-6a61-4e38-8b5c-9a8bb8436628-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-57f656956-dt5d8\" (UID: \"78d5a33d-6a61-4e38-8b5c-9a8bb8436628\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57f656956-dt5d8" Feb 02 11:30:31 crc kubenswrapper[4901]: I0202 11:30:31.994232 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/650a29c1-9b38-4b85-9104-d56f42d0d2d9-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-57f656956-fhbnb\" (UID: \"650a29c1-9b38-4b85-9104-d56f42d0d2d9\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57f656956-fhbnb" Feb 02 11:30:31 crc kubenswrapper[4901]: I0202 11:30:31.994278 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q56qx\" (UniqueName: \"kubernetes.io/projected/624ec5ba-9a1f-4192-a537-c0cc6c8d5c24-kube-api-access-q56qx\") pod \"obo-prometheus-operator-68bc856cb9-58vm8\" (UID: \"624ec5ba-9a1f-4192-a537-c0cc6c8d5c24\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-58vm8" Feb 02 11:30:32 crc kubenswrapper[4901]: I0202 11:30:32.003962 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-57f656956-fhbnb"] Feb 02 11:30:32 crc kubenswrapper[4901]: I0202 11:30:32.050681 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q56qx\" (UniqueName: \"kubernetes.io/projected/624ec5ba-9a1f-4192-a537-c0cc6c8d5c24-kube-api-access-q56qx\") pod \"obo-prometheus-operator-68bc856cb9-58vm8\" (UID: \"624ec5ba-9a1f-4192-a537-c0cc6c8d5c24\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-58vm8" Feb 02 11:30:32 crc kubenswrapper[4901]: I0202 11:30:32.096873 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/78d5a33d-6a61-4e38-8b5c-9a8bb8436628-apiservice-cert\") pod 
\"obo-prometheus-operator-admission-webhook-57f656956-dt5d8\" (UID: \"78d5a33d-6a61-4e38-8b5c-9a8bb8436628\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57f656956-dt5d8" Feb 02 11:30:32 crc kubenswrapper[4901]: I0202 11:30:32.097751 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/650a29c1-9b38-4b85-9104-d56f42d0d2d9-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-57f656956-fhbnb\" (UID: \"650a29c1-9b38-4b85-9104-d56f42d0d2d9\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57f656956-fhbnb" Feb 02 11:30:32 crc kubenswrapper[4901]: I0202 11:30:32.098765 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/78d5a33d-6a61-4e38-8b5c-9a8bb8436628-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-57f656956-dt5d8\" (UID: \"78d5a33d-6a61-4e38-8b5c-9a8bb8436628\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57f656956-dt5d8" Feb 02 11:30:32 crc kubenswrapper[4901]: I0202 11:30:32.099187 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/650a29c1-9b38-4b85-9104-d56f42d0d2d9-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-57f656956-fhbnb\" (UID: \"650a29c1-9b38-4b85-9104-d56f42d0d2d9\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57f656956-fhbnb" Feb 02 11:30:32 crc kubenswrapper[4901]: I0202 11:30:32.102382 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/78d5a33d-6a61-4e38-8b5c-9a8bb8436628-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-57f656956-dt5d8\" (UID: \"78d5a33d-6a61-4e38-8b5c-9a8bb8436628\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57f656956-dt5d8" Feb 02 11:30:32 crc kubenswrapper[4901]: I0202 11:30:32.103393 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/78d5a33d-6a61-4e38-8b5c-9a8bb8436628-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-57f656956-dt5d8\" (UID: \"78d5a33d-6a61-4e38-8b5c-9a8bb8436628\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57f656956-dt5d8" Feb 02 11:30:32 crc kubenswrapper[4901]: I0202 11:30:32.105157 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/650a29c1-9b38-4b85-9104-d56f42d0d2d9-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-57f656956-fhbnb\" (UID: \"650a29c1-9b38-4b85-9104-d56f42d0d2d9\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57f656956-fhbnb" Feb 02 11:30:32 crc kubenswrapper[4901]: I0202 11:30:32.108494 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/650a29c1-9b38-4b85-9104-d56f42d0d2d9-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-57f656956-fhbnb\" (UID: \"650a29c1-9b38-4b85-9104-d56f42d0d2d9\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57f656956-fhbnb" Feb 02 11:30:32 crc kubenswrapper[4901]: I0202 11:30:32.128205 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-7f2km"] Feb 02 11:30:32 crc kubenswrapper[4901]: 
I0202 11:30:32.129845 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-7f2km" Feb 02 11:30:32 crc kubenswrapper[4901]: I0202 11:30:32.130223 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-58vm8" Feb 02 11:30:32 crc kubenswrapper[4901]: I0202 11:30:32.154830 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Feb 02 11:30:32 crc kubenswrapper[4901]: I0202 11:30:32.163191 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-gvtz4" Feb 02 11:30:32 crc kubenswrapper[4901]: I0202 11:30:32.176973 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-7f2km"] Feb 02 11:30:32 crc kubenswrapper[4901]: I0202 11:30:32.254295 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57f656956-dt5d8" Feb 02 11:30:32 crc kubenswrapper[4901]: I0202 11:30:32.279236 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57f656956-fhbnb" Feb 02 11:30:32 crc kubenswrapper[4901]: I0202 11:30:32.316524 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2899g\" (UniqueName: \"kubernetes.io/projected/2d9b21bd-d1dd-4c42-974f-9aa80352637f-kube-api-access-2899g\") pod \"observability-operator-59bdc8b94-7f2km\" (UID: \"2d9b21bd-d1dd-4c42-974f-9aa80352637f\") " pod="openshift-operators/observability-operator-59bdc8b94-7f2km" Feb 02 11:30:32 crc kubenswrapper[4901]: I0202 11:30:32.316627 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/2d9b21bd-d1dd-4c42-974f-9aa80352637f-observability-operator-tls\") pod \"observability-operator-59bdc8b94-7f2km\" (UID: \"2d9b21bd-d1dd-4c42-974f-9aa80352637f\") " pod="openshift-operators/observability-operator-59bdc8b94-7f2km" Feb 02 11:30:32 crc kubenswrapper[4901]: I0202 11:30:32.316924 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-dqgl6"] Feb 02 11:30:32 crc kubenswrapper[4901]: I0202 11:30:32.318881 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-dqgl6" Feb 02 11:30:32 crc kubenswrapper[4901]: I0202 11:30:32.325200 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-6h8p5" Feb 02 11:30:32 crc kubenswrapper[4901]: I0202 11:30:32.342754 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-dqgl6"] Feb 02 11:30:32 crc kubenswrapper[4901]: I0202 11:30:32.418286 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjplj\" (UniqueName: \"kubernetes.io/projected/169c8594-4455-4fd1-9602-8dabcd5828de-kube-api-access-qjplj\") pod \"perses-operator-5bf474d74f-dqgl6\" (UID: \"169c8594-4455-4fd1-9602-8dabcd5828de\") " pod="openshift-operators/perses-operator-5bf474d74f-dqgl6" Feb 02 11:30:32 crc kubenswrapper[4901]: I0202 11:30:32.419094 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/169c8594-4455-4fd1-9602-8dabcd5828de-openshift-service-ca\") pod \"perses-operator-5bf474d74f-dqgl6\" (UID: \"169c8594-4455-4fd1-9602-8dabcd5828de\") " pod="openshift-operators/perses-operator-5bf474d74f-dqgl6" Feb 02 11:30:32 crc kubenswrapper[4901]: I0202 11:30:32.419222 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2899g\" (UniqueName: \"kubernetes.io/projected/2d9b21bd-d1dd-4c42-974f-9aa80352637f-kube-api-access-2899g\") pod \"observability-operator-59bdc8b94-7f2km\" (UID: \"2d9b21bd-d1dd-4c42-974f-9aa80352637f\") " pod="openshift-operators/observability-operator-59bdc8b94-7f2km" Feb 02 11:30:32 crc kubenswrapper[4901]: I0202 11:30:32.419274 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/2d9b21bd-d1dd-4c42-974f-9aa80352637f-observability-operator-tls\") pod \"observability-operator-59bdc8b94-7f2km\" (UID: \"2d9b21bd-d1dd-4c42-974f-9aa80352637f\") " pod="openshift-operators/observability-operator-59bdc8b94-7f2km" Feb 02 11:30:32 crc kubenswrapper[4901]: I0202 11:30:32.440355 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/2d9b21bd-d1dd-4c42-974f-9aa80352637f-observability-operator-tls\") pod \"observability-operator-59bdc8b94-7f2km\" (UID: \"2d9b21bd-d1dd-4c42-974f-9aa80352637f\") " pod="openshift-operators/observability-operator-59bdc8b94-7f2km" Feb 02 11:30:32 crc kubenswrapper[4901]: I0202 11:30:32.448502 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2899g\" (UniqueName: \"kubernetes.io/projected/2d9b21bd-d1dd-4c42-974f-9aa80352637f-kube-api-access-2899g\") pod \"observability-operator-59bdc8b94-7f2km\" (UID: \"2d9b21bd-d1dd-4c42-974f-9aa80352637f\") " pod="openshift-operators/observability-operator-59bdc8b94-7f2km" Feb 02 11:30:32 crc kubenswrapper[4901]: I0202 11:30:32.522449 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/169c8594-4455-4fd1-9602-8dabcd5828de-openshift-service-ca\") pod \"perses-operator-5bf474d74f-dqgl6\" (UID: \"169c8594-4455-4fd1-9602-8dabcd5828de\") " pod="openshift-operators/perses-operator-5bf474d74f-dqgl6" Feb 02 11:30:32 crc kubenswrapper[4901]: I0202 11:30:32.522699 4901 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjplj\" (UniqueName: \"kubernetes.io/projected/169c8594-4455-4fd1-9602-8dabcd5828de-kube-api-access-qjplj\") pod \"perses-operator-5bf474d74f-dqgl6\" (UID: \"169c8594-4455-4fd1-9602-8dabcd5828de\") " pod="openshift-operators/perses-operator-5bf474d74f-dqgl6"
Feb 02 11:30:32 crc kubenswrapper[4901]: I0202 11:30:32.524326 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/169c8594-4455-4fd1-9602-8dabcd5828de-openshift-service-ca\") pod \"perses-operator-5bf474d74f-dqgl6\" (UID: \"169c8594-4455-4fd1-9602-8dabcd5828de\") " pod="openshift-operators/perses-operator-5bf474d74f-dqgl6"
Feb 02 11:30:32 crc kubenswrapper[4901]: I0202 11:30:32.544452 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjplj\" (UniqueName: \"kubernetes.io/projected/169c8594-4455-4fd1-9602-8dabcd5828de-kube-api-access-qjplj\") pod \"perses-operator-5bf474d74f-dqgl6\" (UID: \"169c8594-4455-4fd1-9602-8dabcd5828de\") " pod="openshift-operators/perses-operator-5bf474d74f-dqgl6"
Feb 02 11:30:32 crc kubenswrapper[4901]: I0202 11:30:32.608165 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-7f2km"
Feb 02 11:30:32 crc kubenswrapper[4901]: I0202 11:30:32.676342 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-dqgl6"
Feb 02 11:30:32 crc kubenswrapper[4901]: I0202 11:30:32.952066 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-58vm8"]
Feb 02 11:30:33 crc kubenswrapper[4901]: I0202 11:30:33.056446 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-57f656956-dt5d8"]
Feb 02 11:30:33 crc kubenswrapper[4901]: W0202 11:30:33.083377 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78d5a33d_6a61_4e38_8b5c_9a8bb8436628.slice/crio-82ebae2ed5643d2c9740840ab9fda66324d40d910161362a95eee5565ad1e113 WatchSource:0}: Error finding container 82ebae2ed5643d2c9740840ab9fda66324d40d910161362a95eee5565ad1e113: Status 404 returned error can't find the container with id 82ebae2ed5643d2c9740840ab9fda66324d40d910161362a95eee5565ad1e113
Feb 02 11:30:33 crc kubenswrapper[4901]: I0202 11:30:33.224360 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-57f656956-fhbnb"]
Feb 02 11:30:33 crc kubenswrapper[4901]: I0202 11:30:33.437999 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-dqgl6"]
Feb 02 11:30:33 crc kubenswrapper[4901]: W0202 11:30:33.487731 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod169c8594_4455_4fd1_9602_8dabcd5828de.slice/crio-69436c692a552151af0c1c30bf69d0166c46c1efeba720d21255aa03f03b85db WatchSource:0}: Error finding container 69436c692a552151af0c1c30bf69d0166c46c1efeba720d21255aa03f03b85db: Status 404 returned error can't find the container with id 69436c692a552151af0c1c30bf69d0166c46c1efeba720d21255aa03f03b85db
Feb 02 11:30:33 crc kubenswrapper[4901]: I0202 11:30:33.497473 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-7f2km"]
Feb 02 11:30:33 crc kubenswrapper[4901]: W0202 11:30:33.501674 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d9b21bd_d1dd_4c42_974f_9aa80352637f.slice/crio-ef3129695006e8913d58263f79c0b62cfcc249e669d9ce4c9df4d38960d3798d WatchSource:0}: Error finding container ef3129695006e8913d58263f79c0b62cfcc249e669d9ce4c9df4d38960d3798d: Status 404 returned error can't find the container with id ef3129695006e8913d58263f79c0b62cfcc249e669d9ce4c9df4d38960d3798d
Feb 02 11:30:33 crc kubenswrapper[4901]: I0202 11:30:33.571899 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57f656956-fhbnb" event={"ID":"650a29c1-9b38-4b85-9104-d56f42d0d2d9","Type":"ContainerStarted","Data":"274f56f2d74ce7f7221b8a94d9d0f2746b920729db9d8c82a74116b7b2a0058b"}
Feb 02 11:30:33 crc kubenswrapper[4901]: I0202 11:30:33.582113 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-7f2km" event={"ID":"2d9b21bd-d1dd-4c42-974f-9aa80352637f","Type":"ContainerStarted","Data":"ef3129695006e8913d58263f79c0b62cfcc249e669d9ce4c9df4d38960d3798d"}
Feb 02 11:30:33 crc kubenswrapper[4901]: I0202 11:30:33.584657 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-dqgl6" event={"ID":"169c8594-4455-4fd1-9602-8dabcd5828de","Type":"ContainerStarted","Data":"69436c692a552151af0c1c30bf69d0166c46c1efeba720d21255aa03f03b85db"}
Feb 02 11:30:33 crc kubenswrapper[4901]: I0202 11:30:33.588338 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-58vm8" event={"ID":"624ec5ba-9a1f-4192-a537-c0cc6c8d5c24","Type":"ContainerStarted","Data":"40e52843cda28864f44caa0718af7629d3810fc9fdb47b2376c15973f92c0a0f"}
Feb 02 11:30:33 crc kubenswrapper[4901]: I0202 11:30:33.590249 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57f656956-dt5d8" event={"ID":"78d5a33d-6a61-4e38-8b5c-9a8bb8436628","Type":"ContainerStarted","Data":"82ebae2ed5643d2c9740840ab9fda66324d40d910161362a95eee5565ad1e113"}
Feb 02 11:30:37 crc kubenswrapper[4901]: I0202 11:30:37.837832 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 02 11:30:37 crc kubenswrapper[4901]: I0202 11:30:37.838711 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 02 11:30:37 crc kubenswrapper[4901]: I0202 11:30:37.838774 4901 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f29d8"
Feb 02 11:30:37 crc kubenswrapper[4901]: I0202 11:30:37.839649 4901 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"57fa2da00216833a4d6895d268a4adace1de13c875f7256b0e6997572d9f7c2a"} pod="openshift-machine-config-operator/machine-config-daemon-f29d8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 02 11:30:37 crc kubenswrapper[4901]: I0202 11:30:37.839716 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" containerID="cri-o://57fa2da00216833a4d6895d268a4adace1de13c875f7256b0e6997572d9f7c2a" gracePeriod=600
Feb 02 11:30:38 crc kubenswrapper[4901]: E0202 11:30:38.059840 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6"
Feb 02 11:30:38 crc kubenswrapper[4901]: I0202 11:30:38.668183 4901 generic.go:334] "Generic (PLEG): container finished" podID="756c113d-5d5e-424e-bdf5-494b7774def6" containerID="57fa2da00216833a4d6895d268a4adace1de13c875f7256b0e6997572d9f7c2a" exitCode=0
Feb 02 11:30:38 crc kubenswrapper[4901]: I0202 11:30:38.668281 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" event={"ID":"756c113d-5d5e-424e-bdf5-494b7774def6","Type":"ContainerDied","Data":"57fa2da00216833a4d6895d268a4adace1de13c875f7256b0e6997572d9f7c2a"}
Feb 02 11:30:38 crc kubenswrapper[4901]: I0202 11:30:38.668801 4901 scope.go:117] "RemoveContainer" containerID="39f6324e3b109ef91ef62ed5dcfb577f7442c2d5c540e6157611157765dd3e2f"
Feb 02 11:30:38 crc kubenswrapper[4901]: I0202 11:30:38.669487 4901 scope.go:117] "RemoveContainer" containerID="57fa2da00216833a4d6895d268a4adace1de13c875f7256b0e6997572d9f7c2a"
Feb 02 11:30:38 crc kubenswrapper[4901]: E0202 11:30:38.670316 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6"
Feb 02 11:30:43 crc kubenswrapper[4901]: I0202 11:30:43.784463 4901 scope.go:117] "RemoveContainer" containerID="afe07750fce0a9d7dac4de0d58069f2e8ad39c24ac336a19f72ff5b8b0cdacef"
Feb 02 11:30:49 crc kubenswrapper[4901]: E0202 11:30:49.043244 4901 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:42ebc3571195d8c41fd01b8d08e98fe2cc12c1caabea251aecb4442d8eade4ea"
Feb 02 11:30:49 crc kubenswrapper[4901]: E0202 11:30:49.044327 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator-admission-webhook,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:42ebc3571195d8c41fd01b8d08e98fe2cc12c1caabea251aecb4442d8eade4ea,Command:[],Args:[--web.enable-tls=true --web.cert-file=/tmp/k8s-webhook-server/serving-certs/tls.crt --web.key-file=/tmp/k8s-webhook-server/serving-certs/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.1,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{209715200 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-admission-webhook-57f656956-fhbnb_openshift-operators(650a29c1-9b38-4b85-9104-d56f42d0d2d9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 02 11:30:49 crc kubenswrapper[4901]: E0202 11:30:49.045489 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57f656956-fhbnb" podUID="650a29c1-9b38-4b85-9104-d56f42d0d2d9"
Feb 02 11:30:49 crc kubenswrapper[4901]: E0202 11:30:49.121617 4901 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:42ebc3571195d8c41fd01b8d08e98fe2cc12c1caabea251aecb4442d8eade4ea"
Feb 02 11:30:49 crc kubenswrapper[4901]: E0202 11:30:49.121851 4901 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator-admission-webhook,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:42ebc3571195d8c41fd01b8d08e98fe2cc12c1caabea251aecb4442d8eade4ea,Command:[],Args:[--web.enable-tls=true --web.cert-file=/tmp/k8s-webhook-server/serving-certs/tls.crt --web.key-file=/tmp/k8s-webhook-server/serving-certs/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.1,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{209715200 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-admission-webhook-57f656956-dt5d8_openshift-operators(78d5a33d-6a61-4e38-8b5c-9a8bb8436628): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 02 11:30:49 crc kubenswrapper[4901]: E0202 11:30:49.123064 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57f656956-dt5d8" podUID="78d5a33d-6a61-4e38-8b5c-9a8bb8436628"
Feb 02 11:30:49 crc kubenswrapper[4901]: E0202 11:30:49.821322 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:42ebc3571195d8c41fd01b8d08e98fe2cc12c1caabea251aecb4442d8eade4ea\\\"\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57f656956-dt5d8" podUID="78d5a33d-6a61-4e38-8b5c-9a8bb8436628"
Feb 02 11:30:49 crc kubenswrapper[4901]: E0202 11:30:49.821470 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:42ebc3571195d8c41fd01b8d08e98fe2cc12c1caabea251aecb4442d8eade4ea\\\"\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57f656956-fhbnb" podUID="650a29c1-9b38-4b85-9104-d56f42d0d2d9"
Feb 02 11:30:50 crc kubenswrapper[4901]: E0202 11:30:50.043865 4901 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:b5c8526d2ae660fe092dd8a7acf18ec4957d5c265890a222f55396fc2cdaeed8"
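The &Container{...} dumps above serialize resource.Quantity values in Go's struct-print syntax: {{unscaled scale} {} [canonical] format} reads as unscaled × 10^scale for DecimalSI, and raw bytes for BinarySI. A quick check against the webhook's limits and requests (values copied from the dump; decode() is only an illustrative helper, not kubelet code):

```python
# Decode the resource.Quantity dumps above: {{unscaled scale} ...} means
# unscaled * 10**scale. (decode() is an illustrative helper, not kubelet code.)
def decode(unscaled, scale=0):
    return unscaled * 10 ** scale

assert decode(200, -3) == 0.2              # cpu limit   {{200 -3}} -> 200m
assert decode(50, -3) == 0.05              # cpu request {{50 -3}}  -> 50m
assert decode(209715200) == 200 * 1024**2  # memory limit   -> 200Mi
assert decode(52428800) == 50 * 1024**2    # memory request -> 50Mi
```

So the admission webhook runs with a 50m/50Mi request and a 200m/200Mi limit; the perses-operator dump below uses the same encoding for its 100m/128Mi request and 500m/512Mi limit.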
&Container{Name:perses-operator,Image:registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:b5c8526d2ae660fe092dd8a7acf18ec4957d5c265890a222f55396fc2cdaeed8,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.1,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{134217728 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:openshift-service-ca,ReadOnly:true,MountPath:/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qjplj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000350000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod perses-operator-5bf474d74f-dqgl6_openshift-operators(169c8594-4455-4fd1-9602-8dabcd5828de): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 02 11:30:50 crc kubenswrapper[4901]: E0202 11:30:50.048630 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"perses-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/perses-operator-5bf474d74f-dqgl6" podUID="169c8594-4455-4fd1-9602-8dabcd5828de" Feb 02 11:30:50 crc kubenswrapper[4901]: I0202 11:30:50.832758 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-7f2km" event={"ID":"2d9b21bd-d1dd-4c42-974f-9aa80352637f","Type":"ContainerStarted","Data":"8a681e0b87b1e4f2c765aead24ca63e66b8f9bafae86d5cf6d09c49831f3cb53"} Feb 02 11:30:50 crc kubenswrapper[4901]: I0202 11:30:50.833493 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-7f2km" Feb 02 11:30:50 crc kubenswrapper[4901]: I0202 11:30:50.839137 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-58vm8" 
event={"ID":"624ec5ba-9a1f-4192-a537-c0cc6c8d5c24","Type":"ContainerStarted","Data":"8ad2266f13eed170bd3c29b70925f7fcd43a30cf7439f25c737d8ed71a0ee1f2"} Feb 02 11:30:50 crc kubenswrapper[4901]: E0202 11:30:50.846968 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"perses-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:b5c8526d2ae660fe092dd8a7acf18ec4957d5c265890a222f55396fc2cdaeed8\\\"\"" pod="openshift-operators/perses-operator-5bf474d74f-dqgl6" podUID="169c8594-4455-4fd1-9602-8dabcd5828de" Feb 02 11:30:50 crc kubenswrapper[4901]: I0202 11:30:50.862195 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-7f2km" podStartSLOduration=2.277494038 podStartE2EDuration="18.862176399s" podCreationTimestamp="2026-02-02 11:30:32 +0000 UTC" firstStartedPulling="2026-02-02 11:30:33.51057241 +0000 UTC m=+3120.528912506" lastFinishedPulling="2026-02-02 11:30:50.095254771 +0000 UTC m=+3137.113594867" observedRunningTime="2026-02-02 11:30:50.858410618 +0000 UTC m=+3137.876750714" watchObservedRunningTime="2026-02-02 11:30:50.862176399 +0000 UTC m=+3137.880516515" Feb 02 11:30:50 crc kubenswrapper[4901]: I0202 11:30:50.881925 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-7f2km" Feb 02 11:30:50 crc kubenswrapper[4901]: I0202 11:30:50.920843 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-58vm8" podStartSLOduration=2.8148106349999997 podStartE2EDuration="19.920823459s" podCreationTimestamp="2026-02-02 11:30:31 +0000 UTC" firstStartedPulling="2026-02-02 11:30:32.986802417 +0000 UTC m=+3120.005142513" lastFinishedPulling="2026-02-02 11:30:50.092815241 +0000 UTC m=+3137.111155337" observedRunningTime="2026-02-02 11:30:50.91094577 +0000 UTC m=+3137.929285866" watchObservedRunningTime="2026-02-02 11:30:50.920823459 +0000 UTC m=+3137.939163555" Feb 02 11:30:53 crc kubenswrapper[4901]: I0202 11:30:53.683549 4901 scope.go:117] "RemoveContainer" containerID="57fa2da00216833a4d6895d268a4adace1de13c875f7256b0e6997572d9f7c2a" Feb 02 11:30:53 crc kubenswrapper[4901]: E0202 11:30:53.692656 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:30:59 crc kubenswrapper[4901]: I0202 11:30:59.992632 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 02 11:30:59 crc kubenswrapper[4901]: I0202 11:30:59.995409 4901 util.go:30] "No sandbox for pod can be found. 
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.000635 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.003445 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"]
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.005058 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-sgplg"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.005314 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.005424 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.005468 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.057361 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b26cf498-bc66-40f0-bc8f-5f89ac251655-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"b26cf498-bc66-40f0-bc8f-5f89ac251655\") " pod="openstack/alertmanager-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.057859 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn5wc\" (UniqueName: \"kubernetes.io/projected/b26cf498-bc66-40f0-bc8f-5f89ac251655-kube-api-access-pn5wc\") pod \"alertmanager-metric-storage-0\" (UID: \"b26cf498-bc66-40f0-bc8f-5f89ac251655\") " pod="openstack/alertmanager-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.057972 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b26cf498-bc66-40f0-bc8f-5f89ac251655-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"b26cf498-bc66-40f0-bc8f-5f89ac251655\") " pod="openstack/alertmanager-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.058113 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b26cf498-bc66-40f0-bc8f-5f89ac251655-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"b26cf498-bc66-40f0-bc8f-5f89ac251655\") " pod="openstack/alertmanager-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.058203 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b26cf498-bc66-40f0-bc8f-5f89ac251655-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"b26cf498-bc66-40f0-bc8f-5f89ac251655\") " pod="openstack/alertmanager-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.058304 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b26cf498-bc66-40f0-bc8f-5f89ac251655-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"b26cf498-bc66-40f0-bc8f-5f89ac251655\") " pod="openstack/alertmanager-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.058393 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/b26cf498-bc66-40f0-bc8f-5f89ac251655-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"b26cf498-bc66-40f0-bc8f-5f89ac251655\") " pod="openstack/alertmanager-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.160195 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b26cf498-bc66-40f0-bc8f-5f89ac251655-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"b26cf498-bc66-40f0-bc8f-5f89ac251655\") " pod="openstack/alertmanager-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.160296 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn5wc\" (UniqueName: \"kubernetes.io/projected/b26cf498-bc66-40f0-bc8f-5f89ac251655-kube-api-access-pn5wc\") pod \"alertmanager-metric-storage-0\" (UID: \"b26cf498-bc66-40f0-bc8f-5f89ac251655\") " pod="openstack/alertmanager-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.160320 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b26cf498-bc66-40f0-bc8f-5f89ac251655-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"b26cf498-bc66-40f0-bc8f-5f89ac251655\") " pod="openstack/alertmanager-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.160358 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b26cf498-bc66-40f0-bc8f-5f89ac251655-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"b26cf498-bc66-40f0-bc8f-5f89ac251655\") " pod="openstack/alertmanager-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.160380 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b26cf498-bc66-40f0-bc8f-5f89ac251655-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"b26cf498-bc66-40f0-bc8f-5f89ac251655\") " pod="openstack/alertmanager-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.160404 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b26cf498-bc66-40f0-bc8f-5f89ac251655-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"b26cf498-bc66-40f0-bc8f-5f89ac251655\") " pod="openstack/alertmanager-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.160425 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/b26cf498-bc66-40f0-bc8f-5f89ac251655-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"b26cf498-bc66-40f0-bc8f-5f89ac251655\") " pod="openstack/alertmanager-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.161212 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/b26cf498-bc66-40f0-bc8f-5f89ac251655-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"b26cf498-bc66-40f0-bc8f-5f89ac251655\") " pod="openstack/alertmanager-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.167850 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b26cf498-bc66-40f0-bc8f-5f89ac251655-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"b26cf498-bc66-40f0-bc8f-5f89ac251655\") " pod="openstack/alertmanager-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.168161 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b26cf498-bc66-40f0-bc8f-5f89ac251655-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"b26cf498-bc66-40f0-bc8f-5f89ac251655\") " pod="openstack/alertmanager-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.168932 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b26cf498-bc66-40f0-bc8f-5f89ac251655-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"b26cf498-bc66-40f0-bc8f-5f89ac251655\") " pod="openstack/alertmanager-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.169552 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b26cf498-bc66-40f0-bc8f-5f89ac251655-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"b26cf498-bc66-40f0-bc8f-5f89ac251655\") " pod="openstack/alertmanager-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.180921 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b26cf498-bc66-40f0-bc8f-5f89ac251655-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"b26cf498-bc66-40f0-bc8f-5f89ac251655\") " pod="openstack/alertmanager-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.193356 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn5wc\" (UniqueName: \"kubernetes.io/projected/b26cf498-bc66-40f0-bc8f-5f89ac251655-kube-api-access-pn5wc\") pod \"alertmanager-metric-storage-0\" (UID: \"b26cf498-bc66-40f0-bc8f-5f89ac251655\") " pod="openstack/alertmanager-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.332743 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.494153 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"]
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.508384 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.508528 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.514003 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.514171 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.514286 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.514407 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-kl9pb"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.517825 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.534512 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.534692 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.546377 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.669307 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.669388 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.669454 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsv4d\" (UniqueName: \"kubernetes.io/projected/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-kube-api-access-tsv4d\") pod \"prometheus-metric-storage-0\" (UID: \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.669531 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.669587 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.669648 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.669694 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.669730 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"prometheus-metric-storage-0\" (UID: \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.669761 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.669799 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-config\") pod \"prometheus-metric-storage-0\" (UID: \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.771672 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsv4d\" (UniqueName: \"kubernetes.io/projected/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-kube-api-access-tsv4d\") pod \"prometheus-metric-storage-0\" (UID: \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.771750 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.771784 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.771825 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.771851 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.771873 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"prometheus-metric-storage-0\" (UID: \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.771892 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.771913 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-config\") pod \"prometheus-metric-storage-0\" (UID: \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.771977 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.772000 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.772842 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"prometheus-metric-storage-0\" (UID: \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.773063 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.773142 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.773839 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.782458 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.785473 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.793143 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.798633 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.798920 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-config\") pod \"prometheus-metric-storage-0\" (UID: \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.803212 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsv4d\" (UniqueName: \"kubernetes.io/projected/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-kube-api-access-tsv4d\") pod \"prometheus-metric-storage-0\" (UID: \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.824105 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"prometheus-metric-storage-0\" (UID: \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.861265 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
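One detail worth noticing in the prometheus mounts above: only local-storage04-crc logs a MountVolume.MountDevice stage (the node-global mount at /mnt/openstack/pv04) before its per-pod SetUp, while the secret/configmap/projected/empty-dir volumes go straight to SetUp. A small classifier over the UniqueName strings; the two-stage set is an assumption covering the local-volume plugin seen here plus CSI-style staged volumes:

```python
# Volumes whose plugin needs a node-global MountDevice before per-pod SetUp.
# Assumption: local/block-style plugins stage once per node; the API-backed
# plugins seen above (secret, configmap, projected, empty-dir) do not.
TWO_STAGE = {"kubernetes.io/local-volume", "kubernetes.io/csi"}

def plugin(unique_name):
    return "/".join(unique_name.split("/")[:2])

assert plugin("kubernetes.io/local-volume/local-storage04-crc") in TWO_STAGE
assert plugin(
    "kubernetes.io/configmap/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-prometheus-metric-storage-rulefiles-0"
) not in TWO_STAGE
```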
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.876411 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"]
Feb 02 11:31:00 crc kubenswrapper[4901]: I0202 11:31:00.960721 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"b26cf498-bc66-40f0-bc8f-5f89ac251655","Type":"ContainerStarted","Data":"d195c583f9b8888368341c3590d3893c6abb19377411a1d869d841904a96fede"}
Feb 02 11:31:01 crc kubenswrapper[4901]: I0202 11:31:01.461127 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Feb 02 11:31:01 crc kubenswrapper[4901]: I0202 11:31:01.973198 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7","Type":"ContainerStarted","Data":"537af98a50387bb58442d8770e9adf933011c82b84238c34c0d6dc960c8aa17f"}
Feb 02 11:31:04 crc kubenswrapper[4901]: I0202 11:31:04.680743 4901 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 02 11:31:06 crc kubenswrapper[4901]: I0202 11:31:06.022180 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57f656956-fhbnb" event={"ID":"650a29c1-9b38-4b85-9104-d56f42d0d2d9","Type":"ContainerStarted","Data":"c62c97d2aa2be7bd9023e9f52c49b65893803d215d6af5dd58dc24fd8713a030"}
Feb 02 11:31:06 crc kubenswrapper[4901]: I0202 11:31:06.071705 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57f656956-fhbnb" podStartSLOduration=2.913459844 podStartE2EDuration="35.071678385s" podCreationTimestamp="2026-02-02 11:30:31 +0000 UTC" firstStartedPulling="2026-02-02 11:30:33.233727866 +0000 UTC m=+3120.252067962" lastFinishedPulling="2026-02-02 11:31:05.391946407 +0000 UTC m=+3152.410286503" observedRunningTime="2026-02-02 11:31:06.06113602 +0000 UTC m=+3153.079476136" watchObservedRunningTime="2026-02-02 11:31:06.071678385 +0000 UTC m=+3153.090018481"
Feb 02 11:31:06 crc kubenswrapper[4901]: I0202 11:31:06.678244 4901 scope.go:117] "RemoveContainer" containerID="57fa2da00216833a4d6895d268a4adace1de13c875f7256b0e6997572d9f7c2a"
Feb 02 11:31:06 crc kubenswrapper[4901]: E0202 11:31:06.679134 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6"
Feb 02 11:31:07 crc kubenswrapper[4901]: I0202 11:31:07.032592 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-dqgl6" event={"ID":"169c8594-4455-4fd1-9602-8dabcd5828de","Type":"ContainerStarted","Data":"8a899f4dd5a7effbd7bf4d980633c05915e5be0c10e56022765781ea2286b924"}
Feb 02 11:31:07 crc kubenswrapper[4901]: I0202 11:31:07.033397 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-dqgl6"
Feb 02 11:31:07 crc kubenswrapper[4901]: I0202 11:31:07.034476 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57f656956-dt5d8" event={"ID":"78d5a33d-6a61-4e38-8b5c-9a8bb8436628","Type":"ContainerStarted","Data":"7379fa1498eb063bfc475997603a7ec9773eda223b4c47420c902e8a8b0092a1"}
Feb 02 11:31:07 crc kubenswrapper[4901]: I0202 11:31:07.057524 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-dqgl6" podStartSLOduration=2.688420097 podStartE2EDuration="35.057503353s" podCreationTimestamp="2026-02-02 11:30:32 +0000 UTC" firstStartedPulling="2026-02-02 11:30:33.49737781 +0000 UTC m=+3120.515717906" lastFinishedPulling="2026-02-02 11:31:05.866461046 +0000 UTC m=+3152.884801162" observedRunningTime="2026-02-02 11:31:07.048099836 +0000 UTC m=+3154.066439932" watchObservedRunningTime="2026-02-02 11:31:07.057503353 +0000 UTC m=+3154.075843449"
Feb 02 11:31:07 crc kubenswrapper[4901]: I0202 11:31:07.077280 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57f656956-dt5d8" podStartSLOduration=3.259091992 podStartE2EDuration="36.077256542s" podCreationTimestamp="2026-02-02 11:30:31 +0000 UTC" firstStartedPulling="2026-02-02 11:30:33.092142908 +0000 UTC m=+3120.110483004" lastFinishedPulling="2026-02-02 11:31:05.910307468 +0000 UTC m=+3152.928647554" observedRunningTime="2026-02-02 11:31:07.067423544 +0000 UTC m=+3154.085763640" watchObservedRunningTime="2026-02-02 11:31:07.077256542 +0000 UTC m=+3154.095596638"
Feb 02 11:31:08 crc kubenswrapper[4901]: I0202 11:31:08.044964 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"b26cf498-bc66-40f0-bc8f-5f89ac251655","Type":"ContainerStarted","Data":"f4f0346954b01ccfbcdb84c0d96cf46bbb085ae5c6e36f94697b22df59367b4f"}
Feb 02 11:31:09 crc kubenswrapper[4901]: I0202 11:31:09.067798 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7","Type":"ContainerStarted","Data":"f42e48054d21caeeaf9915cc659a88916ae4726d76a2ecad8b377cc1882fee7a"}
Feb 02 11:31:12 crc kubenswrapper[4901]: I0202 11:31:12.680161 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-dqgl6"
Feb 02 11:31:14 crc kubenswrapper[4901]: I0202 11:31:14.129174 4901 generic.go:334] "Generic (PLEG): container finished" podID="b26cf498-bc66-40f0-bc8f-5f89ac251655" containerID="f4f0346954b01ccfbcdb84c0d96cf46bbb085ae5c6e36f94697b22df59367b4f" exitCode=0
Feb 02 11:31:14 crc kubenswrapper[4901]: I0202 11:31:14.129339 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"b26cf498-bc66-40f0-bc8f-5f89ac251655","Type":"ContainerDied","Data":"f4f0346954b01ccfbcdb84c0d96cf46bbb085ae5c6e36f94697b22df59367b4f"}
Feb 02 11:31:15 crc kubenswrapper[4901]: I0202 11:31:15.140772 4901 generic.go:334] "Generic (PLEG): container finished" podID="a6911cb1-5431-45cc-8efe-ba2cef8fb5a7" containerID="f42e48054d21caeeaf9915cc659a88916ae4726d76a2ecad8b377cc1882fee7a" exitCode=0
Feb 02 11:31:15 crc kubenswrapper[4901]: I0202 11:31:15.140876 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7","Type":"ContainerDied","Data":"f42e48054d21caeeaf9915cc659a88916ae4726d76a2ecad8b377cc1882fee7a"}
Feb 02 11:31:19 crc kubenswrapper[4901]: I0202 11:31:19.207856 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"b26cf498-bc66-40f0-bc8f-5f89ac251655","Type":"ContainerStarted","Data":"e846181ec557693071994fdf61d48dc1e6129688d232957e7de913a8b16ed438"}
Feb 02 11:31:21 crc kubenswrapper[4901]: I0202 11:31:21.678351 4901 scope.go:117] "RemoveContainer" containerID="57fa2da00216833a4d6895d268a4adace1de13c875f7256b0e6997572d9f7c2a"
Feb 02 11:31:21 crc kubenswrapper[4901]: E0202 11:31:21.678996 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6"
Feb 02 11:31:23 crc kubenswrapper[4901]: I0202 11:31:23.275322 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"b26cf498-bc66-40f0-bc8f-5f89ac251655","Type":"ContainerStarted","Data":"c9d3c34567225f80361a61c49c898d0b17eae26a0ce98bb8eec398a808e19801"}
Feb 02 11:31:23 crc kubenswrapper[4901]: I0202 11:31:23.277430 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0"
Feb 02 11:31:23 crc kubenswrapper[4901]: I0202 11:31:23.280844 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0"
Feb 02 11:31:23 crc kubenswrapper[4901]: I0202 11:31:23.300986 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7","Type":"ContainerStarted","Data":"047f3649c5c5787aa520a73a40f7ce131e67c6a1a52620037727d85174245b90"}
Feb 02 11:31:23 crc kubenswrapper[4901]: I0202 11:31:23.321368 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=6.489970491 podStartE2EDuration="24.321349558s" podCreationTimestamp="2026-02-02 11:30:59 +0000 UTC" firstStartedPulling="2026-02-02 11:31:00.900262234 +0000 UTC m=+3147.918602330" lastFinishedPulling="2026-02-02 11:31:18.731641301 +0000 UTC m=+3165.749981397" observedRunningTime="2026-02-02 11:31:23.311321586 +0000 UTC m=+3170.329661682" watchObservedRunningTime="2026-02-02 11:31:23.321349558 +0000 UTC m=+3170.339689654"
Feb 02 11:31:27 crc kubenswrapper[4901]: I0202 11:31:27.339818 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7","Type":"ContainerStarted","Data":"b4a3f3fe695911cc288355743992ecc486d5d617084acb0423e38a635018f8ca"}
Feb 02 11:31:31 crc kubenswrapper[4901]: I0202 11:31:31.391321 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7","Type":"ContainerStarted","Data":"0eee8131bcb4c21cfac93bfbc5248460678947773bc3ec462fcf9f6656597d22"}
Feb 02 11:31:31 crc kubenswrapper[4901]: I0202 11:31:31.418037 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=3.63309521 podStartE2EDuration="32.418007807s" podCreationTimestamp="2026-02-02 11:30:59 +0000 UTC" firstStartedPulling="2026-02-02 11:31:01.466210456 +0000 UTC m=+3148.484550552" lastFinishedPulling="2026-02-02 11:31:30.251123053 +0000 UTC m=+3177.269463149" observedRunningTime="2026-02-02 11:31:31.41646312 +0000 UTC m=+3178.434803216" watchObservedRunningTime="2026-02-02 11:31:31.418007807 +0000 UTC m=+3178.436347913"
Feb 02 11:31:34 crc kubenswrapper[4901]: I0202 11:31:34.676758 4901 scope.go:117] "RemoveContainer" containerID="57fa2da00216833a4d6895d268a4adace1de13c875f7256b0e6997572d9f7c2a"
Feb 02 11:31:34 crc kubenswrapper[4901]: E0202 11:31:34.677361 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6"
Feb 02 11:31:35 crc kubenswrapper[4901]: I0202 11:31:35.862402 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:45 crc kubenswrapper[4901]: I0202 11:31:45.862398 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:45 crc kubenswrapper[4901]: I0202 11:31:45.866009 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:46 crc kubenswrapper[4901]: I0202 11:31:46.535773 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0"
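The prometheus-metric-storage-0 probe sequence just above shows startup-probe gating: readiness is reported with an empty status at 11:31:35, the startup probe goes unhealthy and then started at 11:31:45, and readiness reads ready one second later. A sketch of the gating rule, per Kubernetes' documented semantics that other probes are disabled until the startup probe succeeds:

```python
# Startup-probe gating, matching the 11:31:35 -> 11:31:46 sequence above:
# readiness results only count once the startup probe has succeeded.
def pod_ready(startup_succeeded, readiness_ok):
    return startup_succeeded and readiness_ok

assert not pod_ready(False, True)   # 11:31:35: readiness still reported as ""
assert pod_ready(True, True)        # 11:31:46: startup "started" -> "ready"
```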
Need to start a new one" pod="openstack/openstackclient" Feb 02 11:31:47 crc kubenswrapper[4901]: I0202 11:31:47.974896 4901 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="ab379047-35d7-4cd8-b64c-bf91cf2e25b7" podUID="be8317a4-6e42-464f-9f80-10fa084c1d68" Feb 02 11:31:47 crc kubenswrapper[4901]: I0202 11:31:47.989249 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be8317a4-6e42-464f-9f80-10fa084c1d68-combined-ca-bundle\") pod \"openstackclient\" (UID: \"be8317a4-6e42-464f-9f80-10fa084c1d68\") " pod="openstack/openstackclient" Feb 02 11:31:47 crc kubenswrapper[4901]: I0202 11:31:47.989331 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/be8317a4-6e42-464f-9f80-10fa084c1d68-openstack-config-secret\") pod \"openstackclient\" (UID: \"be8317a4-6e42-464f-9f80-10fa084c1d68\") " pod="openstack/openstackclient" Feb 02 11:31:47 crc kubenswrapper[4901]: I0202 11:31:47.989440 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/be8317a4-6e42-464f-9f80-10fa084c1d68-openstack-config\") pod \"openstackclient\" (UID: \"be8317a4-6e42-464f-9f80-10fa084c1d68\") " pod="openstack/openstackclient" Feb 02 11:31:47 crc kubenswrapper[4901]: I0202 11:31:47.989648 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wwlt\" (UniqueName: \"kubernetes.io/projected/be8317a4-6e42-464f-9f80-10fa084c1d68-kube-api-access-5wwlt\") pod \"openstackclient\" (UID: \"be8317a4-6e42-464f-9f80-10fa084c1d68\") " pod="openstack/openstackclient" Feb 02 11:31:47 crc kubenswrapper[4901]: I0202 11:31:47.992626 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 02 11:31:48 crc kubenswrapper[4901]: I0202 11:31:48.031678 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Feb 02 11:31:48 crc kubenswrapper[4901]: E0202 11:31:48.032743 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-5wwlt openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/openstackclient" podUID="be8317a4-6e42-464f-9f80-10fa084c1d68" Feb 02 11:31:48 crc kubenswrapper[4901]: I0202 11:31:48.044371 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Feb 02 11:31:48 crc kubenswrapper[4901]: I0202 11:31:48.058945 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 02 11:31:48 crc kubenswrapper[4901]: I0202 11:31:48.060881 4901 util.go:30] "No sandbox for pod can be found. 
Feb 02 11:31:48 crc kubenswrapper[4901]: I0202 11:31:48.060881 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 02 11:31:48 crc kubenswrapper[4901]: I0202 11:31:48.070241 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Feb 02 11:31:48 crc kubenswrapper[4901]: I0202 11:31:48.098871 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wwlt\" (UniqueName: \"kubernetes.io/projected/be8317a4-6e42-464f-9f80-10fa084c1d68-kube-api-access-5wwlt\") pod \"openstackclient\" (UID: \"be8317a4-6e42-464f-9f80-10fa084c1d68\") " pod="openstack/openstackclient"
Feb 02 11:31:48 crc kubenswrapper[4901]: I0202 11:31:48.098970 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92kxp\" (UniqueName: \"kubernetes.io/projected/b0d18719-d65d-4624-9696-b876ab4b3e85-kube-api-access-92kxp\") pod \"openstackclient\" (UID: \"b0d18719-d65d-4624-9696-b876ab4b3e85\") " pod="openstack/openstackclient"
Feb 02 11:31:48 crc kubenswrapper[4901]: I0202 11:31:48.099102 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be8317a4-6e42-464f-9f80-10fa084c1d68-combined-ca-bundle\") pod \"openstackclient\" (UID: \"be8317a4-6e42-464f-9f80-10fa084c1d68\") " pod="openstack/openstackclient"
Feb 02 11:31:48 crc kubenswrapper[4901]: I0202 11:31:48.099169 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/be8317a4-6e42-464f-9f80-10fa084c1d68-openstack-config-secret\") pod \"openstackclient\" (UID: \"be8317a4-6e42-464f-9f80-10fa084c1d68\") " pod="openstack/openstackclient"
Feb 02 11:31:48 crc kubenswrapper[4901]: I0202 11:31:48.099217 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b0d18719-d65d-4624-9696-b876ab4b3e85-openstack-config-secret\") pod \"openstackclient\" (UID: \"b0d18719-d65d-4624-9696-b876ab4b3e85\") " pod="openstack/openstackclient"
Feb 02 11:31:48 crc kubenswrapper[4901]: I0202 11:31:48.099324 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/be8317a4-6e42-464f-9f80-10fa084c1d68-openstack-config\") pod \"openstackclient\" (UID: \"be8317a4-6e42-464f-9f80-10fa084c1d68\") " pod="openstack/openstackclient"
Feb 02 11:31:48 crc kubenswrapper[4901]: I0202 11:31:48.099409 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b0d18719-d65d-4624-9696-b876ab4b3e85-openstack-config\") pod \"openstackclient\" (UID: \"b0d18719-d65d-4624-9696-b876ab4b3e85\") " pod="openstack/openstackclient"
Feb 02 11:31:48 crc kubenswrapper[4901]: I0202 11:31:48.099540 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0d18719-d65d-4624-9696-b876ab4b3e85-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b0d18719-d65d-4624-9696-b876ab4b3e85\") " pod="openstack/openstackclient"
Feb 02 11:31:48 crc kubenswrapper[4901]: I0202 11:31:48.103417 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/be8317a4-6e42-464f-9f80-10fa084c1d68-openstack-config\") pod \"openstackclient\" (UID: \"be8317a4-6e42-464f-9f80-10fa084c1d68\") " pod="openstack/openstackclient"
Feb 02 11:31:48 crc kubenswrapper[4901]: E0202 11:31:48.108662 4901 projected.go:194] Error preparing data for projected volume kube-api-access-5wwlt for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (be8317a4-6e42-464f-9f80-10fa084c1d68) does not match the UID in record. The object might have been deleted and then recreated
Feb 02 11:31:48 crc kubenswrapper[4901]: E0202 11:31:48.108767 4901 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/be8317a4-6e42-464f-9f80-10fa084c1d68-kube-api-access-5wwlt podName:be8317a4-6e42-464f-9f80-10fa084c1d68 nodeName:}" failed. No retries permitted until 2026-02-02 11:31:48.608743107 +0000 UTC m=+3195.627083203 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-5wwlt" (UniqueName: "kubernetes.io/projected/be8317a4-6e42-464f-9f80-10fa084c1d68-kube-api-access-5wwlt") pod "openstackclient" (UID: "be8317a4-6e42-464f-9f80-10fa084c1d68") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (be8317a4-6e42-464f-9f80-10fa084c1d68) does not match the UID in record. The object might have been deleted and then recreated
Feb 02 11:31:48 crc kubenswrapper[4901]: I0202 11:31:48.109285 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/be8317a4-6e42-464f-9f80-10fa084c1d68-openstack-config-secret\") pod \"openstackclient\" (UID: \"be8317a4-6e42-464f-9f80-10fa084c1d68\") " pod="openstack/openstackclient"
Feb 02 11:31:48 crc kubenswrapper[4901]: I0202 11:31:48.132314 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be8317a4-6e42-464f-9f80-10fa084c1d68-combined-ca-bundle\") pod \"openstackclient\" (UID: \"be8317a4-6e42-464f-9f80-10fa084c1d68\") " pod="openstack/openstackclient"
Feb 02 11:31:48 crc kubenswrapper[4901]: I0202 11:31:48.202109 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0d18719-d65d-4624-9696-b876ab4b3e85-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b0d18719-d65d-4624-9696-b876ab4b3e85\") " pod="openstack/openstackclient"
Feb 02 11:31:48 crc kubenswrapper[4901]: I0202 11:31:48.202306 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92kxp\" (UniqueName: \"kubernetes.io/projected/b0d18719-d65d-4624-9696-b876ab4b3e85-kube-api-access-92kxp\") pod \"openstackclient\" (UID: \"b0d18719-d65d-4624-9696-b876ab4b3e85\") " pod="openstack/openstackclient"
Feb 02 11:31:48 crc kubenswrapper[4901]: I0202 11:31:48.202408 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b0d18719-d65d-4624-9696-b876ab4b3e85-openstack-config-secret\") pod \"openstackclient\" (UID: \"b0d18719-d65d-4624-9696-b876ab4b3e85\") " pod="openstack/openstackclient"
Feb 02 11:31:48 crc kubenswrapper[4901]: I0202 11:31:48.202510 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b0d18719-d65d-4624-9696-b876ab4b3e85-openstack-config\") pod \"openstackclient\" (UID: \"b0d18719-d65d-4624-9696-b876ab4b3e85\") " pod="openstack/openstackclient"
Feb 02 11:31:48 crc kubenswrapper[4901]: I0202 11:31:48.204988 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b0d18719-d65d-4624-9696-b876ab4b3e85-openstack-config\") pod \"openstackclient\" (UID: \"b0d18719-d65d-4624-9696-b876ab4b3e85\") " pod="openstack/openstackclient"
Feb 02 11:31:48 crc kubenswrapper[4901]: I0202 11:31:48.206376 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0d18719-d65d-4624-9696-b876ab4b3e85-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b0d18719-d65d-4624-9696-b876ab4b3e85\") " pod="openstack/openstackclient"
Feb 02 11:31:48 crc kubenswrapper[4901]: I0202 11:31:48.213205 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b0d18719-d65d-4624-9696-b876ab4b3e85-openstack-config-secret\") pod \"openstackclient\" (UID: \"b0d18719-d65d-4624-9696-b876ab4b3e85\") " pod="openstack/openstackclient"
Feb 02 11:31:48 crc kubenswrapper[4901]: I0202 11:31:48.225662 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92kxp\" (UniqueName: \"kubernetes.io/projected/b0d18719-d65d-4624-9696-b876ab4b3e85-kube-api-access-92kxp\") pod \"openstackclient\" (UID: \"b0d18719-d65d-4624-9696-b876ab4b3e85\") " pod="openstack/openstackclient"
Feb 02 11:31:48 crc kubenswrapper[4901]: I0202 11:31:48.394770 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 02 11:31:48 crc kubenswrapper[4901]: I0202 11:31:48.557250 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 02 11:31:48 crc kubenswrapper[4901]: I0202 11:31:48.567505 4901 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="be8317a4-6e42-464f-9f80-10fa084c1d68" podUID="b0d18719-d65d-4624-9696-b876ab4b3e85"
Feb 02 11:31:48 crc kubenswrapper[4901]: I0202 11:31:48.572611 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 02 11:31:48 crc kubenswrapper[4901]: I0202 11:31:48.612595 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/be8317a4-6e42-464f-9f80-10fa084c1d68-openstack-config-secret\") pod \"be8317a4-6e42-464f-9f80-10fa084c1d68\" (UID: \"be8317a4-6e42-464f-9f80-10fa084c1d68\") "
Feb 02 11:31:48 crc kubenswrapper[4901]: I0202 11:31:48.612760 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/be8317a4-6e42-464f-9f80-10fa084c1d68-openstack-config\") pod \"be8317a4-6e42-464f-9f80-10fa084c1d68\" (UID: \"be8317a4-6e42-464f-9f80-10fa084c1d68\") "
Feb 02 11:31:48 crc kubenswrapper[4901]: I0202 11:31:48.612809 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be8317a4-6e42-464f-9f80-10fa084c1d68-combined-ca-bundle\") pod \"be8317a4-6e42-464f-9f80-10fa084c1d68\" (UID: \"be8317a4-6e42-464f-9f80-10fa084c1d68\") "
Feb 02 11:31:48 crc kubenswrapper[4901]: I0202 11:31:48.613527 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wwlt\" (UniqueName: \"kubernetes.io/projected/be8317a4-6e42-464f-9f80-10fa084c1d68-kube-api-access-5wwlt\") on node \"crc\" DevicePath \"\""
Feb 02 11:31:48 crc kubenswrapper[4901]: I0202 11:31:48.613653 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be8317a4-6e42-464f-9f80-10fa084c1d68-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "be8317a4-6e42-464f-9f80-10fa084c1d68" (UID: "be8317a4-6e42-464f-9f80-10fa084c1d68"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 11:31:48 crc kubenswrapper[4901]: I0202 11:31:48.619244 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be8317a4-6e42-464f-9f80-10fa084c1d68-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be8317a4-6e42-464f-9f80-10fa084c1d68" (UID: "be8317a4-6e42-464f-9f80-10fa084c1d68"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 11:31:48 crc kubenswrapper[4901]: I0202 11:31:48.620590 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be8317a4-6e42-464f-9f80-10fa084c1d68-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "be8317a4-6e42-464f-9f80-10fa084c1d68" (UID: "be8317a4-6e42-464f-9f80-10fa084c1d68"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 11:31:48 crc kubenswrapper[4901]: I0202 11:31:48.677006 4901 scope.go:117] "RemoveContainer" containerID="57fa2da00216833a4d6895d268a4adace1de13c875f7256b0e6997572d9f7c2a"
Feb 02 11:31:48 crc kubenswrapper[4901]: E0202 11:31:48.677369 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6"
Feb 02 11:31:48 crc kubenswrapper[4901]: I0202 11:31:48.717874 4901 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/be8317a4-6e42-464f-9f80-10fa084c1d68-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Feb 02 11:31:48 crc kubenswrapper[4901]: I0202 11:31:48.718249 4901 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/be8317a4-6e42-464f-9f80-10fa084c1d68-openstack-config\") on node \"crc\" DevicePath \"\""
Feb 02 11:31:48 crc kubenswrapper[4901]: I0202 11:31:48.718261 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be8317a4-6e42-464f-9f80-10fa084c1d68-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 11:31:49 crc kubenswrapper[4901]: I0202 11:31:49.161766 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Feb 02 11:31:49 crc kubenswrapper[4901]: W0202 11:31:49.170338 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0d18719_d65d_4624_9696_b876ab4b3e85.slice/crio-960fa5122bde115f8bb42628b6f9f640a07a8f768a6090671509ccd0963e966c WatchSource:0}: Error finding container 960fa5122bde115f8bb42628b6f9f640a07a8f768a6090671509ccd0963e966c: Status 404 returned error can't find the container with id 960fa5122bde115f8bb42628b6f9f640a07a8f768a6090671509ccd0963e966c
Feb 02 11:31:49 crc kubenswrapper[4901]: I0202 11:31:49.347149 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Feb 02 11:31:49 crc kubenswrapper[4901]: I0202 11:31:49.348076 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="a6911cb1-5431-45cc-8efe-ba2cef8fb5a7" containerName="prometheus" containerID="cri-o://047f3649c5c5787aa520a73a40f7ce131e67c6a1a52620037727d85174245b90" gracePeriod=600
Feb 02 11:31:49 crc kubenswrapper[4901]: I0202 11:31:49.348454 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="a6911cb1-5431-45cc-8efe-ba2cef8fb5a7" containerName="thanos-sidecar" containerID="cri-o://0eee8131bcb4c21cfac93bfbc5248460678947773bc3ec462fcf9f6656597d22" gracePeriod=600
Feb 02 11:31:49 crc kubenswrapper[4901]: I0202 11:31:49.348525 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="a6911cb1-5431-45cc-8efe-ba2cef8fb5a7" containerName="config-reloader" containerID="cri-o://b4a3f3fe695911cc288355743992ecc486d5d617084acb0423e38a635018f8ca" gracePeriod=600
Feb 02 11:31:49 crc kubenswrapper[4901]: I0202 11:31:49.575167 4901 generic.go:334] "Generic (PLEG): container finished" podID="a6911cb1-5431-45cc-8efe-ba2cef8fb5a7" containerID="0eee8131bcb4c21cfac93bfbc5248460678947773bc3ec462fcf9f6656597d22" exitCode=0
Feb 02 11:31:49 crc kubenswrapper[4901]: I0202 11:31:49.575215 4901 generic.go:334] "Generic (PLEG): container finished" podID="a6911cb1-5431-45cc-8efe-ba2cef8fb5a7" containerID="047f3649c5c5787aa520a73a40f7ce131e67c6a1a52620037727d85174245b90" exitCode=0
Feb 02 11:31:49 crc kubenswrapper[4901]: I0202 11:31:49.575262 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7","Type":"ContainerDied","Data":"0eee8131bcb4c21cfac93bfbc5248460678947773bc3ec462fcf9f6656597d22"}
Feb 02 11:31:49 crc kubenswrapper[4901]: I0202 11:31:49.575328 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7","Type":"ContainerDied","Data":"047f3649c5c5787aa520a73a40f7ce131e67c6a1a52620037727d85174245b90"}
Feb 02 11:31:49 crc kubenswrapper[4901]: I0202 11:31:49.577273 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b0d18719-d65d-4624-9696-b876ab4b3e85","Type":"ContainerStarted","Data":"c8a546db62b89d3de65a4baa03a7fc17244ba2b0d0954717e71c28102e60f4f5"}
Feb 02 11:31:49 crc kubenswrapper[4901]: I0202 11:31:49.577346 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b0d18719-d65d-4624-9696-b876ab4b3e85","Type":"ContainerStarted","Data":"960fa5122bde115f8bb42628b6f9f640a07a8f768a6090671509ccd0963e966c"}
Feb 02 11:31:49 crc kubenswrapper[4901]: I0202 11:31:49.577304 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 02 11:31:49 crc kubenswrapper[4901]: I0202 11:31:49.604377 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.604356299 podStartE2EDuration="1.604356299s" podCreationTimestamp="2026-02-02 11:31:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 11:31:49.603209751 +0000 UTC m=+3196.621549847" watchObservedRunningTime="2026-02-02 11:31:49.604356299 +0000 UTC m=+3196.622696395"
Feb 02 11:31:49 crc kubenswrapper[4901]: I0202 11:31:49.609842 4901 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="be8317a4-6e42-464f-9f80-10fa084c1d68" podUID="b0d18719-d65d-4624-9696-b876ab4b3e85"
Feb 02 11:31:49 crc kubenswrapper[4901]: I0202 11:31:49.694393 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be8317a4-6e42-464f-9f80-10fa084c1d68" path="/var/lib/kubelet/pods/be8317a4-6e42-464f-9f80-10fa084c1d68/volumes"
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.185252 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.261866 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-thanos-prometheus-http-client-file\") pod \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\" (UID: \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\") "
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.261930 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsv4d\" (UniqueName: \"kubernetes.io/projected/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-kube-api-access-tsv4d\") pod \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\" (UID: \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\") "
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.262037 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-prometheus-metric-storage-rulefiles-2\") pod \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\" (UID: \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\") "
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.262125 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-tls-assets\") pod \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\" (UID: \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\") "
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.262148 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-web-config\") pod \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\" (UID: \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\") "
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.262267 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-prometheus-metric-storage-rulefiles-1\") pod \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\" (UID: \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\") "
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.262353 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\" (UID: \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\") "
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.262383 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-config-out\") pod \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\" (UID: \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\") "
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.262410 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-config\") pod \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\" (UID: \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\") "
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.262455 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-prometheus-metric-storage-rulefiles-0\") pod \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\" (UID: \"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7\") "
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.263784 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "a6911cb1-5431-45cc-8efe-ba2cef8fb5a7" (UID: "a6911cb1-5431-45cc-8efe-ba2cef8fb5a7"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.264770 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "a6911cb1-5431-45cc-8efe-ba2cef8fb5a7" (UID: "a6911cb1-5431-45cc-8efe-ba2cef8fb5a7"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.265381 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "a6911cb1-5431-45cc-8efe-ba2cef8fb5a7" (UID: "a6911cb1-5431-45cc-8efe-ba2cef8fb5a7"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.271555 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-config" (OuterVolumeSpecName: "config") pod "a6911cb1-5431-45cc-8efe-ba2cef8fb5a7" (UID: "a6911cb1-5431-45cc-8efe-ba2cef8fb5a7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.271994 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-config-out" (OuterVolumeSpecName: "config-out") pod "a6911cb1-5431-45cc-8efe-ba2cef8fb5a7" (UID: "a6911cb1-5431-45cc-8efe-ba2cef8fb5a7"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.274532 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "a6911cb1-5431-45cc-8efe-ba2cef8fb5a7" (UID: "a6911cb1-5431-45cc-8efe-ba2cef8fb5a7"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.275926 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-kube-api-access-tsv4d" (OuterVolumeSpecName: "kube-api-access-tsv4d") pod "a6911cb1-5431-45cc-8efe-ba2cef8fb5a7" (UID: "a6911cb1-5431-45cc-8efe-ba2cef8fb5a7"). InnerVolumeSpecName "kube-api-access-tsv4d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.276056 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "a6911cb1-5431-45cc-8efe-ba2cef8fb5a7" (UID: "a6911cb1-5431-45cc-8efe-ba2cef8fb5a7"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.280661 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "a6911cb1-5431-45cc-8efe-ba2cef8fb5a7" (UID: "a6911cb1-5431-45cc-8efe-ba2cef8fb5a7"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.303849 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-web-config" (OuterVolumeSpecName: "web-config") pod "a6911cb1-5431-45cc-8efe-ba2cef8fb5a7" (UID: "a6911cb1-5431-45cc-8efe-ba2cef8fb5a7"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.367029 4901 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-tls-assets\") on node \"crc\" DevicePath \"\""
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.367082 4901 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-web-config\") on node \"crc\" DevicePath \"\""
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.367097 4901 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\""
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.367142 4901 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" "
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.367154 4901 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-config-out\") on node \"crc\" DevicePath \"\""
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.367187 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-config\") on node \"crc\" DevicePath \"\""
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.367197 4901 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\""
\"\"" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.367222 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsv4d\" (UniqueName: \"kubernetes.io/projected/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-kube-api-access-tsv4d\") on node \"crc\" DevicePath \"\"" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.367232 4901 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.383408 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.399615 4901 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.468626 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7ctt\" (UniqueName: \"kubernetes.io/projected/ab379047-35d7-4cd8-b64c-bf91cf2e25b7-kube-api-access-h7ctt\") pod \"ab379047-35d7-4cd8-b64c-bf91cf2e25b7\" (UID: \"ab379047-35d7-4cd8-b64c-bf91cf2e25b7\") " Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.468821 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ab379047-35d7-4cd8-b64c-bf91cf2e25b7-openstack-config\") pod \"ab379047-35d7-4cd8-b64c-bf91cf2e25b7\" (UID: \"ab379047-35d7-4cd8-b64c-bf91cf2e25b7\") " Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.468903 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ab379047-35d7-4cd8-b64c-bf91cf2e25b7-openstack-config-secret\") pod \"ab379047-35d7-4cd8-b64c-bf91cf2e25b7\" (UID: \"ab379047-35d7-4cd8-b64c-bf91cf2e25b7\") " Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.469476 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab379047-35d7-4cd8-b64c-bf91cf2e25b7-combined-ca-bundle\") pod \"ab379047-35d7-4cd8-b64c-bf91cf2e25b7\" (UID: \"ab379047-35d7-4cd8-b64c-bf91cf2e25b7\") " Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.470174 4901 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.472952 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab379047-35d7-4cd8-b64c-bf91cf2e25b7-kube-api-access-h7ctt" (OuterVolumeSpecName: "kube-api-access-h7ctt") pod "ab379047-35d7-4cd8-b64c-bf91cf2e25b7" (UID: "ab379047-35d7-4cd8-b64c-bf91cf2e25b7"). InnerVolumeSpecName "kube-api-access-h7ctt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.504635 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab379047-35d7-4cd8-b64c-bf91cf2e25b7-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "ab379047-35d7-4cd8-b64c-bf91cf2e25b7" (UID: "ab379047-35d7-4cd8-b64c-bf91cf2e25b7"). 
InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.528605 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab379047-35d7-4cd8-b64c-bf91cf2e25b7-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "ab379047-35d7-4cd8-b64c-bf91cf2e25b7" (UID: "ab379047-35d7-4cd8-b64c-bf91cf2e25b7"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.539424 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab379047-35d7-4cd8-b64c-bf91cf2e25b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab379047-35d7-4cd8-b64c-bf91cf2e25b7" (UID: "ab379047-35d7-4cd8-b64c-bf91cf2e25b7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.572713 4901 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ab379047-35d7-4cd8-b64c-bf91cf2e25b7-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.572960 4901 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab379047-35d7-4cd8-b64c-bf91cf2e25b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.572974 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7ctt\" (UniqueName: \"kubernetes.io/projected/ab379047-35d7-4cd8-b64c-bf91cf2e25b7-kube-api-access-h7ctt\") on node \"crc\" DevicePath \"\"" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.572985 4901 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ab379047-35d7-4cd8-b64c-bf91cf2e25b7-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.588214 4901 generic.go:334] "Generic (PLEG): container finished" podID="a6911cb1-5431-45cc-8efe-ba2cef8fb5a7" containerID="b4a3f3fe695911cc288355743992ecc486d5d617084acb0423e38a635018f8ca" exitCode=0 Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.588276 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7","Type":"ContainerDied","Data":"b4a3f3fe695911cc288355743992ecc486d5d617084acb0423e38a635018f8ca"} Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.588322 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.588338 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a6911cb1-5431-45cc-8efe-ba2cef8fb5a7","Type":"ContainerDied","Data":"537af98a50387bb58442d8770e9adf933011c82b84238c34c0d6dc960c8aa17f"} Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.588361 4901 scope.go:117] "RemoveContainer" containerID="0eee8131bcb4c21cfac93bfbc5248460678947773bc3ec462fcf9f6656597d22" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.592701 4901 generic.go:334] "Generic (PLEG): container finished" podID="ab379047-35d7-4cd8-b64c-bf91cf2e25b7" containerID="6135bc1c149fd6383bc8d12c9bd90d74123cb7c94090f3308816dc4b52447782" exitCode=137 Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.593945 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.616006 4901 scope.go:117] "RemoveContainer" containerID="b4a3f3fe695911cc288355743992ecc486d5d617084acb0423e38a635018f8ca" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.638496 4901 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="ab379047-35d7-4cd8-b64c-bf91cf2e25b7" podUID="b0d18719-d65d-4624-9696-b876ab4b3e85" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.645259 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.648486 4901 scope.go:117] "RemoveContainer" containerID="047f3649c5c5787aa520a73a40f7ce131e67c6a1a52620037727d85174245b90" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.661285 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.698685 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 02 11:31:50 crc kubenswrapper[4901]: E0202 11:31:50.699319 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6911cb1-5431-45cc-8efe-ba2cef8fb5a7" containerName="thanos-sidecar" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.699333 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6911cb1-5431-45cc-8efe-ba2cef8fb5a7" containerName="thanos-sidecar" Feb 02 11:31:50 crc kubenswrapper[4901]: E0202 11:31:50.699402 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6911cb1-5431-45cc-8efe-ba2cef8fb5a7" containerName="prometheus" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.699412 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6911cb1-5431-45cc-8efe-ba2cef8fb5a7" containerName="prometheus" Feb 02 11:31:50 crc kubenswrapper[4901]: E0202 11:31:50.699423 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6911cb1-5431-45cc-8efe-ba2cef8fb5a7" containerName="config-reloader" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.699430 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6911cb1-5431-45cc-8efe-ba2cef8fb5a7" containerName="config-reloader" Feb 02 11:31:50 crc kubenswrapper[4901]: E0202 11:31:50.699462 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6911cb1-5431-45cc-8efe-ba2cef8fb5a7" containerName="init-config-reloader" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 
11:31:50.699469 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6911cb1-5431-45cc-8efe-ba2cef8fb5a7" containerName="init-config-reloader" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.699708 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6911cb1-5431-45cc-8efe-ba2cef8fb5a7" containerName="thanos-sidecar" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.699720 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6911cb1-5431-45cc-8efe-ba2cef8fb5a7" containerName="config-reloader" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.699747 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6911cb1-5431-45cc-8efe-ba2cef8fb5a7" containerName="prometheus" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.702053 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.705984 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.705997 4901 scope.go:117] "RemoveContainer" containerID="f42e48054d21caeeaf9915cc659a88916ae4726d76a2ecad8b377cc1882fee7a" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.711936 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.715208 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.715213 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.715388 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.715552 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-kl9pb" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.718676 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.718739 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.720842 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.722436 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.784677 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-config\") pod \"prometheus-metric-storage-0\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.784768 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.784800 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.784825 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.784875 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.784914 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.784943 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.784963 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.784985 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"prometheus-metric-storage-0\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.785079 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.785120 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.785145 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9p7l\" (UniqueName: \"kubernetes.io/projected/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-kube-api-access-l9p7l\") pod \"prometheus-metric-storage-0\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.785168 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.809752 4901 scope.go:117] "RemoveContainer" containerID="0eee8131bcb4c21cfac93bfbc5248460678947773bc3ec462fcf9f6656597d22" Feb 02 11:31:50 crc kubenswrapper[4901]: E0202 11:31:50.810328 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0eee8131bcb4c21cfac93bfbc5248460678947773bc3ec462fcf9f6656597d22\": container with ID starting with 0eee8131bcb4c21cfac93bfbc5248460678947773bc3ec462fcf9f6656597d22 not found: ID does not exist" containerID="0eee8131bcb4c21cfac93bfbc5248460678947773bc3ec462fcf9f6656597d22" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.810377 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eee8131bcb4c21cfac93bfbc5248460678947773bc3ec462fcf9f6656597d22"} err="failed to get container status \"0eee8131bcb4c21cfac93bfbc5248460678947773bc3ec462fcf9f6656597d22\": rpc error: code = NotFound desc = could not find container \"0eee8131bcb4c21cfac93bfbc5248460678947773bc3ec462fcf9f6656597d22\": container with ID starting with 0eee8131bcb4c21cfac93bfbc5248460678947773bc3ec462fcf9f6656597d22 not found: ID does not exist" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.810408 4901 scope.go:117] "RemoveContainer" containerID="b4a3f3fe695911cc288355743992ecc486d5d617084acb0423e38a635018f8ca" Feb 02 11:31:50 crc kubenswrapper[4901]: E0202 11:31:50.813909 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4a3f3fe695911cc288355743992ecc486d5d617084acb0423e38a635018f8ca\": container with ID starting with b4a3f3fe695911cc288355743992ecc486d5d617084acb0423e38a635018f8ca not found: ID does not exist" containerID="b4a3f3fe695911cc288355743992ecc486d5d617084acb0423e38a635018f8ca" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.813949 4901 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b4a3f3fe695911cc288355743992ecc486d5d617084acb0423e38a635018f8ca"} err="failed to get container status \"b4a3f3fe695911cc288355743992ecc486d5d617084acb0423e38a635018f8ca\": rpc error: code = NotFound desc = could not find container \"b4a3f3fe695911cc288355743992ecc486d5d617084acb0423e38a635018f8ca\": container with ID starting with b4a3f3fe695911cc288355743992ecc486d5d617084acb0423e38a635018f8ca not found: ID does not exist" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.813975 4901 scope.go:117] "RemoveContainer" containerID="047f3649c5c5787aa520a73a40f7ce131e67c6a1a52620037727d85174245b90" Feb 02 11:31:50 crc kubenswrapper[4901]: E0202 11:31:50.818812 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"047f3649c5c5787aa520a73a40f7ce131e67c6a1a52620037727d85174245b90\": container with ID starting with 047f3649c5c5787aa520a73a40f7ce131e67c6a1a52620037727d85174245b90 not found: ID does not exist" containerID="047f3649c5c5787aa520a73a40f7ce131e67c6a1a52620037727d85174245b90" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.818883 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"047f3649c5c5787aa520a73a40f7ce131e67c6a1a52620037727d85174245b90"} err="failed to get container status \"047f3649c5c5787aa520a73a40f7ce131e67c6a1a52620037727d85174245b90\": rpc error: code = NotFound desc = could not find container \"047f3649c5c5787aa520a73a40f7ce131e67c6a1a52620037727d85174245b90\": container with ID starting with 047f3649c5c5787aa520a73a40f7ce131e67c6a1a52620037727d85174245b90 not found: ID does not exist" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.818924 4901 scope.go:117] "RemoveContainer" containerID="f42e48054d21caeeaf9915cc659a88916ae4726d76a2ecad8b377cc1882fee7a" Feb 02 11:31:50 crc kubenswrapper[4901]: E0202 11:31:50.823743 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f42e48054d21caeeaf9915cc659a88916ae4726d76a2ecad8b377cc1882fee7a\": container with ID starting with f42e48054d21caeeaf9915cc659a88916ae4726d76a2ecad8b377cc1882fee7a not found: ID does not exist" containerID="f42e48054d21caeeaf9915cc659a88916ae4726d76a2ecad8b377cc1882fee7a" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.823790 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f42e48054d21caeeaf9915cc659a88916ae4726d76a2ecad8b377cc1882fee7a"} err="failed to get container status \"f42e48054d21caeeaf9915cc659a88916ae4726d76a2ecad8b377cc1882fee7a\": rpc error: code = NotFound desc = could not find container \"f42e48054d21caeeaf9915cc659a88916ae4726d76a2ecad8b377cc1882fee7a\": container with ID starting with f42e48054d21caeeaf9915cc659a88916ae4726d76a2ecad8b377cc1882fee7a not found: ID does not exist" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.823827 4901 scope.go:117] "RemoveContainer" containerID="6135bc1c149fd6383bc8d12c9bd90d74123cb7c94090f3308816dc4b52447782" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.881769 4901 scope.go:117] "RemoveContainer" containerID="6135bc1c149fd6383bc8d12c9bd90d74123cb7c94090f3308816dc4b52447782" Feb 02 11:31:50 crc kubenswrapper[4901]: E0202 11:31:50.885707 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6135bc1c149fd6383bc8d12c9bd90d74123cb7c94090f3308816dc4b52447782\": 
Feb 02 11:31:50 crc kubenswrapper[4901]: E0202 11:31:50.885707 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6135bc1c149fd6383bc8d12c9bd90d74123cb7c94090f3308816dc4b52447782\": container with ID starting with 6135bc1c149fd6383bc8d12c9bd90d74123cb7c94090f3308816dc4b52447782 not found: ID does not exist" containerID="6135bc1c149fd6383bc8d12c9bd90d74123cb7c94090f3308816dc4b52447782"
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.885752 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6135bc1c149fd6383bc8d12c9bd90d74123cb7c94090f3308816dc4b52447782"} err="failed to get container status \"6135bc1c149fd6383bc8d12c9bd90d74123cb7c94090f3308816dc4b52447782\": rpc error: code = NotFound desc = could not find container \"6135bc1c149fd6383bc8d12c9bd90d74123cb7c94090f3308816dc4b52447782\": container with ID starting with 6135bc1c149fd6383bc8d12c9bd90d74123cb7c94090f3308816dc4b52447782 not found: ID does not exist"
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.895903 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.895961 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.895993 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.896011 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.896036 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"prometheus-metric-storage-0\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.896101 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.896130 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.896152 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9p7l\" (UniqueName: \"kubernetes.io/projected/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-kube-api-access-l9p7l\") pod \"prometheus-metric-storage-0\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.896177 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.896200 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-config\") pod \"prometheus-metric-storage-0\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.896240 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.896263 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.896289 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.898460 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.902773 4901 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"prometheus-metric-storage-0\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.902822 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.902924 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.908933 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.910765 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.911426 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.914014 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.918370 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.919120 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.927544 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-config\") pod \"prometheus-metric-storage-0\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") "
pod="openstack/prometheus-metric-storage-0" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.946447 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9p7l\" (UniqueName: \"kubernetes.io/projected/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-kube-api-access-l9p7l\") pod \"prometheus-metric-storage-0\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.947339 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:31:50 crc kubenswrapper[4901]: I0202 11:31:50.977851 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"prometheus-metric-storage-0\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:31:51 crc kubenswrapper[4901]: I0202 11:31:51.068482 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 02 11:31:51 crc kubenswrapper[4901]: W0202 11:31:51.618531 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod437ed0bb_bdbf_4b88_b41a_e01830c4bde2.slice/crio-8dd4b8411287975b2f99039c978dc144cdb146dcd3f4a1a3348acb267e0fa74a WatchSource:0}: Error finding container 8dd4b8411287975b2f99039c978dc144cdb146dcd3f4a1a3348acb267e0fa74a: Status 404 returned error can't find the container with id 8dd4b8411287975b2f99039c978dc144cdb146dcd3f4a1a3348acb267e0fa74a Feb 02 11:31:51 crc kubenswrapper[4901]: I0202 11:31:51.620958 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 02 11:31:51 crc kubenswrapper[4901]: I0202 11:31:51.694968 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6911cb1-5431-45cc-8efe-ba2cef8fb5a7" path="/var/lib/kubelet/pods/a6911cb1-5431-45cc-8efe-ba2cef8fb5a7/volumes" Feb 02 11:31:51 crc kubenswrapper[4901]: I0202 11:31:51.696448 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab379047-35d7-4cd8-b64c-bf91cf2e25b7" path="/var/lib/kubelet/pods/ab379047-35d7-4cd8-b64c-bf91cf2e25b7/volumes" Feb 02 11:31:52 crc kubenswrapper[4901]: I0202 11:31:52.623181 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"437ed0bb-bdbf-4b88-b41a-e01830c4bde2","Type":"ContainerStarted","Data":"8dd4b8411287975b2f99039c978dc144cdb146dcd3f4a1a3348acb267e0fa74a"} Feb 02 11:31:55 crc kubenswrapper[4901]: I0202 11:31:55.652513 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"437ed0bb-bdbf-4b88-b41a-e01830c4bde2","Type":"ContainerStarted","Data":"d0c06c406fb1f8be00e67b4e511a85dadcffdad1a8f7fa2cc616b575b7493749"} Feb 02 11:32:01 crc kubenswrapper[4901]: I0202 11:32:01.677984 4901 scope.go:117] "RemoveContainer" containerID="57fa2da00216833a4d6895d268a4adace1de13c875f7256b0e6997572d9f7c2a" Feb 02 11:32:01 crc kubenswrapper[4901]: E0202 11:32:01.683145 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:32:03 crc kubenswrapper[4901]: I0202 11:32:03.755852 4901 generic.go:334] "Generic (PLEG): container finished" podID="437ed0bb-bdbf-4b88-b41a-e01830c4bde2" containerID="d0c06c406fb1f8be00e67b4e511a85dadcffdad1a8f7fa2cc616b575b7493749" exitCode=0 Feb 02 11:32:03 crc kubenswrapper[4901]: I0202 11:32:03.755962 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"437ed0bb-bdbf-4b88-b41a-e01830c4bde2","Type":"ContainerDied","Data":"d0c06c406fb1f8be00e67b4e511a85dadcffdad1a8f7fa2cc616b575b7493749"} Feb 02 11:32:04 crc kubenswrapper[4901]: I0202 11:32:04.769315 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"437ed0bb-bdbf-4b88-b41a-e01830c4bde2","Type":"ContainerStarted","Data":"7eeb3c119cedf2f2642c2e38fb65a274bc8f6dd2c8f81063181d49339e666496"} Feb 02 11:32:08 crc kubenswrapper[4901]: I0202 11:32:08.819248 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"437ed0bb-bdbf-4b88-b41a-e01830c4bde2","Type":"ContainerStarted","Data":"abcd9a62b554672ea76101328c1e2ec1fc44383cf2b72378086bbddaff330b12"} Feb 02 11:32:08 crc kubenswrapper[4901]: I0202 11:32:08.821226 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"437ed0bb-bdbf-4b88-b41a-e01830c4bde2","Type":"ContainerStarted","Data":"e15602c770926172a18ca34547863f3ec85282a51a7711108c9b76591eb3adf0"} Feb 02 11:32:08 crc kubenswrapper[4901]: I0202 11:32:08.854006 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=18.853976064 podStartE2EDuration="18.853976064s" podCreationTimestamp="2026-02-02 11:31:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 11:32:08.847719573 +0000 UTC m=+3215.866059699" watchObservedRunningTime="2026-02-02 11:32:08.853976064 +0000 UTC m=+3215.872316160" Feb 02 11:32:11 crc kubenswrapper[4901]: I0202 11:32:11.069968 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 02 11:32:15 crc kubenswrapper[4901]: I0202 11:32:15.678257 4901 scope.go:117] "RemoveContainer" containerID="57fa2da00216833a4d6895d268a4adace1de13c875f7256b0e6997572d9f7c2a" Feb 02 11:32:15 crc kubenswrapper[4901]: E0202 11:32:15.679335 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:32:21 crc kubenswrapper[4901]: I0202 11:32:21.069271 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 02 11:32:21 crc kubenswrapper[4901]: I0202 11:32:21.078377 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/prometheus-metric-storage-0" Feb 02 11:32:21 crc kubenswrapper[4901]: I0202 11:32:21.979287 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 02 11:32:28 crc kubenswrapper[4901]: I0202 11:32:28.677289 4901 scope.go:117] "RemoveContainer" containerID="57fa2da00216833a4d6895d268a4adace1de13c875f7256b0e6997572d9f7c2a" Feb 02 11:32:28 crc kubenswrapper[4901]: E0202 11:32:28.679764 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:32:42 crc kubenswrapper[4901]: I0202 11:32:42.677234 4901 scope.go:117] "RemoveContainer" containerID="57fa2da00216833a4d6895d268a4adace1de13c875f7256b0e6997572d9f7c2a" Feb 02 11:32:42 crc kubenswrapper[4901]: E0202 11:32:42.678311 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:32:53 crc kubenswrapper[4901]: I0202 11:32:53.689060 4901 scope.go:117] "RemoveContainer" containerID="57fa2da00216833a4d6895d268a4adace1de13c875f7256b0e6997572d9f7c2a" Feb 02 11:32:53 crc kubenswrapper[4901]: E0202 11:32:53.690715 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:33:06 crc kubenswrapper[4901]: I0202 11:33:06.678065 4901 scope.go:117] "RemoveContainer" containerID="57fa2da00216833a4d6895d268a4adace1de13c875f7256b0e6997572d9f7c2a" Feb 02 11:33:06 crc kubenswrapper[4901]: E0202 11:33:06.679248 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:33:17 crc kubenswrapper[4901]: I0202 11:33:17.677909 4901 scope.go:117] "RemoveContainer" containerID="57fa2da00216833a4d6895d268a4adace1de13c875f7256b0e6997572d9f7c2a" Feb 02 11:33:17 crc kubenswrapper[4901]: E0202 11:33:17.679042 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" 
podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:33:30 crc kubenswrapper[4901]: I0202 11:33:30.676707 4901 scope.go:117] "RemoveContainer" containerID="57fa2da00216833a4d6895d268a4adace1de13c875f7256b0e6997572d9f7c2a" Feb 02 11:33:30 crc kubenswrapper[4901]: E0202 11:33:30.677782 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:33:41 crc kubenswrapper[4901]: I0202 11:33:41.678160 4901 scope.go:117] "RemoveContainer" containerID="57fa2da00216833a4d6895d268a4adace1de13c875f7256b0e6997572d9f7c2a" Feb 02 11:33:41 crc kubenswrapper[4901]: E0202 11:33:41.679078 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:33:53 crc kubenswrapper[4901]: I0202 11:33:53.687412 4901 scope.go:117] "RemoveContainer" containerID="57fa2da00216833a4d6895d268a4adace1de13c875f7256b0e6997572d9f7c2a" Feb 02 11:33:53 crc kubenswrapper[4901]: E0202 11:33:53.688919 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:34:08 crc kubenswrapper[4901]: I0202 11:34:08.676850 4901 scope.go:117] "RemoveContainer" containerID="57fa2da00216833a4d6895d268a4adace1de13c875f7256b0e6997572d9f7c2a" Feb 02 11:34:08 crc kubenswrapper[4901]: E0202 11:34:08.678007 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:34:23 crc kubenswrapper[4901]: I0202 11:34:23.684773 4901 scope.go:117] "RemoveContainer" containerID="57fa2da00216833a4d6895d268a4adace1de13c875f7256b0e6997572d9f7c2a" Feb 02 11:34:23 crc kubenswrapper[4901]: E0202 11:34:23.685930 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:34:26 crc kubenswrapper[4901]: I0202 11:34:26.733526 4901 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-84dbcd4d6-strlk_9c2b3b8f-1088-4696-899e-d0d5c3f2bf2d/manager/0.log" Feb 02 11:34:29 crc kubenswrapper[4901]: I0202 11:34:29.350895 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 02 11:34:29 crc kubenswrapper[4901]: I0202 11:34:29.352145 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="437ed0bb-bdbf-4b88-b41a-e01830c4bde2" containerName="prometheus" containerID="cri-o://7eeb3c119cedf2f2642c2e38fb65a274bc8f6dd2c8f81063181d49339e666496" gracePeriod=600 Feb 02 11:34:29 crc kubenswrapper[4901]: I0202 11:34:29.352266 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="437ed0bb-bdbf-4b88-b41a-e01830c4bde2" containerName="thanos-sidecar" containerID="cri-o://abcd9a62b554672ea76101328c1e2ec1fc44383cf2b72378086bbddaff330b12" gracePeriod=600 Feb 02 11:34:29 crc kubenswrapper[4901]: I0202 11:34:29.352315 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="437ed0bb-bdbf-4b88-b41a-e01830c4bde2" containerName="config-reloader" containerID="cri-o://e15602c770926172a18ca34547863f3ec85282a51a7711108c9b76591eb3adf0" gracePeriod=600 Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.233727 4901 generic.go:334] "Generic (PLEG): container finished" podID="437ed0bb-bdbf-4b88-b41a-e01830c4bde2" containerID="abcd9a62b554672ea76101328c1e2ec1fc44383cf2b72378086bbddaff330b12" exitCode=0 Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.234273 4901 generic.go:334] "Generic (PLEG): container finished" podID="437ed0bb-bdbf-4b88-b41a-e01830c4bde2" containerID="e15602c770926172a18ca34547863f3ec85282a51a7711108c9b76591eb3adf0" exitCode=0 Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.233798 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"437ed0bb-bdbf-4b88-b41a-e01830c4bde2","Type":"ContainerDied","Data":"abcd9a62b554672ea76101328c1e2ec1fc44383cf2b72378086bbddaff330b12"} Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.234583 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"437ed0bb-bdbf-4b88-b41a-e01830c4bde2","Type":"ContainerDied","Data":"e15602c770926172a18ca34547863f3ec85282a51a7711108c9b76591eb3adf0"} Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.234602 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"437ed0bb-bdbf-4b88-b41a-e01830c4bde2","Type":"ContainerDied","Data":"7eeb3c119cedf2f2642c2e38fb65a274bc8f6dd2c8f81063181d49339e666496"} Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.234527 4901 generic.go:334] "Generic (PLEG): container finished" podID="437ed0bb-bdbf-4b88-b41a-e01830c4bde2" containerID="7eeb3c119cedf2f2642c2e38fb65a274bc8f6dd2c8f81063181d49339e666496" exitCode=0 Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.385147 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.463077 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-config\") pod \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.463231 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-prometheus-metric-storage-rulefiles-2\") pod \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.463263 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.463300 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.463480 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-tls-assets\") pod \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.463546 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-prometheus-metric-storage-rulefiles-1\") pod \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.463592 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9p7l\" (UniqueName: \"kubernetes.io/projected/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-kube-api-access-l9p7l\") pod \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.463659 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-web-config\") pod \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.463688 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 
11:34:30.463726 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-config-out\") pod \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.463820 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-prometheus-metric-storage-rulefiles-0\") pod \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.463867 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-secret-combined-ca-bundle\") pod \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.463920 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-thanos-prometheus-http-client-file\") pod \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\" (UID: \"437ed0bb-bdbf-4b88-b41a-e01830c4bde2\") " Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.466510 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "437ed0bb-bdbf-4b88-b41a-e01830c4bde2" (UID: "437ed0bb-bdbf-4b88-b41a-e01830c4bde2"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.466540 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "437ed0bb-bdbf-4b88-b41a-e01830c4bde2" (UID: "437ed0bb-bdbf-4b88-b41a-e01830c4bde2"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.466655 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "437ed0bb-bdbf-4b88-b41a-e01830c4bde2" (UID: "437ed0bb-bdbf-4b88-b41a-e01830c4bde2"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.492353 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-config-out" (OuterVolumeSpecName: "config-out") pod "437ed0bb-bdbf-4b88-b41a-e01830c4bde2" (UID: "437ed0bb-bdbf-4b88-b41a-e01830c4bde2"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.492397 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-config" (OuterVolumeSpecName: "config") pod "437ed0bb-bdbf-4b88-b41a-e01830c4bde2" (UID: "437ed0bb-bdbf-4b88-b41a-e01830c4bde2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.492582 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "437ed0bb-bdbf-4b88-b41a-e01830c4bde2" (UID: "437ed0bb-bdbf-4b88-b41a-e01830c4bde2"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.495175 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "437ed0bb-bdbf-4b88-b41a-e01830c4bde2" (UID: "437ed0bb-bdbf-4b88-b41a-e01830c4bde2"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.495675 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "437ed0bb-bdbf-4b88-b41a-e01830c4bde2" (UID: "437ed0bb-bdbf-4b88-b41a-e01830c4bde2"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.495799 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-kube-api-access-l9p7l" (OuterVolumeSpecName: "kube-api-access-l9p7l") pod "437ed0bb-bdbf-4b88-b41a-e01830c4bde2" (UID: "437ed0bb-bdbf-4b88-b41a-e01830c4bde2"). InnerVolumeSpecName "kube-api-access-l9p7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.503036 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "437ed0bb-bdbf-4b88-b41a-e01830c4bde2" (UID: "437ed0bb-bdbf-4b88-b41a-e01830c4bde2"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.513838 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "437ed0bb-bdbf-4b88-b41a-e01830c4bde2" (UID: "437ed0bb-bdbf-4b88-b41a-e01830c4bde2"). InnerVolumeSpecName "secret-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.516810 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "437ed0bb-bdbf-4b88-b41a-e01830c4bde2" (UID: "437ed0bb-bdbf-4b88-b41a-e01830c4bde2"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.569526 4901 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.569593 4901 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.569609 4901 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.569620 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-config\") on node \"crc\" DevicePath \"\"" Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.569632 4901 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.569648 4901 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.569699 4901 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.569716 4901 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-tls-assets\") on node \"crc\" DevicePath \"\"" Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.569727 4901 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.569739 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9p7l\" (UniqueName: \"kubernetes.io/projected/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-kube-api-access-l9p7l\") on node 
\"crc\" DevicePath \"\"" Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.569750 4901 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.569760 4901 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-config-out\") on node \"crc\" DevicePath \"\"" Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.637859 4901 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.664311 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-web-config" (OuterVolumeSpecName: "web-config") pod "437ed0bb-bdbf-4b88-b41a-e01830c4bde2" (UID: "437ed0bb-bdbf-4b88-b41a-e01830c4bde2"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.672144 4901 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/437ed0bb-bdbf-4b88-b41a-e01830c4bde2-web-config\") on node \"crc\" DevicePath \"\"" Feb 02 11:34:30 crc kubenswrapper[4901]: I0202 11:34:30.672192 4901 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Feb 02 11:34:31 crc kubenswrapper[4901]: I0202 11:34:31.247831 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"437ed0bb-bdbf-4b88-b41a-e01830c4bde2","Type":"ContainerDied","Data":"8dd4b8411287975b2f99039c978dc144cdb146dcd3f4a1a3348acb267e0fa74a"} Feb 02 11:34:31 crc kubenswrapper[4901]: I0202 11:34:31.247890 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:31 crc kubenswrapper[4901]: I0202 11:34:31.247900 4901 scope.go:117] "RemoveContainer" containerID="abcd9a62b554672ea76101328c1e2ec1fc44383cf2b72378086bbddaff330b12" Feb 02 11:34:31 crc kubenswrapper[4901]: I0202 11:34:31.275223 4901 scope.go:117] "RemoveContainer" containerID="e15602c770926172a18ca34547863f3ec85282a51a7711108c9b76591eb3adf0" Feb 02 11:34:31 crc kubenswrapper[4901]: I0202 11:34:31.300782 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 02 11:34:31 crc kubenswrapper[4901]: I0202 11:34:31.307329 4901 scope.go:117] "RemoveContainer" containerID="7eeb3c119cedf2f2642c2e38fb65a274bc8f6dd2c8f81063181d49339e666496" Feb 02 11:34:31 crc kubenswrapper[4901]: I0202 11:34:31.317800 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 02 11:34:31 crc kubenswrapper[4901]: I0202 11:34:31.330148 4901 scope.go:117] "RemoveContainer" containerID="d0c06c406fb1f8be00e67b4e511a85dadcffdad1a8f7fa2cc616b575b7493749" Feb 02 11:34:31 crc kubenswrapper[4901]: I0202 11:34:31.705139 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="437ed0bb-bdbf-4b88-b41a-e01830c4bde2" path="/var/lib/kubelet/pods/437ed0bb-bdbf-4b88-b41a-e01830c4bde2/volumes" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.191743 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 02 11:34:32 crc kubenswrapper[4901]: E0202 11:34:32.192761 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="437ed0bb-bdbf-4b88-b41a-e01830c4bde2" containerName="thanos-sidecar" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.192787 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="437ed0bb-bdbf-4b88-b41a-e01830c4bde2" containerName="thanos-sidecar" Feb 02 11:34:32 crc kubenswrapper[4901]: E0202 11:34:32.192820 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="437ed0bb-bdbf-4b88-b41a-e01830c4bde2" containerName="prometheus" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.192829 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="437ed0bb-bdbf-4b88-b41a-e01830c4bde2" containerName="prometheus" Feb 02 11:34:32 crc kubenswrapper[4901]: E0202 11:34:32.192855 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="437ed0bb-bdbf-4b88-b41a-e01830c4bde2" containerName="config-reloader" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.192864 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="437ed0bb-bdbf-4b88-b41a-e01830c4bde2" containerName="config-reloader" Feb 02 11:34:32 crc kubenswrapper[4901]: E0202 11:34:32.192892 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="437ed0bb-bdbf-4b88-b41a-e01830c4bde2" containerName="init-config-reloader" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.192901 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="437ed0bb-bdbf-4b88-b41a-e01830c4bde2" containerName="init-config-reloader" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.193142 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="437ed0bb-bdbf-4b88-b41a-e01830c4bde2" containerName="config-reloader" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.193169 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="437ed0bb-bdbf-4b88-b41a-e01830c4bde2" containerName="prometheus" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 
11:34:32.193187 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="437ed0bb-bdbf-4b88-b41a-e01830c4bde2" containerName="thanos-sidecar" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.195636 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.200165 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.200431 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.200636 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.200914 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.201112 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.201260 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.201657 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-kl9pb" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.201873 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.225063 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.227409 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.309297 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/dace040f-2ecc-4429-8a62-187c719781bc-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.309521 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/dace040f-2ecc-4429-8a62-187c719781bc-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.309553 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dace040f-2ecc-4429-8a62-187c719781bc-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.309624 4901 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/dace040f-2ecc-4429-8a62-187c719781bc-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.309679 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/dace040f-2ecc-4429-8a62-187c719781bc-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.309729 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/dace040f-2ecc-4429-8a62-187c719781bc-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.309866 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dace040f-2ecc-4429-8a62-187c719781bc-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.310091 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dace040f-2ecc-4429-8a62-187c719781bc-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.310211 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/dace040f-2ecc-4429-8a62-187c719781bc-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.310386 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9f98\" (UniqueName: \"kubernetes.io/projected/dace040f-2ecc-4429-8a62-187c719781bc-kube-api-access-w9f98\") pod \"prometheus-metric-storage-0\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.310439 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/dace040f-2ecc-4429-8a62-187c719781bc-prometheus-metric-storage-db\") pod \"prometheus-metric-storage-0\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.310461 4901 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/dace040f-2ecc-4429-8a62-187c719781bc-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.310657 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dace040f-2ecc-4429-8a62-187c719781bc-config\") pod \"prometheus-metric-storage-0\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.413059 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/dace040f-2ecc-4429-8a62-187c719781bc-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.413108 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/dace040f-2ecc-4429-8a62-187c719781bc-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.413135 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dace040f-2ecc-4429-8a62-187c719781bc-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.413192 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dace040f-2ecc-4429-8a62-187c719781bc-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.413234 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/dace040f-2ecc-4429-8a62-187c719781bc-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.413297 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9f98\" (UniqueName: \"kubernetes.io/projected/dace040f-2ecc-4429-8a62-187c719781bc-kube-api-access-w9f98\") pod \"prometheus-metric-storage-0\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.413327 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: 
\"kubernetes.io/empty-dir/dace040f-2ecc-4429-8a62-187c719781bc-prometheus-metric-storage-db\") pod \"prometheus-metric-storage-0\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.413350 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/dace040f-2ecc-4429-8a62-187c719781bc-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.413382 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dace040f-2ecc-4429-8a62-187c719781bc-config\") pod \"prometheus-metric-storage-0\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.413449 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/dace040f-2ecc-4429-8a62-187c719781bc-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.413528 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/dace040f-2ecc-4429-8a62-187c719781bc-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.413555 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dace040f-2ecc-4429-8a62-187c719781bc-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.413611 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/dace040f-2ecc-4429-8a62-187c719781bc-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.414556 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/dace040f-2ecc-4429-8a62-187c719781bc-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.414768 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/dace040f-2ecc-4429-8a62-187c719781bc-prometheus-metric-storage-db\") pod \"prometheus-metric-storage-0\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.415112 4901 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/dace040f-2ecc-4429-8a62-187c719781bc-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.415449 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/dace040f-2ecc-4429-8a62-187c719781bc-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.420359 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/dace040f-2ecc-4429-8a62-187c719781bc-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.420370 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/dace040f-2ecc-4429-8a62-187c719781bc-config\") pod \"prometheus-metric-storage-0\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.420925 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/dace040f-2ecc-4429-8a62-187c719781bc-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.421318 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/dace040f-2ecc-4429-8a62-187c719781bc-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.423656 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dace040f-2ecc-4429-8a62-187c719781bc-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.424513 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dace040f-2ecc-4429-8a62-187c719781bc-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.426490 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/dace040f-2ecc-4429-8a62-187c719781bc-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: 
\"dace040f-2ecc-4429-8a62-187c719781bc\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.427088 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dace040f-2ecc-4429-8a62-187c719781bc-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.438705 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9f98\" (UniqueName: \"kubernetes.io/projected/dace040f-2ecc-4429-8a62-187c719781bc-kube-api-access-w9f98\") pod \"prometheus-metric-storage-0\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:32 crc kubenswrapper[4901]: I0202 11:34:32.533753 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:33 crc kubenswrapper[4901]: I0202 11:34:33.074540 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 02 11:34:33 crc kubenswrapper[4901]: W0202 11:34:33.090758 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddace040f_2ecc_4429_8a62_187c719781bc.slice/crio-e745e96d203c63b3fe8bff3f87fcc5606140b8099a4c3e5c9c5b6018bf77140b WatchSource:0}: Error finding container e745e96d203c63b3fe8bff3f87fcc5606140b8099a4c3e5c9c5b6018bf77140b: Status 404 returned error can't find the container with id e745e96d203c63b3fe8bff3f87fcc5606140b8099a4c3e5c9c5b6018bf77140b Feb 02 11:34:33 crc kubenswrapper[4901]: I0202 11:34:33.284994 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"dace040f-2ecc-4429-8a62-187c719781bc","Type":"ContainerStarted","Data":"e745e96d203c63b3fe8bff3f87fcc5606140b8099a4c3e5c9c5b6018bf77140b"} Feb 02 11:34:37 crc kubenswrapper[4901]: I0202 11:34:37.329494 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"dace040f-2ecc-4429-8a62-187c719781bc","Type":"ContainerStarted","Data":"02a2b9d039704382c3774f513fdae1b4a5a2e322c3db6cf9bacad654bd63116b"} Feb 02 11:34:37 crc kubenswrapper[4901]: I0202 11:34:37.677289 4901 scope.go:117] "RemoveContainer" containerID="57fa2da00216833a4d6895d268a4adace1de13c875f7256b0e6997572d9f7c2a" Feb 02 11:34:37 crc kubenswrapper[4901]: E0202 11:34:37.677988 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:34:44 crc kubenswrapper[4901]: I0202 11:34:44.419508 4901 generic.go:334] "Generic (PLEG): container finished" podID="dace040f-2ecc-4429-8a62-187c719781bc" containerID="02a2b9d039704382c3774f513fdae1b4a5a2e322c3db6cf9bacad654bd63116b" exitCode=0 Feb 02 11:34:44 crc kubenswrapper[4901]: I0202 11:34:44.419668 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"dace040f-2ecc-4429-8a62-187c719781bc","Type":"ContainerDied","Data":"02a2b9d039704382c3774f513fdae1b4a5a2e322c3db6cf9bacad654bd63116b"} Feb 02 11:34:45 crc kubenswrapper[4901]: I0202 11:34:45.432315 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"dace040f-2ecc-4429-8a62-187c719781bc","Type":"ContainerStarted","Data":"004dce19b1d6b44ebfe04d914c139900b5ed50a7fc36eb317cc6a90ed27b7c72"} Feb 02 11:34:49 crc kubenswrapper[4901]: I0202 11:34:49.486441 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"dace040f-2ecc-4429-8a62-187c719781bc","Type":"ContainerStarted","Data":"27f8ed051508e4a2b487b45d1ea8a8c1f079961cc586fa4477ca4c57955eacf4"} Feb 02 11:34:49 crc kubenswrapper[4901]: I0202 11:34:49.487107 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"dace040f-2ecc-4429-8a62-187c719781bc","Type":"ContainerStarted","Data":"52a12ab497d25afee14f76273934d866a47cd5cadcffe186af735598ea081bd8"} Feb 02 11:34:49 crc kubenswrapper[4901]: I0202 11:34:49.525204 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=17.525170663 podStartE2EDuration="17.525170663s" podCreationTimestamp="2026-02-02 11:34:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 11:34:49.515065998 +0000 UTC m=+3376.533406134" watchObservedRunningTime="2026-02-02 11:34:49.525170663 +0000 UTC m=+3376.543510759" Feb 02 11:34:52 crc kubenswrapper[4901]: I0202 11:34:52.535917 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 02 11:34:52 crc kubenswrapper[4901]: I0202 11:34:52.677965 4901 scope.go:117] "RemoveContainer" containerID="57fa2da00216833a4d6895d268a4adace1de13c875f7256b0e6997572d9f7c2a" Feb 02 11:34:52 crc kubenswrapper[4901]: E0202 11:34:52.678645 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:35:02 crc kubenswrapper[4901]: I0202 11:35:02.534865 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 02 11:35:02 crc kubenswrapper[4901]: I0202 11:35:02.541662 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 02 11:35:02 crc kubenswrapper[4901]: I0202 11:35:02.940873 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 02 11:35:07 crc kubenswrapper[4901]: I0202 11:35:07.678264 4901 scope.go:117] "RemoveContainer" containerID="57fa2da00216833a4d6895d268a4adace1de13c875f7256b0e6997572d9f7c2a" Feb 02 11:35:07 crc kubenswrapper[4901]: E0202 11:35:07.679396 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:35:19 crc kubenswrapper[4901]: I0202 11:35:19.676494 4901 scope.go:117] "RemoveContainer" containerID="57fa2da00216833a4d6895d268a4adace1de13c875f7256b0e6997572d9f7c2a" Feb 02 11:35:19 crc kubenswrapper[4901]: E0202 11:35:19.677427 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:35:31 crc kubenswrapper[4901]: I0202 11:35:31.677541 4901 scope.go:117] "RemoveContainer" containerID="57fa2da00216833a4d6895d268a4adace1de13c875f7256b0e6997572d9f7c2a" Feb 02 11:35:31 crc kubenswrapper[4901]: E0202 11:35:31.678677 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:35:42 crc kubenswrapper[4901]: I0202 11:35:42.677363 4901 scope.go:117] "RemoveContainer" containerID="57fa2da00216833a4d6895d268a4adace1de13c875f7256b0e6997572d9f7c2a" Feb 02 11:35:43 crc kubenswrapper[4901]: I0202 11:35:43.374491 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" event={"ID":"756c113d-5d5e-424e-bdf5-494b7774def6","Type":"ContainerStarted","Data":"77ae632c1dc9582b47d9b1c6f2b3855d4f1bb67c07c8c24b776bdf2334353dd1"} Feb 02 11:36:04 crc kubenswrapper[4901]: I0202 11:36:04.061853 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-2dm7v"] Feb 02 11:36:04 crc kubenswrapper[4901]: I0202 11:36:04.073627 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-2dm7v"] Feb 02 11:36:05 crc kubenswrapper[4901]: I0202 11:36:05.035641 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-5f08-account-create-update-vqqf2"] Feb 02 11:36:05 crc kubenswrapper[4901]: I0202 11:36:05.043927 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-5f08-account-create-update-vqqf2"] Feb 02 11:36:05 crc kubenswrapper[4901]: I0202 11:36:05.691253 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c9b075b-311e-4ff0-a0c3-24e65adb3cf7" path="/var/lib/kubelet/pods/0c9b075b-311e-4ff0-a0c3-24e65adb3cf7/volumes" Feb 02 11:36:05 crc kubenswrapper[4901]: I0202 11:36:05.692218 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8b241db-5cd0-4121-a48e-64875cfcf4f0" path="/var/lib/kubelet/pods/c8b241db-5cd0-4121-a48e-64875cfcf4f0/volumes" Feb 02 11:36:50 crc kubenswrapper[4901]: I0202 11:36:50.360416 4901 scope.go:117] "RemoveContainer" containerID="7b735cb965c2c2a302d6cffa9926c02680df87dd790adce766a7bb7b60391d36" Feb 02 11:36:50 crc kubenswrapper[4901]: I0202 11:36:50.397973 4901 scope.go:117] "RemoveContainer" 
containerID="31b5639577ae218f63ce0e432b979a4f8045d9ecbc0f7cb5c873fc7114c15bcb" Feb 02 11:37:01 crc kubenswrapper[4901]: I0202 11:37:01.842403 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-v55kp"] Feb 02 11:37:01 crc kubenswrapper[4901]: I0202 11:37:01.846469 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v55kp" Feb 02 11:37:01 crc kubenswrapper[4901]: I0202 11:37:01.854248 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v55kp"] Feb 02 11:37:01 crc kubenswrapper[4901]: I0202 11:37:01.967688 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/001976d2-4dac-42f0-b6d0-2988bccf5847-utilities\") pod \"redhat-marketplace-v55kp\" (UID: \"001976d2-4dac-42f0-b6d0-2988bccf5847\") " pod="openshift-marketplace/redhat-marketplace-v55kp" Feb 02 11:37:01 crc kubenswrapper[4901]: I0202 11:37:01.967765 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6c2g\" (UniqueName: \"kubernetes.io/projected/001976d2-4dac-42f0-b6d0-2988bccf5847-kube-api-access-p6c2g\") pod \"redhat-marketplace-v55kp\" (UID: \"001976d2-4dac-42f0-b6d0-2988bccf5847\") " pod="openshift-marketplace/redhat-marketplace-v55kp" Feb 02 11:37:01 crc kubenswrapper[4901]: I0202 11:37:01.967790 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/001976d2-4dac-42f0-b6d0-2988bccf5847-catalog-content\") pod \"redhat-marketplace-v55kp\" (UID: \"001976d2-4dac-42f0-b6d0-2988bccf5847\") " pod="openshift-marketplace/redhat-marketplace-v55kp" Feb 02 11:37:02 crc kubenswrapper[4901]: I0202 11:37:02.070601 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/001976d2-4dac-42f0-b6d0-2988bccf5847-catalog-content\") pod \"redhat-marketplace-v55kp\" (UID: \"001976d2-4dac-42f0-b6d0-2988bccf5847\") " pod="openshift-marketplace/redhat-marketplace-v55kp" Feb 02 11:37:02 crc kubenswrapper[4901]: I0202 11:37:02.071177 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/001976d2-4dac-42f0-b6d0-2988bccf5847-catalog-content\") pod \"redhat-marketplace-v55kp\" (UID: \"001976d2-4dac-42f0-b6d0-2988bccf5847\") " pod="openshift-marketplace/redhat-marketplace-v55kp" Feb 02 11:37:02 crc kubenswrapper[4901]: I0202 11:37:02.071278 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/001976d2-4dac-42f0-b6d0-2988bccf5847-utilities\") pod \"redhat-marketplace-v55kp\" (UID: \"001976d2-4dac-42f0-b6d0-2988bccf5847\") " pod="openshift-marketplace/redhat-marketplace-v55kp" Feb 02 11:37:02 crc kubenswrapper[4901]: I0202 11:37:02.071406 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6c2g\" (UniqueName: \"kubernetes.io/projected/001976d2-4dac-42f0-b6d0-2988bccf5847-kube-api-access-p6c2g\") pod \"redhat-marketplace-v55kp\" (UID: \"001976d2-4dac-42f0-b6d0-2988bccf5847\") " pod="openshift-marketplace/redhat-marketplace-v55kp" Feb 02 11:37:02 crc kubenswrapper[4901]: I0202 11:37:02.071750 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/001976d2-4dac-42f0-b6d0-2988bccf5847-utilities\") pod \"redhat-marketplace-v55kp\" (UID: \"001976d2-4dac-42f0-b6d0-2988bccf5847\") " pod="openshift-marketplace/redhat-marketplace-v55kp" Feb 02 11:37:02 crc kubenswrapper[4901]: I0202 11:37:02.099129 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6c2g\" (UniqueName: \"kubernetes.io/projected/001976d2-4dac-42f0-b6d0-2988bccf5847-kube-api-access-p6c2g\") pod \"redhat-marketplace-v55kp\" (UID: \"001976d2-4dac-42f0-b6d0-2988bccf5847\") " pod="openshift-marketplace/redhat-marketplace-v55kp" Feb 02 11:37:02 crc kubenswrapper[4901]: I0202 11:37:02.201082 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v55kp" Feb 02 11:37:02 crc kubenswrapper[4901]: I0202 11:37:02.559240 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v55kp"] Feb 02 11:37:03 crc kubenswrapper[4901]: I0202 11:37:03.260009 4901 generic.go:334] "Generic (PLEG): container finished" podID="001976d2-4dac-42f0-b6d0-2988bccf5847" containerID="540d476d6b833ac2c573558a5b1979c9e1686ead704b86009b639868a955b97e" exitCode=0 Feb 02 11:37:03 crc kubenswrapper[4901]: I0202 11:37:03.260113 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v55kp" event={"ID":"001976d2-4dac-42f0-b6d0-2988bccf5847","Type":"ContainerDied","Data":"540d476d6b833ac2c573558a5b1979c9e1686ead704b86009b639868a955b97e"} Feb 02 11:37:03 crc kubenswrapper[4901]: I0202 11:37:03.260511 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v55kp" event={"ID":"001976d2-4dac-42f0-b6d0-2988bccf5847","Type":"ContainerStarted","Data":"0060b14dce3da17919b44e57c978c19b0ab5ed129e6340e459d04247a28bc69d"} Feb 02 11:37:03 crc kubenswrapper[4901]: I0202 11:37:03.262235 4901 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 11:37:04 crc kubenswrapper[4901]: I0202 11:37:04.272549 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v55kp" event={"ID":"001976d2-4dac-42f0-b6d0-2988bccf5847","Type":"ContainerStarted","Data":"001b22968e22dcb200ca0df7527edfc8881ca4645b89aaa5199a12519f5fff2f"} Feb 02 11:37:05 crc kubenswrapper[4901]: I0202 11:37:05.018736 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hvglx"] Feb 02 11:37:05 crc kubenswrapper[4901]: I0202 11:37:05.021373 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hvglx" Feb 02 11:37:05 crc kubenswrapper[4901]: I0202 11:37:05.035714 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hvglx"] Feb 02 11:37:05 crc kubenswrapper[4901]: I0202 11:37:05.144512 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbksd\" (UniqueName: \"kubernetes.io/projected/b18319a0-b6c9-4336-a6b2-3c5b45e5852d-kube-api-access-sbksd\") pod \"certified-operators-hvglx\" (UID: \"b18319a0-b6c9-4336-a6b2-3c5b45e5852d\") " pod="openshift-marketplace/certified-operators-hvglx" Feb 02 11:37:05 crc kubenswrapper[4901]: I0202 11:37:05.144679 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b18319a0-b6c9-4336-a6b2-3c5b45e5852d-catalog-content\") pod \"certified-operators-hvglx\" (UID: \"b18319a0-b6c9-4336-a6b2-3c5b45e5852d\") " pod="openshift-marketplace/certified-operators-hvglx" Feb 02 11:37:05 crc kubenswrapper[4901]: I0202 11:37:05.144721 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b18319a0-b6c9-4336-a6b2-3c5b45e5852d-utilities\") pod \"certified-operators-hvglx\" (UID: \"b18319a0-b6c9-4336-a6b2-3c5b45e5852d\") " pod="openshift-marketplace/certified-operators-hvglx" Feb 02 11:37:05 crc kubenswrapper[4901]: I0202 11:37:05.247129 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b18319a0-b6c9-4336-a6b2-3c5b45e5852d-catalog-content\") pod \"certified-operators-hvglx\" (UID: \"b18319a0-b6c9-4336-a6b2-3c5b45e5852d\") " pod="openshift-marketplace/certified-operators-hvglx" Feb 02 11:37:05 crc kubenswrapper[4901]: I0202 11:37:05.247209 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b18319a0-b6c9-4336-a6b2-3c5b45e5852d-utilities\") pod \"certified-operators-hvglx\" (UID: \"b18319a0-b6c9-4336-a6b2-3c5b45e5852d\") " pod="openshift-marketplace/certified-operators-hvglx" Feb 02 11:37:05 crc kubenswrapper[4901]: I0202 11:37:05.247331 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbksd\" (UniqueName: \"kubernetes.io/projected/b18319a0-b6c9-4336-a6b2-3c5b45e5852d-kube-api-access-sbksd\") pod \"certified-operators-hvglx\" (UID: \"b18319a0-b6c9-4336-a6b2-3c5b45e5852d\") " pod="openshift-marketplace/certified-operators-hvglx" Feb 02 11:37:05 crc kubenswrapper[4901]: I0202 11:37:05.248425 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b18319a0-b6c9-4336-a6b2-3c5b45e5852d-catalog-content\") pod \"certified-operators-hvglx\" (UID: \"b18319a0-b6c9-4336-a6b2-3c5b45e5852d\") " pod="openshift-marketplace/certified-operators-hvglx" Feb 02 11:37:05 crc kubenswrapper[4901]: I0202 11:37:05.248708 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b18319a0-b6c9-4336-a6b2-3c5b45e5852d-utilities\") pod \"certified-operators-hvglx\" (UID: \"b18319a0-b6c9-4336-a6b2-3c5b45e5852d\") " pod="openshift-marketplace/certified-operators-hvglx" Feb 02 11:37:05 crc kubenswrapper[4901]: I0202 11:37:05.275105 4901 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-sbksd\" (UniqueName: \"kubernetes.io/projected/b18319a0-b6c9-4336-a6b2-3c5b45e5852d-kube-api-access-sbksd\") pod \"certified-operators-hvglx\" (UID: \"b18319a0-b6c9-4336-a6b2-3c5b45e5852d\") " pod="openshift-marketplace/certified-operators-hvglx" Feb 02 11:37:05 crc kubenswrapper[4901]: I0202 11:37:05.286564 4901 generic.go:334] "Generic (PLEG): container finished" podID="001976d2-4dac-42f0-b6d0-2988bccf5847" containerID="001b22968e22dcb200ca0df7527edfc8881ca4645b89aaa5199a12519f5fff2f" exitCode=0 Feb 02 11:37:05 crc kubenswrapper[4901]: I0202 11:37:05.286743 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v55kp" event={"ID":"001976d2-4dac-42f0-b6d0-2988bccf5847","Type":"ContainerDied","Data":"001b22968e22dcb200ca0df7527edfc8881ca4645b89aaa5199a12519f5fff2f"} Feb 02 11:37:05 crc kubenswrapper[4901]: I0202 11:37:05.344217 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hvglx" Feb 02 11:37:05 crc kubenswrapper[4901]: I0202 11:37:05.845741 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hvglx"] Feb 02 11:37:05 crc kubenswrapper[4901]: W0202 11:37:05.852807 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb18319a0_b6c9_4336_a6b2_3c5b45e5852d.slice/crio-82d39888859b905812135746d25df31737e82e4ccee9df7c44f312efd052e304 WatchSource:0}: Error finding container 82d39888859b905812135746d25df31737e82e4ccee9df7c44f312efd052e304: Status 404 returned error can't find the container with id 82d39888859b905812135746d25df31737e82e4ccee9df7c44f312efd052e304 Feb 02 11:37:06 crc kubenswrapper[4901]: I0202 11:37:06.301342 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v55kp" event={"ID":"001976d2-4dac-42f0-b6d0-2988bccf5847","Type":"ContainerStarted","Data":"4073d58b50fb8317622471fa3b0f69508ea19c0163fe845765eaea7f8ef660bf"} Feb 02 11:37:06 crc kubenswrapper[4901]: I0202 11:37:06.304013 4901 generic.go:334] "Generic (PLEG): container finished" podID="b18319a0-b6c9-4336-a6b2-3c5b45e5852d" containerID="9062cf09d1d5228a5a70a9f42f328d3b39b85d6f3a246567ccbbfd2025810507" exitCode=0 Feb 02 11:37:06 crc kubenswrapper[4901]: I0202 11:37:06.304106 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hvglx" event={"ID":"b18319a0-b6c9-4336-a6b2-3c5b45e5852d","Type":"ContainerDied","Data":"9062cf09d1d5228a5a70a9f42f328d3b39b85d6f3a246567ccbbfd2025810507"} Feb 02 11:37:06 crc kubenswrapper[4901]: I0202 11:37:06.304187 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hvglx" event={"ID":"b18319a0-b6c9-4336-a6b2-3c5b45e5852d","Type":"ContainerStarted","Data":"82d39888859b905812135746d25df31737e82e4ccee9df7c44f312efd052e304"} Feb 02 11:37:06 crc kubenswrapper[4901]: I0202 11:37:06.331562 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-v55kp" podStartSLOduration=2.836986061 podStartE2EDuration="5.331531324s" podCreationTimestamp="2026-02-02 11:37:01 +0000 UTC" firstStartedPulling="2026-02-02 11:37:03.262015554 +0000 UTC m=+3510.280355640" lastFinishedPulling="2026-02-02 11:37:05.756560807 +0000 UTC m=+3512.774900903" observedRunningTime="2026-02-02 11:37:06.327864374 +0000 UTC 
m=+3513.346204470" watchObservedRunningTime="2026-02-02 11:37:06.331531324 +0000 UTC m=+3513.349871440" Feb 02 11:37:07 crc kubenswrapper[4901]: I0202 11:37:07.328459 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hvglx" event={"ID":"b18319a0-b6c9-4336-a6b2-3c5b45e5852d","Type":"ContainerStarted","Data":"151ee755c42ef5a07de41e6a0158140dc5789433917a188d7b9585188383ab7e"} Feb 02 11:37:08 crc kubenswrapper[4901]: I0202 11:37:08.345469 4901 generic.go:334] "Generic (PLEG): container finished" podID="b18319a0-b6c9-4336-a6b2-3c5b45e5852d" containerID="151ee755c42ef5a07de41e6a0158140dc5789433917a188d7b9585188383ab7e" exitCode=0 Feb 02 11:37:08 crc kubenswrapper[4901]: I0202 11:37:08.345539 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hvglx" event={"ID":"b18319a0-b6c9-4336-a6b2-3c5b45e5852d","Type":"ContainerDied","Data":"151ee755c42ef5a07de41e6a0158140dc5789433917a188d7b9585188383ab7e"} Feb 02 11:37:09 crc kubenswrapper[4901]: I0202 11:37:09.389817 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hvglx" event={"ID":"b18319a0-b6c9-4336-a6b2-3c5b45e5852d","Type":"ContainerStarted","Data":"6230c1105ffdf3b58648b7e09b6973c9feabcffcc786a499bc628f173421ba4c"} Feb 02 11:37:09 crc kubenswrapper[4901]: I0202 11:37:09.425281 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hvglx" podStartSLOduration=2.946682816 podStartE2EDuration="5.42525866s" podCreationTimestamp="2026-02-02 11:37:04 +0000 UTC" firstStartedPulling="2026-02-02 11:37:06.306735692 +0000 UTC m=+3513.325075788" lastFinishedPulling="2026-02-02 11:37:08.785311536 +0000 UTC m=+3515.803651632" observedRunningTime="2026-02-02 11:37:09.419884019 +0000 UTC m=+3516.438224115" watchObservedRunningTime="2026-02-02 11:37:09.42525866 +0000 UTC m=+3516.443598756" Feb 02 11:37:12 crc kubenswrapper[4901]: I0202 11:37:12.201454 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-v55kp" Feb 02 11:37:12 crc kubenswrapper[4901]: I0202 11:37:12.201996 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-v55kp" Feb 02 11:37:12 crc kubenswrapper[4901]: I0202 11:37:12.253779 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-v55kp" Feb 02 11:37:12 crc kubenswrapper[4901]: I0202 11:37:12.476005 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-v55kp" Feb 02 11:37:12 crc kubenswrapper[4901]: I0202 11:37:12.814238 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v55kp"] Feb 02 11:37:14 crc kubenswrapper[4901]: I0202 11:37:14.442531 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-v55kp" podUID="001976d2-4dac-42f0-b6d0-2988bccf5847" containerName="registry-server" containerID="cri-o://4073d58b50fb8317622471fa3b0f69508ea19c0163fe845765eaea7f8ef660bf" gracePeriod=2 Feb 02 11:37:14 crc kubenswrapper[4901]: I0202 11:37:14.961195 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v55kp" Feb 02 11:37:15 crc kubenswrapper[4901]: I0202 11:37:15.046913 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/001976d2-4dac-42f0-b6d0-2988bccf5847-utilities\") pod \"001976d2-4dac-42f0-b6d0-2988bccf5847\" (UID: \"001976d2-4dac-42f0-b6d0-2988bccf5847\") " Feb 02 11:37:15 crc kubenswrapper[4901]: I0202 11:37:15.047099 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/001976d2-4dac-42f0-b6d0-2988bccf5847-catalog-content\") pod \"001976d2-4dac-42f0-b6d0-2988bccf5847\" (UID: \"001976d2-4dac-42f0-b6d0-2988bccf5847\") " Feb 02 11:37:15 crc kubenswrapper[4901]: I0202 11:37:15.047385 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6c2g\" (UniqueName: \"kubernetes.io/projected/001976d2-4dac-42f0-b6d0-2988bccf5847-kube-api-access-p6c2g\") pod \"001976d2-4dac-42f0-b6d0-2988bccf5847\" (UID: \"001976d2-4dac-42f0-b6d0-2988bccf5847\") " Feb 02 11:37:15 crc kubenswrapper[4901]: I0202 11:37:15.050386 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/001976d2-4dac-42f0-b6d0-2988bccf5847-utilities" (OuterVolumeSpecName: "utilities") pod "001976d2-4dac-42f0-b6d0-2988bccf5847" (UID: "001976d2-4dac-42f0-b6d0-2988bccf5847"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:37:15 crc kubenswrapper[4901]: I0202 11:37:15.057145 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/001976d2-4dac-42f0-b6d0-2988bccf5847-kube-api-access-p6c2g" (OuterVolumeSpecName: "kube-api-access-p6c2g") pod "001976d2-4dac-42f0-b6d0-2988bccf5847" (UID: "001976d2-4dac-42f0-b6d0-2988bccf5847"). InnerVolumeSpecName "kube-api-access-p6c2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:37:15 crc kubenswrapper[4901]: I0202 11:37:15.080374 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/001976d2-4dac-42f0-b6d0-2988bccf5847-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "001976d2-4dac-42f0-b6d0-2988bccf5847" (UID: "001976d2-4dac-42f0-b6d0-2988bccf5847"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:37:15 crc kubenswrapper[4901]: I0202 11:37:15.150679 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/001976d2-4dac-42f0-b6d0-2988bccf5847-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:37:15 crc kubenswrapper[4901]: I0202 11:37:15.150993 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/001976d2-4dac-42f0-b6d0-2988bccf5847-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:37:15 crc kubenswrapper[4901]: I0202 11:37:15.151013 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6c2g\" (UniqueName: \"kubernetes.io/projected/001976d2-4dac-42f0-b6d0-2988bccf5847-kube-api-access-p6c2g\") on node \"crc\" DevicePath \"\"" Feb 02 11:37:15 crc kubenswrapper[4901]: I0202 11:37:15.344579 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hvglx" Feb 02 11:37:15 crc kubenswrapper[4901]: I0202 11:37:15.344758 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hvglx" Feb 02 11:37:15 crc kubenswrapper[4901]: I0202 11:37:15.400062 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hvglx" Feb 02 11:37:15 crc kubenswrapper[4901]: I0202 11:37:15.455922 4901 generic.go:334] "Generic (PLEG): container finished" podID="001976d2-4dac-42f0-b6d0-2988bccf5847" containerID="4073d58b50fb8317622471fa3b0f69508ea19c0163fe845765eaea7f8ef660bf" exitCode=0 Feb 02 11:37:15 crc kubenswrapper[4901]: I0202 11:37:15.456018 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v55kp" Feb 02 11:37:15 crc kubenswrapper[4901]: I0202 11:37:15.456050 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v55kp" event={"ID":"001976d2-4dac-42f0-b6d0-2988bccf5847","Type":"ContainerDied","Data":"4073d58b50fb8317622471fa3b0f69508ea19c0163fe845765eaea7f8ef660bf"} Feb 02 11:37:15 crc kubenswrapper[4901]: I0202 11:37:15.456115 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v55kp" event={"ID":"001976d2-4dac-42f0-b6d0-2988bccf5847","Type":"ContainerDied","Data":"0060b14dce3da17919b44e57c978c19b0ab5ed129e6340e459d04247a28bc69d"} Feb 02 11:37:15 crc kubenswrapper[4901]: I0202 11:37:15.456137 4901 scope.go:117] "RemoveContainer" containerID="4073d58b50fb8317622471fa3b0f69508ea19c0163fe845765eaea7f8ef660bf" Feb 02 11:37:15 crc kubenswrapper[4901]: I0202 11:37:15.489815 4901 scope.go:117] "RemoveContainer" containerID="001b22968e22dcb200ca0df7527edfc8881ca4645b89aaa5199a12519f5fff2f" Feb 02 11:37:15 crc kubenswrapper[4901]: I0202 11:37:15.504055 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v55kp"] Feb 02 11:37:15 crc kubenswrapper[4901]: I0202 11:37:15.513060 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hvglx" Feb 02 11:37:15 crc kubenswrapper[4901]: I0202 11:37:15.520385 4901 scope.go:117] "RemoveContainer" containerID="540d476d6b833ac2c573558a5b1979c9e1686ead704b86009b639868a955b97e" Feb 02 11:37:15 crc kubenswrapper[4901]: I0202 11:37:15.526697 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-v55kp"] Feb 02 11:37:15 crc kubenswrapper[4901]: I0202 11:37:15.578819 4901 scope.go:117] "RemoveContainer" containerID="4073d58b50fb8317622471fa3b0f69508ea19c0163fe845765eaea7f8ef660bf" Feb 02 11:37:15 crc kubenswrapper[4901]: E0202 11:37:15.579533 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4073d58b50fb8317622471fa3b0f69508ea19c0163fe845765eaea7f8ef660bf\": container with ID starting with 4073d58b50fb8317622471fa3b0f69508ea19c0163fe845765eaea7f8ef660bf not found: ID does not exist" containerID="4073d58b50fb8317622471fa3b0f69508ea19c0163fe845765eaea7f8ef660bf" Feb 02 11:37:15 crc kubenswrapper[4901]: I0202 11:37:15.579607 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4073d58b50fb8317622471fa3b0f69508ea19c0163fe845765eaea7f8ef660bf"} err="failed to get container status \"4073d58b50fb8317622471fa3b0f69508ea19c0163fe845765eaea7f8ef660bf\": rpc error: code = NotFound desc = could not find container \"4073d58b50fb8317622471fa3b0f69508ea19c0163fe845765eaea7f8ef660bf\": container with ID starting with 4073d58b50fb8317622471fa3b0f69508ea19c0163fe845765eaea7f8ef660bf not found: ID does not exist" Feb 02 11:37:15 crc kubenswrapper[4901]: I0202 11:37:15.579640 4901 scope.go:117] "RemoveContainer" containerID="001b22968e22dcb200ca0df7527edfc8881ca4645b89aaa5199a12519f5fff2f" Feb 02 11:37:15 crc kubenswrapper[4901]: E0202 11:37:15.580187 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"001b22968e22dcb200ca0df7527edfc8881ca4645b89aaa5199a12519f5fff2f\": container with ID starting with 
001b22968e22dcb200ca0df7527edfc8881ca4645b89aaa5199a12519f5fff2f not found: ID does not exist" containerID="001b22968e22dcb200ca0df7527edfc8881ca4645b89aaa5199a12519f5fff2f" Feb 02 11:37:15 crc kubenswrapper[4901]: I0202 11:37:15.580223 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"001b22968e22dcb200ca0df7527edfc8881ca4645b89aaa5199a12519f5fff2f"} err="failed to get container status \"001b22968e22dcb200ca0df7527edfc8881ca4645b89aaa5199a12519f5fff2f\": rpc error: code = NotFound desc = could not find container \"001b22968e22dcb200ca0df7527edfc8881ca4645b89aaa5199a12519f5fff2f\": container with ID starting with 001b22968e22dcb200ca0df7527edfc8881ca4645b89aaa5199a12519f5fff2f not found: ID does not exist" Feb 02 11:37:15 crc kubenswrapper[4901]: I0202 11:37:15.580242 4901 scope.go:117] "RemoveContainer" containerID="540d476d6b833ac2c573558a5b1979c9e1686ead704b86009b639868a955b97e" Feb 02 11:37:15 crc kubenswrapper[4901]: E0202 11:37:15.580625 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"540d476d6b833ac2c573558a5b1979c9e1686ead704b86009b639868a955b97e\": container with ID starting with 540d476d6b833ac2c573558a5b1979c9e1686ead704b86009b639868a955b97e not found: ID does not exist" containerID="540d476d6b833ac2c573558a5b1979c9e1686ead704b86009b639868a955b97e" Feb 02 11:37:15 crc kubenswrapper[4901]: I0202 11:37:15.580675 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"540d476d6b833ac2c573558a5b1979c9e1686ead704b86009b639868a955b97e"} err="failed to get container status \"540d476d6b833ac2c573558a5b1979c9e1686ead704b86009b639868a955b97e\": rpc error: code = NotFound desc = could not find container \"540d476d6b833ac2c573558a5b1979c9e1686ead704b86009b639868a955b97e\": container with ID starting with 540d476d6b833ac2c573558a5b1979c9e1686ead704b86009b639868a955b97e not found: ID does not exist" Feb 02 11:37:15 crc kubenswrapper[4901]: I0202 11:37:15.692451 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="001976d2-4dac-42f0-b6d0-2988bccf5847" path="/var/lib/kubelet/pods/001976d2-4dac-42f0-b6d0-2988bccf5847/volumes" Feb 02 11:37:17 crc kubenswrapper[4901]: I0202 11:37:17.814160 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hvglx"] Feb 02 11:37:18 crc kubenswrapper[4901]: I0202 11:37:18.492016 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hvglx" podUID="b18319a0-b6c9-4336-a6b2-3c5b45e5852d" containerName="registry-server" containerID="cri-o://6230c1105ffdf3b58648b7e09b6973c9feabcffcc786a499bc628f173421ba4c" gracePeriod=2 Feb 02 11:37:18 crc kubenswrapper[4901]: E0202 11:37:18.529012 4901 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb18319a0_b6c9_4336_a6b2_3c5b45e5852d.slice/crio-6230c1105ffdf3b58648b7e09b6973c9feabcffcc786a499bc628f173421ba4c.scope\": RecentStats: unable to find data in memory cache]" Feb 02 11:37:18 crc kubenswrapper[4901]: I0202 11:37:18.996232 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hvglx" Feb 02 11:37:19 crc kubenswrapper[4901]: I0202 11:37:19.152213 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbksd\" (UniqueName: \"kubernetes.io/projected/b18319a0-b6c9-4336-a6b2-3c5b45e5852d-kube-api-access-sbksd\") pod \"b18319a0-b6c9-4336-a6b2-3c5b45e5852d\" (UID: \"b18319a0-b6c9-4336-a6b2-3c5b45e5852d\") " Feb 02 11:37:19 crc kubenswrapper[4901]: I0202 11:37:19.152422 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b18319a0-b6c9-4336-a6b2-3c5b45e5852d-utilities\") pod \"b18319a0-b6c9-4336-a6b2-3c5b45e5852d\" (UID: \"b18319a0-b6c9-4336-a6b2-3c5b45e5852d\") " Feb 02 11:37:19 crc kubenswrapper[4901]: I0202 11:37:19.152461 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b18319a0-b6c9-4336-a6b2-3c5b45e5852d-catalog-content\") pod \"b18319a0-b6c9-4336-a6b2-3c5b45e5852d\" (UID: \"b18319a0-b6c9-4336-a6b2-3c5b45e5852d\") " Feb 02 11:37:19 crc kubenswrapper[4901]: I0202 11:37:19.154995 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b18319a0-b6c9-4336-a6b2-3c5b45e5852d-utilities" (OuterVolumeSpecName: "utilities") pod "b18319a0-b6c9-4336-a6b2-3c5b45e5852d" (UID: "b18319a0-b6c9-4336-a6b2-3c5b45e5852d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:37:19 crc kubenswrapper[4901]: I0202 11:37:19.172010 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b18319a0-b6c9-4336-a6b2-3c5b45e5852d-kube-api-access-sbksd" (OuterVolumeSpecName: "kube-api-access-sbksd") pod "b18319a0-b6c9-4336-a6b2-3c5b45e5852d" (UID: "b18319a0-b6c9-4336-a6b2-3c5b45e5852d"). InnerVolumeSpecName "kube-api-access-sbksd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:37:19 crc kubenswrapper[4901]: I0202 11:37:19.206280 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b18319a0-b6c9-4336-a6b2-3c5b45e5852d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b18319a0-b6c9-4336-a6b2-3c5b45e5852d" (UID: "b18319a0-b6c9-4336-a6b2-3c5b45e5852d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:37:19 crc kubenswrapper[4901]: I0202 11:37:19.255643 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbksd\" (UniqueName: \"kubernetes.io/projected/b18319a0-b6c9-4336-a6b2-3c5b45e5852d-kube-api-access-sbksd\") on node \"crc\" DevicePath \"\"" Feb 02 11:37:19 crc kubenswrapper[4901]: I0202 11:37:19.256102 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b18319a0-b6c9-4336-a6b2-3c5b45e5852d-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:37:19 crc kubenswrapper[4901]: I0202 11:37:19.256181 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b18319a0-b6c9-4336-a6b2-3c5b45e5852d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:37:19 crc kubenswrapper[4901]: I0202 11:37:19.510297 4901 generic.go:334] "Generic (PLEG): container finished" podID="b18319a0-b6c9-4336-a6b2-3c5b45e5852d" containerID="6230c1105ffdf3b58648b7e09b6973c9feabcffcc786a499bc628f173421ba4c" exitCode=0 Feb 02 11:37:19 crc kubenswrapper[4901]: I0202 11:37:19.510357 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hvglx" event={"ID":"b18319a0-b6c9-4336-a6b2-3c5b45e5852d","Type":"ContainerDied","Data":"6230c1105ffdf3b58648b7e09b6973c9feabcffcc786a499bc628f173421ba4c"} Feb 02 11:37:19 crc kubenswrapper[4901]: I0202 11:37:19.510403 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hvglx" event={"ID":"b18319a0-b6c9-4336-a6b2-3c5b45e5852d","Type":"ContainerDied","Data":"82d39888859b905812135746d25df31737e82e4ccee9df7c44f312efd052e304"} Feb 02 11:37:19 crc kubenswrapper[4901]: I0202 11:37:19.510425 4901 scope.go:117] "RemoveContainer" containerID="6230c1105ffdf3b58648b7e09b6973c9feabcffcc786a499bc628f173421ba4c" Feb 02 11:37:19 crc kubenswrapper[4901]: I0202 11:37:19.511826 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hvglx" Feb 02 11:37:19 crc kubenswrapper[4901]: I0202 11:37:19.539431 4901 scope.go:117] "RemoveContainer" containerID="151ee755c42ef5a07de41e6a0158140dc5789433917a188d7b9585188383ab7e" Feb 02 11:37:19 crc kubenswrapper[4901]: I0202 11:37:19.576145 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hvglx"] Feb 02 11:37:19 crc kubenswrapper[4901]: I0202 11:37:19.592597 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hvglx"] Feb 02 11:37:19 crc kubenswrapper[4901]: I0202 11:37:19.595458 4901 scope.go:117] "RemoveContainer" containerID="9062cf09d1d5228a5a70a9f42f328d3b39b85d6f3a246567ccbbfd2025810507" Feb 02 11:37:19 crc kubenswrapper[4901]: I0202 11:37:19.639950 4901 scope.go:117] "RemoveContainer" containerID="6230c1105ffdf3b58648b7e09b6973c9feabcffcc786a499bc628f173421ba4c" Feb 02 11:37:19 crc kubenswrapper[4901]: E0202 11:37:19.640816 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6230c1105ffdf3b58648b7e09b6973c9feabcffcc786a499bc628f173421ba4c\": container with ID starting with 6230c1105ffdf3b58648b7e09b6973c9feabcffcc786a499bc628f173421ba4c not found: ID does not exist" containerID="6230c1105ffdf3b58648b7e09b6973c9feabcffcc786a499bc628f173421ba4c" Feb 02 11:37:19 crc kubenswrapper[4901]: I0202 11:37:19.640882 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6230c1105ffdf3b58648b7e09b6973c9feabcffcc786a499bc628f173421ba4c"} err="failed to get container status \"6230c1105ffdf3b58648b7e09b6973c9feabcffcc786a499bc628f173421ba4c\": rpc error: code = NotFound desc = could not find container \"6230c1105ffdf3b58648b7e09b6973c9feabcffcc786a499bc628f173421ba4c\": container with ID starting with 6230c1105ffdf3b58648b7e09b6973c9feabcffcc786a499bc628f173421ba4c not found: ID does not exist" Feb 02 11:37:19 crc kubenswrapper[4901]: I0202 11:37:19.640919 4901 scope.go:117] "RemoveContainer" containerID="151ee755c42ef5a07de41e6a0158140dc5789433917a188d7b9585188383ab7e" Feb 02 11:37:19 crc kubenswrapper[4901]: E0202 11:37:19.641545 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"151ee755c42ef5a07de41e6a0158140dc5789433917a188d7b9585188383ab7e\": container with ID starting with 151ee755c42ef5a07de41e6a0158140dc5789433917a188d7b9585188383ab7e not found: ID does not exist" containerID="151ee755c42ef5a07de41e6a0158140dc5789433917a188d7b9585188383ab7e" Feb 02 11:37:19 crc kubenswrapper[4901]: I0202 11:37:19.641583 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"151ee755c42ef5a07de41e6a0158140dc5789433917a188d7b9585188383ab7e"} err="failed to get container status \"151ee755c42ef5a07de41e6a0158140dc5789433917a188d7b9585188383ab7e\": rpc error: code = NotFound desc = could not find container \"151ee755c42ef5a07de41e6a0158140dc5789433917a188d7b9585188383ab7e\": container with ID starting with 151ee755c42ef5a07de41e6a0158140dc5789433917a188d7b9585188383ab7e not found: ID does not exist" Feb 02 11:37:19 crc kubenswrapper[4901]: I0202 11:37:19.641599 4901 scope.go:117] "RemoveContainer" containerID="9062cf09d1d5228a5a70a9f42f328d3b39b85d6f3a246567ccbbfd2025810507" Feb 02 11:37:19 crc kubenswrapper[4901]: E0202 11:37:19.641941 4901 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"9062cf09d1d5228a5a70a9f42f328d3b39b85d6f3a246567ccbbfd2025810507\": container with ID starting with 9062cf09d1d5228a5a70a9f42f328d3b39b85d6f3a246567ccbbfd2025810507 not found: ID does not exist" containerID="9062cf09d1d5228a5a70a9f42f328d3b39b85d6f3a246567ccbbfd2025810507" Feb 02 11:37:19 crc kubenswrapper[4901]: I0202 11:37:19.641977 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9062cf09d1d5228a5a70a9f42f328d3b39b85d6f3a246567ccbbfd2025810507"} err="failed to get container status \"9062cf09d1d5228a5a70a9f42f328d3b39b85d6f3a246567ccbbfd2025810507\": rpc error: code = NotFound desc = could not find container \"9062cf09d1d5228a5a70a9f42f328d3b39b85d6f3a246567ccbbfd2025810507\": container with ID starting with 9062cf09d1d5228a5a70a9f42f328d3b39b85d6f3a246567ccbbfd2025810507 not found: ID does not exist" Feb 02 11:37:19 crc kubenswrapper[4901]: I0202 11:37:19.701524 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b18319a0-b6c9-4336-a6b2-3c5b45e5852d" path="/var/lib/kubelet/pods/b18319a0-b6c9-4336-a6b2-3c5b45e5852d/volumes" Feb 02 11:38:07 crc kubenswrapper[4901]: I0202 11:38:07.837360 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:38:07 crc kubenswrapper[4901]: I0202 11:38:07.838124 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:38:28 crc kubenswrapper[4901]: I0202 11:38:28.400021 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-84dbcd4d6-strlk_9c2b3b8f-1088-4696-899e-d0d5c3f2bf2d/manager/0.log" Feb 02 11:38:37 crc kubenswrapper[4901]: I0202 11:38:37.837183 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:38:37 crc kubenswrapper[4901]: I0202 11:38:37.838131 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:39:07 crc kubenswrapper[4901]: I0202 11:39:07.837927 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:39:07 crc kubenswrapper[4901]: I0202 11:39:07.838754 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:39:07 crc kubenswrapper[4901]: I0202 11:39:07.838809 4901 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" Feb 02 11:39:07 crc kubenswrapper[4901]: I0202 11:39:07.839882 4901 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"77ae632c1dc9582b47d9b1c6f2b3855d4f1bb67c07c8c24b776bdf2334353dd1"} pod="openshift-machine-config-operator/machine-config-daemon-f29d8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 11:39:07 crc kubenswrapper[4901]: I0202 11:39:07.839956 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" containerID="cri-o://77ae632c1dc9582b47d9b1c6f2b3855d4f1bb67c07c8c24b776bdf2334353dd1" gracePeriod=600 Feb 02 11:39:09 crc kubenswrapper[4901]: I0202 11:39:08.931505 4901 generic.go:334] "Generic (PLEG): container finished" podID="756c113d-5d5e-424e-bdf5-494b7774def6" containerID="77ae632c1dc9582b47d9b1c6f2b3855d4f1bb67c07c8c24b776bdf2334353dd1" exitCode=0 Feb 02 11:39:09 crc kubenswrapper[4901]: I0202 11:39:08.931640 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" event={"ID":"756c113d-5d5e-424e-bdf5-494b7774def6","Type":"ContainerDied","Data":"77ae632c1dc9582b47d9b1c6f2b3855d4f1bb67c07c8c24b776bdf2334353dd1"} Feb 02 11:39:09 crc kubenswrapper[4901]: I0202 11:39:08.932414 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" event={"ID":"756c113d-5d5e-424e-bdf5-494b7774def6","Type":"ContainerStarted","Data":"5b8ce7c16052cec2485b442e2fe3c9307bb04efecfe3e58c9fae3418e6341435"} Feb 02 11:39:09 crc kubenswrapper[4901]: I0202 11:39:08.932445 4901 scope.go:117] "RemoveContainer" containerID="57fa2da00216833a4d6895d268a4adace1de13c875f7256b0e6997572d9f7c2a" Feb 02 11:40:02 crc kubenswrapper[4901]: I0202 11:40:02.743677 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m4lrm"] Feb 02 11:40:02 crc kubenswrapper[4901]: E0202 11:40:02.745167 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="001976d2-4dac-42f0-b6d0-2988bccf5847" containerName="extract-content" Feb 02 11:40:02 crc kubenswrapper[4901]: I0202 11:40:02.745187 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="001976d2-4dac-42f0-b6d0-2988bccf5847" containerName="extract-content" Feb 02 11:40:02 crc kubenswrapper[4901]: E0202 11:40:02.745202 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b18319a0-b6c9-4336-a6b2-3c5b45e5852d" containerName="registry-server" Feb 02 11:40:02 crc kubenswrapper[4901]: I0202 11:40:02.745210 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="b18319a0-b6c9-4336-a6b2-3c5b45e5852d" containerName="registry-server" Feb 02 11:40:02 crc kubenswrapper[4901]: E0202 11:40:02.745220 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b18319a0-b6c9-4336-a6b2-3c5b45e5852d" containerName="extract-utilities" Feb 02 11:40:02 crc kubenswrapper[4901]: I0202 11:40:02.745227 4901 
state_mem.go:107] "Deleted CPUSet assignment" podUID="b18319a0-b6c9-4336-a6b2-3c5b45e5852d" containerName="extract-utilities"
Feb 02 11:40:02 crc kubenswrapper[4901]: E0202 11:40:02.745258 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b18319a0-b6c9-4336-a6b2-3c5b45e5852d" containerName="extract-content"
Feb 02 11:40:02 crc kubenswrapper[4901]: I0202 11:40:02.745264 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="b18319a0-b6c9-4336-a6b2-3c5b45e5852d" containerName="extract-content"
Feb 02 11:40:02 crc kubenswrapper[4901]: E0202 11:40:02.745288 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="001976d2-4dac-42f0-b6d0-2988bccf5847" containerName="extract-utilities"
Feb 02 11:40:02 crc kubenswrapper[4901]: I0202 11:40:02.745296 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="001976d2-4dac-42f0-b6d0-2988bccf5847" containerName="extract-utilities"
Feb 02 11:40:02 crc kubenswrapper[4901]: E0202 11:40:02.745320 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="001976d2-4dac-42f0-b6d0-2988bccf5847" containerName="registry-server"
Feb 02 11:40:02 crc kubenswrapper[4901]: I0202 11:40:02.745326 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="001976d2-4dac-42f0-b6d0-2988bccf5847" containerName="registry-server"
Feb 02 11:40:02 crc kubenswrapper[4901]: I0202 11:40:02.745577 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="b18319a0-b6c9-4336-a6b2-3c5b45e5852d" containerName="registry-server"
Feb 02 11:40:02 crc kubenswrapper[4901]: I0202 11:40:02.745586 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="001976d2-4dac-42f0-b6d0-2988bccf5847" containerName="registry-server"
Feb 02 11:40:02 crc kubenswrapper[4901]: I0202 11:40:02.747550 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m4lrm"
Feb 02 11:40:02 crc kubenswrapper[4901]: I0202 11:40:02.755402 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m4lrm"]
Feb 02 11:40:02 crc kubenswrapper[4901]: I0202 11:40:02.856469 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzjkb\" (UniqueName: \"kubernetes.io/projected/09f14f4f-0a46-41bb-8413-b45bc2033606-kube-api-access-nzjkb\") pod \"redhat-operators-m4lrm\" (UID: \"09f14f4f-0a46-41bb-8413-b45bc2033606\") " pod="openshift-marketplace/redhat-operators-m4lrm"
Feb 02 11:40:02 crc kubenswrapper[4901]: I0202 11:40:02.856872 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09f14f4f-0a46-41bb-8413-b45bc2033606-catalog-content\") pod \"redhat-operators-m4lrm\" (UID: \"09f14f4f-0a46-41bb-8413-b45bc2033606\") " pod="openshift-marketplace/redhat-operators-m4lrm"
Feb 02 11:40:02 crc kubenswrapper[4901]: I0202 11:40:02.856960 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09f14f4f-0a46-41bb-8413-b45bc2033606-utilities\") pod \"redhat-operators-m4lrm\" (UID: \"09f14f4f-0a46-41bb-8413-b45bc2033606\") " pod="openshift-marketplace/redhat-operators-m4lrm"
Feb 02 11:40:02 crc kubenswrapper[4901]: I0202 11:40:02.958555 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzjkb\" (UniqueName: \"kubernetes.io/projected/09f14f4f-0a46-41bb-8413-b45bc2033606-kube-api-access-nzjkb\") pod \"redhat-operators-m4lrm\" (UID: \"09f14f4f-0a46-41bb-8413-b45bc2033606\") " pod="openshift-marketplace/redhat-operators-m4lrm"
Feb 02 11:40:02 crc kubenswrapper[4901]: I0202 11:40:02.959314 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09f14f4f-0a46-41bb-8413-b45bc2033606-catalog-content\") pod \"redhat-operators-m4lrm\" (UID: \"09f14f4f-0a46-41bb-8413-b45bc2033606\") " pod="openshift-marketplace/redhat-operators-m4lrm"
Feb 02 11:40:02 crc kubenswrapper[4901]: I0202 11:40:02.959360 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09f14f4f-0a46-41bb-8413-b45bc2033606-utilities\") pod \"redhat-operators-m4lrm\" (UID: \"09f14f4f-0a46-41bb-8413-b45bc2033606\") " pod="openshift-marketplace/redhat-operators-m4lrm"
Feb 02 11:40:02 crc kubenswrapper[4901]: I0202 11:40:02.959893 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09f14f4f-0a46-41bb-8413-b45bc2033606-utilities\") pod \"redhat-operators-m4lrm\" (UID: \"09f14f4f-0a46-41bb-8413-b45bc2033606\") " pod="openshift-marketplace/redhat-operators-m4lrm"
Feb 02 11:40:02 crc kubenswrapper[4901]: I0202 11:40:02.959949 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09f14f4f-0a46-41bb-8413-b45bc2033606-catalog-content\") pod \"redhat-operators-m4lrm\" (UID: \"09f14f4f-0a46-41bb-8413-b45bc2033606\") " pod="openshift-marketplace/redhat-operators-m4lrm"
Feb 02 11:40:02 crc kubenswrapper[4901]: I0202 11:40:02.980787 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzjkb\" (UniqueName: \"kubernetes.io/projected/09f14f4f-0a46-41bb-8413-b45bc2033606-kube-api-access-nzjkb\") pod \"redhat-operators-m4lrm\" (UID: \"09f14f4f-0a46-41bb-8413-b45bc2033606\") " pod="openshift-marketplace/redhat-operators-m4lrm"
Feb 02 11:40:03 crc kubenswrapper[4901]: I0202 11:40:03.083722 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m4lrm"
Feb 02 11:40:03 crc kubenswrapper[4901]: I0202 11:40:03.698911 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m4lrm"]
Feb 02 11:40:04 crc kubenswrapper[4901]: I0202 11:40:04.582596 4901 generic.go:334] "Generic (PLEG): container finished" podID="09f14f4f-0a46-41bb-8413-b45bc2033606" containerID="137883c4b49ed86d0a853a728d4bfef9e700a6c7bafdd25a6ba43bcc26493c2a" exitCode=0
Feb 02 11:40:04 crc kubenswrapper[4901]: I0202 11:40:04.582696 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m4lrm" event={"ID":"09f14f4f-0a46-41bb-8413-b45bc2033606","Type":"ContainerDied","Data":"137883c4b49ed86d0a853a728d4bfef9e700a6c7bafdd25a6ba43bcc26493c2a"}
Feb 02 11:40:04 crc kubenswrapper[4901]: I0202 11:40:04.583062 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m4lrm" event={"ID":"09f14f4f-0a46-41bb-8413-b45bc2033606","Type":"ContainerStarted","Data":"de3f23f522c6927755d7caa353da0b5789a5034ab935e68fe48a21edd0fc9730"}
Feb 02 11:40:06 crc kubenswrapper[4901]: I0202 11:40:06.604459 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m4lrm" event={"ID":"09f14f4f-0a46-41bb-8413-b45bc2033606","Type":"ContainerStarted","Data":"43dd45523e78aee28cdd5aa48ad0fba01bd617c9cbc09309bf7e932f3ead0ca5"}
Feb 02 11:40:08 crc kubenswrapper[4901]: I0202 11:40:08.626592 4901 generic.go:334] "Generic (PLEG): container finished" podID="09f14f4f-0a46-41bb-8413-b45bc2033606" containerID="43dd45523e78aee28cdd5aa48ad0fba01bd617c9cbc09309bf7e932f3ead0ca5" exitCode=0
Feb 02 11:40:08 crc kubenswrapper[4901]: I0202 11:40:08.626676 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m4lrm" event={"ID":"09f14f4f-0a46-41bb-8413-b45bc2033606","Type":"ContainerDied","Data":"43dd45523e78aee28cdd5aa48ad0fba01bd617c9cbc09309bf7e932f3ead0ca5"}
Feb 02 11:40:09 crc kubenswrapper[4901]: I0202 11:40:09.641618 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m4lrm" event={"ID":"09f14f4f-0a46-41bb-8413-b45bc2033606","Type":"ContainerStarted","Data":"9c37c4cbac202731d08296a3b8aaa6898e1efbcb646911de8e1484e943b0f26d"}
Feb 02 11:40:09 crc kubenswrapper[4901]: I0202 11:40:09.664614 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-m4lrm" podStartSLOduration=3.227143358 podStartE2EDuration="7.664591222s" podCreationTimestamp="2026-02-02 11:40:02 +0000 UTC" firstStartedPulling="2026-02-02 11:40:04.585070442 +0000 UTC m=+3691.603410538" lastFinishedPulling="2026-02-02 11:40:09.022518306 +0000 UTC m=+3696.040858402" observedRunningTime="2026-02-02 11:40:09.662167663 +0000 UTC m=+3696.680507769" watchObservedRunningTime="2026-02-02 11:40:09.664591222 +0000 UTC m=+3696.682931778"
Feb 02 11:40:13 crc kubenswrapper[4901]: I0202 11:40:13.084674 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m4lrm"
Feb 02 11:40:13 crc kubenswrapper[4901]: I0202 11:40:13.085505 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-m4lrm"
Feb 02 11:40:14 crc kubenswrapper[4901]: I0202 11:40:14.136098 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m4lrm" podUID="09f14f4f-0a46-41bb-8413-b45bc2033606" containerName="registry-server" probeResult="failure" output=<
Feb 02 11:40:14 crc kubenswrapper[4901]: timeout: failed to connect service ":50051" within 1s
Feb 02 11:40:14 crc kubenswrapper[4901]: >
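The probe output above ("timeout: failed to connect service \":50051\" within 1s") is the signature of a gRPC health check timing out before the registry-server has finished loading its catalog; ten seconds later the same probe reports status="started". A rough stand-in in Go for what the probe is testing, assuming the same 1s budget (a bare TCP dial rather than the full gRPC health RPC, and 127.0.0.1 standing in for the pod IP):

```go
package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	// Can anything accept a connection on the registry-server port within 1s?
	conn, err := net.DialTimeout("tcp", "127.0.0.1:50051", 1*time.Second)
	if err != nil {
		// This is the case the kubelet records as probeResult="failure".
		fmt.Println("probe failure:", err)
		return
	}
	conn.Close()
	fmt.Println("probe success")
}
```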
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:40:25 crc kubenswrapper[4901]: I0202 11:40:25.484105 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09f14f4f-0a46-41bb-8413-b45bc2033606-kube-api-access-nzjkb" (OuterVolumeSpecName: "kube-api-access-nzjkb") pod "09f14f4f-0a46-41bb-8413-b45bc2033606" (UID: "09f14f4f-0a46-41bb-8413-b45bc2033606"). InnerVolumeSpecName "kube-api-access-nzjkb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:40:25 crc kubenswrapper[4901]: I0202 11:40:25.576064 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzjkb\" (UniqueName: \"kubernetes.io/projected/09f14f4f-0a46-41bb-8413-b45bc2033606-kube-api-access-nzjkb\") on node \"crc\" DevicePath \"\"" Feb 02 11:40:25 crc kubenswrapper[4901]: I0202 11:40:25.576104 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09f14f4f-0a46-41bb-8413-b45bc2033606-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:40:25 crc kubenswrapper[4901]: I0202 11:40:25.602677 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09f14f4f-0a46-41bb-8413-b45bc2033606-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "09f14f4f-0a46-41bb-8413-b45bc2033606" (UID: "09f14f4f-0a46-41bb-8413-b45bc2033606"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:40:25 crc kubenswrapper[4901]: I0202 11:40:25.677974 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09f14f4f-0a46-41bb-8413-b45bc2033606-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:40:25 crc kubenswrapper[4901]: I0202 11:40:25.820383 4901 generic.go:334] "Generic (PLEG): container finished" podID="09f14f4f-0a46-41bb-8413-b45bc2033606" containerID="9c37c4cbac202731d08296a3b8aaa6898e1efbcb646911de8e1484e943b0f26d" exitCode=0 Feb 02 11:40:25 crc kubenswrapper[4901]: I0202 11:40:25.820460 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m4lrm" Feb 02 11:40:25 crc kubenswrapper[4901]: I0202 11:40:25.820482 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m4lrm" event={"ID":"09f14f4f-0a46-41bb-8413-b45bc2033606","Type":"ContainerDied","Data":"9c37c4cbac202731d08296a3b8aaa6898e1efbcb646911de8e1484e943b0f26d"} Feb 02 11:40:25 crc kubenswrapper[4901]: I0202 11:40:25.820598 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m4lrm" event={"ID":"09f14f4f-0a46-41bb-8413-b45bc2033606","Type":"ContainerDied","Data":"de3f23f522c6927755d7caa353da0b5789a5034ab935e68fe48a21edd0fc9730"} Feb 02 11:40:25 crc kubenswrapper[4901]: I0202 11:40:25.820635 4901 scope.go:117] "RemoveContainer" containerID="9c37c4cbac202731d08296a3b8aaa6898e1efbcb646911de8e1484e943b0f26d" Feb 02 11:40:25 crc kubenswrapper[4901]: I0202 11:40:25.851948 4901 scope.go:117] "RemoveContainer" containerID="43dd45523e78aee28cdd5aa48ad0fba01bd617c9cbc09309bf7e932f3ead0ca5" Feb 02 11:40:25 crc kubenswrapper[4901]: I0202 11:40:25.855509 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m4lrm"] Feb 02 11:40:25 crc kubenswrapper[4901]: I0202 11:40:25.867622 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-m4lrm"] Feb 02 11:40:25 crc kubenswrapper[4901]: I0202 11:40:25.890939 4901 scope.go:117] "RemoveContainer" containerID="137883c4b49ed86d0a853a728d4bfef9e700a6c7bafdd25a6ba43bcc26493c2a" Feb 02 11:40:25 crc kubenswrapper[4901]: I0202 11:40:25.931950 4901 scope.go:117] "RemoveContainer" containerID="9c37c4cbac202731d08296a3b8aaa6898e1efbcb646911de8e1484e943b0f26d" Feb 02 11:40:25 crc kubenswrapper[4901]: E0202 11:40:25.932154 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c37c4cbac202731d08296a3b8aaa6898e1efbcb646911de8e1484e943b0f26d\": container with ID starting with 9c37c4cbac202731d08296a3b8aaa6898e1efbcb646911de8e1484e943b0f26d not found: ID does not exist" containerID="9c37c4cbac202731d08296a3b8aaa6898e1efbcb646911de8e1484e943b0f26d" Feb 02 11:40:25 crc kubenswrapper[4901]: I0202 11:40:25.932190 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c37c4cbac202731d08296a3b8aaa6898e1efbcb646911de8e1484e943b0f26d"} err="failed to get container status \"9c37c4cbac202731d08296a3b8aaa6898e1efbcb646911de8e1484e943b0f26d\": rpc error: code = NotFound desc = could not find container \"9c37c4cbac202731d08296a3b8aaa6898e1efbcb646911de8e1484e943b0f26d\": container with ID starting with 9c37c4cbac202731d08296a3b8aaa6898e1efbcb646911de8e1484e943b0f26d not found: ID does not exist" Feb 02 11:40:25 crc kubenswrapper[4901]: I0202 11:40:25.932220 4901 scope.go:117] "RemoveContainer" containerID="43dd45523e78aee28cdd5aa48ad0fba01bd617c9cbc09309bf7e932f3ead0ca5" Feb 02 11:40:25 crc kubenswrapper[4901]: E0202 11:40:25.932397 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43dd45523e78aee28cdd5aa48ad0fba01bd617c9cbc09309bf7e932f3ead0ca5\": container with ID starting with 43dd45523e78aee28cdd5aa48ad0fba01bd617c9cbc09309bf7e932f3ead0ca5 not found: ID does not exist" containerID="43dd45523e78aee28cdd5aa48ad0fba01bd617c9cbc09309bf7e932f3ead0ca5" Feb 02 11:40:25 crc kubenswrapper[4901]: I0202 11:40:25.932417 4901 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43dd45523e78aee28cdd5aa48ad0fba01bd617c9cbc09309bf7e932f3ead0ca5"} err="failed to get container status \"43dd45523e78aee28cdd5aa48ad0fba01bd617c9cbc09309bf7e932f3ead0ca5\": rpc error: code = NotFound desc = could not find container \"43dd45523e78aee28cdd5aa48ad0fba01bd617c9cbc09309bf7e932f3ead0ca5\": container with ID starting with 43dd45523e78aee28cdd5aa48ad0fba01bd617c9cbc09309bf7e932f3ead0ca5 not found: ID does not exist" Feb 02 11:40:25 crc kubenswrapper[4901]: I0202 11:40:25.932433 4901 scope.go:117] "RemoveContainer" containerID="137883c4b49ed86d0a853a728d4bfef9e700a6c7bafdd25a6ba43bcc26493c2a" Feb 02 11:40:25 crc kubenswrapper[4901]: E0202 11:40:25.932655 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"137883c4b49ed86d0a853a728d4bfef9e700a6c7bafdd25a6ba43bcc26493c2a\": container with ID starting with 137883c4b49ed86d0a853a728d4bfef9e700a6c7bafdd25a6ba43bcc26493c2a not found: ID does not exist" containerID="137883c4b49ed86d0a853a728d4bfef9e700a6c7bafdd25a6ba43bcc26493c2a" Feb 02 11:40:25 crc kubenswrapper[4901]: I0202 11:40:25.932679 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"137883c4b49ed86d0a853a728d4bfef9e700a6c7bafdd25a6ba43bcc26493c2a"} err="failed to get container status \"137883c4b49ed86d0a853a728d4bfef9e700a6c7bafdd25a6ba43bcc26493c2a\": rpc error: code = NotFound desc = could not find container \"137883c4b49ed86d0a853a728d4bfef9e700a6c7bafdd25a6ba43bcc26493c2a\": container with ID starting with 137883c4b49ed86d0a853a728d4bfef9e700a6c7bafdd25a6ba43bcc26493c2a not found: ID does not exist" Feb 02 11:40:27 crc kubenswrapper[4901]: I0202 11:40:27.689786 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09f14f4f-0a46-41bb-8413-b45bc2033606" path="/var/lib/kubelet/pods/09f14f4f-0a46-41bb-8413-b45bc2033606/volumes" Feb 02 11:41:14 crc kubenswrapper[4901]: I0202 11:41:14.656687 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jssct"] Feb 02 11:41:14 crc kubenswrapper[4901]: E0202 11:41:14.660158 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09f14f4f-0a46-41bb-8413-b45bc2033606" containerName="registry-server" Feb 02 11:41:14 crc kubenswrapper[4901]: I0202 11:41:14.660245 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="09f14f4f-0a46-41bb-8413-b45bc2033606" containerName="registry-server" Feb 02 11:41:14 crc kubenswrapper[4901]: E0202 11:41:14.660301 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09f14f4f-0a46-41bb-8413-b45bc2033606" containerName="extract-utilities" Feb 02 11:41:14 crc kubenswrapper[4901]: I0202 11:41:14.660351 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="09f14f4f-0a46-41bb-8413-b45bc2033606" containerName="extract-utilities" Feb 02 11:41:14 crc kubenswrapper[4901]: E0202 11:41:14.660448 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09f14f4f-0a46-41bb-8413-b45bc2033606" containerName="extract-content" Feb 02 11:41:14 crc kubenswrapper[4901]: I0202 11:41:14.660504 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="09f14f4f-0a46-41bb-8413-b45bc2033606" containerName="extract-content" Feb 02 11:41:14 crc kubenswrapper[4901]: I0202 11:41:14.660761 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="09f14f4f-0a46-41bb-8413-b45bc2033606" 
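Each "ContainerStatus from runtime service failed" / "DeleteContainer returned error" pair above appears to be a benign ordering artifact of cleanup: the kubelet has just removed the container, so a follow-up status lookup for the same ID gets gRPC NotFound back from the runtime. A sketch of how such an error is recognized in Go (assumes google.golang.org/grpc; not kubelet source):

```go
package main

import (
	"errors"
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// isAlreadyGone reports whether err is a gRPC NotFound, i.e. the runtime
// no longer knows the container ID -- expected when the container was
// already deleted by an earlier cleanup step.
func isAlreadyGone(err error) bool {
	s, ok := status.FromError(err)
	return ok && s.Code() == codes.NotFound
}

func main() {
	notFound := status.Error(codes.NotFound, "could not find container")
	fmt.Println(isAlreadyGone(notFound))           // true
	fmt.Println(isAlreadyGone(errors.New("boom"))) // false
}
```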
containerName="registry-server" Feb 02 11:41:14 crc kubenswrapper[4901]: I0202 11:41:14.662388 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jssct" Feb 02 11:41:14 crc kubenswrapper[4901]: I0202 11:41:14.690296 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jssct"] Feb 02 11:41:14 crc kubenswrapper[4901]: I0202 11:41:14.751089 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5fb53d5-b3bf-4018-8a67-0250b54a44fc-utilities\") pod \"community-operators-jssct\" (UID: \"a5fb53d5-b3bf-4018-8a67-0250b54a44fc\") " pod="openshift-marketplace/community-operators-jssct" Feb 02 11:41:14 crc kubenswrapper[4901]: I0202 11:41:14.751527 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cxhl\" (UniqueName: \"kubernetes.io/projected/a5fb53d5-b3bf-4018-8a67-0250b54a44fc-kube-api-access-4cxhl\") pod \"community-operators-jssct\" (UID: \"a5fb53d5-b3bf-4018-8a67-0250b54a44fc\") " pod="openshift-marketplace/community-operators-jssct" Feb 02 11:41:14 crc kubenswrapper[4901]: I0202 11:41:14.752011 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5fb53d5-b3bf-4018-8a67-0250b54a44fc-catalog-content\") pod \"community-operators-jssct\" (UID: \"a5fb53d5-b3bf-4018-8a67-0250b54a44fc\") " pod="openshift-marketplace/community-operators-jssct" Feb 02 11:41:14 crc kubenswrapper[4901]: I0202 11:41:14.855206 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5fb53d5-b3bf-4018-8a67-0250b54a44fc-utilities\") pod \"community-operators-jssct\" (UID: \"a5fb53d5-b3bf-4018-8a67-0250b54a44fc\") " pod="openshift-marketplace/community-operators-jssct" Feb 02 11:41:14 crc kubenswrapper[4901]: I0202 11:41:14.855340 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cxhl\" (UniqueName: \"kubernetes.io/projected/a5fb53d5-b3bf-4018-8a67-0250b54a44fc-kube-api-access-4cxhl\") pod \"community-operators-jssct\" (UID: \"a5fb53d5-b3bf-4018-8a67-0250b54a44fc\") " pod="openshift-marketplace/community-operators-jssct" Feb 02 11:41:14 crc kubenswrapper[4901]: I0202 11:41:14.855410 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5fb53d5-b3bf-4018-8a67-0250b54a44fc-catalog-content\") pod \"community-operators-jssct\" (UID: \"a5fb53d5-b3bf-4018-8a67-0250b54a44fc\") " pod="openshift-marketplace/community-operators-jssct" Feb 02 11:41:14 crc kubenswrapper[4901]: I0202 11:41:14.855863 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5fb53d5-b3bf-4018-8a67-0250b54a44fc-utilities\") pod \"community-operators-jssct\" (UID: \"a5fb53d5-b3bf-4018-8a67-0250b54a44fc\") " pod="openshift-marketplace/community-operators-jssct" Feb 02 11:41:14 crc kubenswrapper[4901]: I0202 11:41:14.855957 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5fb53d5-b3bf-4018-8a67-0250b54a44fc-catalog-content\") pod \"community-operators-jssct\" (UID: \"a5fb53d5-b3bf-4018-8a67-0250b54a44fc\") " 
pod="openshift-marketplace/community-operators-jssct" Feb 02 11:41:14 crc kubenswrapper[4901]: I0202 11:41:14.887254 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cxhl\" (UniqueName: \"kubernetes.io/projected/a5fb53d5-b3bf-4018-8a67-0250b54a44fc-kube-api-access-4cxhl\") pod \"community-operators-jssct\" (UID: \"a5fb53d5-b3bf-4018-8a67-0250b54a44fc\") " pod="openshift-marketplace/community-operators-jssct" Feb 02 11:41:15 crc kubenswrapper[4901]: I0202 11:41:15.005240 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jssct" Feb 02 11:41:15 crc kubenswrapper[4901]: I0202 11:41:15.592881 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jssct"] Feb 02 11:41:16 crc kubenswrapper[4901]: I0202 11:41:16.375256 4901 generic.go:334] "Generic (PLEG): container finished" podID="a5fb53d5-b3bf-4018-8a67-0250b54a44fc" containerID="e8094347e698969a536a441abf11509db9b0ae12e701b5966af5bc926f9c1cb0" exitCode=0 Feb 02 11:41:16 crc kubenswrapper[4901]: I0202 11:41:16.375313 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jssct" event={"ID":"a5fb53d5-b3bf-4018-8a67-0250b54a44fc","Type":"ContainerDied","Data":"e8094347e698969a536a441abf11509db9b0ae12e701b5966af5bc926f9c1cb0"} Feb 02 11:41:16 crc kubenswrapper[4901]: I0202 11:41:16.375620 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jssct" event={"ID":"a5fb53d5-b3bf-4018-8a67-0250b54a44fc","Type":"ContainerStarted","Data":"b0e5f7ef5ab78a81e9dec0138e66a3b20b6203e27c137e999d82c75cb240ff92"} Feb 02 11:41:18 crc kubenswrapper[4901]: I0202 11:41:18.420038 4901 generic.go:334] "Generic (PLEG): container finished" podID="a5fb53d5-b3bf-4018-8a67-0250b54a44fc" containerID="4674df3ed42fceb5b44d943806ad533cc5b541be5571f811cb3b13ef6bad8b09" exitCode=0 Feb 02 11:41:18 crc kubenswrapper[4901]: I0202 11:41:18.420248 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jssct" event={"ID":"a5fb53d5-b3bf-4018-8a67-0250b54a44fc","Type":"ContainerDied","Data":"4674df3ed42fceb5b44d943806ad533cc5b541be5571f811cb3b13ef6bad8b09"} Feb 02 11:41:19 crc kubenswrapper[4901]: I0202 11:41:19.435483 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jssct" event={"ID":"a5fb53d5-b3bf-4018-8a67-0250b54a44fc","Type":"ContainerStarted","Data":"de5c44bbd73986c611a177727456c804b901e043877a01d15f7c7d027631d252"} Feb 02 11:41:19 crc kubenswrapper[4901]: I0202 11:41:19.459071 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jssct" podStartSLOduration=2.831543456 podStartE2EDuration="5.459045417s" podCreationTimestamp="2026-02-02 11:41:14 +0000 UTC" firstStartedPulling="2026-02-02 11:41:16.378397403 +0000 UTC m=+3763.396737499" lastFinishedPulling="2026-02-02 11:41:19.005899344 +0000 UTC m=+3766.024239460" observedRunningTime="2026-02-02 11:41:19.454576789 +0000 UTC m=+3766.472916885" watchObservedRunningTime="2026-02-02 11:41:19.459045417 +0000 UTC m=+3766.477385513" Feb 02 11:41:25 crc kubenswrapper[4901]: I0202 11:41:25.005980 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jssct" Feb 02 11:41:25 crc kubenswrapper[4901]: I0202 11:41:25.006833 4901 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-jssct" Feb 02 11:41:25 crc kubenswrapper[4901]: I0202 11:41:25.066665 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jssct" Feb 02 11:41:25 crc kubenswrapper[4901]: I0202 11:41:25.553123 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jssct" Feb 02 11:41:25 crc kubenswrapper[4901]: I0202 11:41:25.626230 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jssct"] Feb 02 11:41:27 crc kubenswrapper[4901]: I0202 11:41:27.516182 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jssct" podUID="a5fb53d5-b3bf-4018-8a67-0250b54a44fc" containerName="registry-server" containerID="cri-o://de5c44bbd73986c611a177727456c804b901e043877a01d15f7c7d027631d252" gracePeriod=2 Feb 02 11:41:28 crc kubenswrapper[4901]: I0202 11:41:28.003793 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jssct" Feb 02 11:41:28 crc kubenswrapper[4901]: I0202 11:41:28.169906 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cxhl\" (UniqueName: \"kubernetes.io/projected/a5fb53d5-b3bf-4018-8a67-0250b54a44fc-kube-api-access-4cxhl\") pod \"a5fb53d5-b3bf-4018-8a67-0250b54a44fc\" (UID: \"a5fb53d5-b3bf-4018-8a67-0250b54a44fc\") " Feb 02 11:41:28 crc kubenswrapper[4901]: I0202 11:41:28.170234 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5fb53d5-b3bf-4018-8a67-0250b54a44fc-catalog-content\") pod \"a5fb53d5-b3bf-4018-8a67-0250b54a44fc\" (UID: \"a5fb53d5-b3bf-4018-8a67-0250b54a44fc\") " Feb 02 11:41:28 crc kubenswrapper[4901]: I0202 11:41:28.170307 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5fb53d5-b3bf-4018-8a67-0250b54a44fc-utilities\") pod \"a5fb53d5-b3bf-4018-8a67-0250b54a44fc\" (UID: \"a5fb53d5-b3bf-4018-8a67-0250b54a44fc\") " Feb 02 11:41:28 crc kubenswrapper[4901]: I0202 11:41:28.171239 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5fb53d5-b3bf-4018-8a67-0250b54a44fc-utilities" (OuterVolumeSpecName: "utilities") pod "a5fb53d5-b3bf-4018-8a67-0250b54a44fc" (UID: "a5fb53d5-b3bf-4018-8a67-0250b54a44fc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:41:28 crc kubenswrapper[4901]: I0202 11:41:28.189012 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5fb53d5-b3bf-4018-8a67-0250b54a44fc-kube-api-access-4cxhl" (OuterVolumeSpecName: "kube-api-access-4cxhl") pod "a5fb53d5-b3bf-4018-8a67-0250b54a44fc" (UID: "a5fb53d5-b3bf-4018-8a67-0250b54a44fc"). InnerVolumeSpecName "kube-api-access-4cxhl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:41:28 crc kubenswrapper[4901]: I0202 11:41:28.273555 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cxhl\" (UniqueName: \"kubernetes.io/projected/a5fb53d5-b3bf-4018-8a67-0250b54a44fc-kube-api-access-4cxhl\") on node \"crc\" DevicePath \"\"" Feb 02 11:41:28 crc kubenswrapper[4901]: I0202 11:41:28.273625 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5fb53d5-b3bf-4018-8a67-0250b54a44fc-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:41:28 crc kubenswrapper[4901]: I0202 11:41:28.439676 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5fb53d5-b3bf-4018-8a67-0250b54a44fc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a5fb53d5-b3bf-4018-8a67-0250b54a44fc" (UID: "a5fb53d5-b3bf-4018-8a67-0250b54a44fc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:41:28 crc kubenswrapper[4901]: I0202 11:41:28.476995 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5fb53d5-b3bf-4018-8a67-0250b54a44fc-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:41:28 crc kubenswrapper[4901]: I0202 11:41:28.527845 4901 generic.go:334] "Generic (PLEG): container finished" podID="a5fb53d5-b3bf-4018-8a67-0250b54a44fc" containerID="de5c44bbd73986c611a177727456c804b901e043877a01d15f7c7d027631d252" exitCode=0 Feb 02 11:41:28 crc kubenswrapper[4901]: I0202 11:41:28.527928 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jssct" Feb 02 11:41:28 crc kubenswrapper[4901]: I0202 11:41:28.527938 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jssct" event={"ID":"a5fb53d5-b3bf-4018-8a67-0250b54a44fc","Type":"ContainerDied","Data":"de5c44bbd73986c611a177727456c804b901e043877a01d15f7c7d027631d252"} Feb 02 11:41:28 crc kubenswrapper[4901]: I0202 11:41:28.528044 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jssct" event={"ID":"a5fb53d5-b3bf-4018-8a67-0250b54a44fc","Type":"ContainerDied","Data":"b0e5f7ef5ab78a81e9dec0138e66a3b20b6203e27c137e999d82c75cb240ff92"} Feb 02 11:41:28 crc kubenswrapper[4901]: I0202 11:41:28.528077 4901 scope.go:117] "RemoveContainer" containerID="de5c44bbd73986c611a177727456c804b901e043877a01d15f7c7d027631d252" Feb 02 11:41:28 crc kubenswrapper[4901]: I0202 11:41:28.563252 4901 scope.go:117] "RemoveContainer" containerID="4674df3ed42fceb5b44d943806ad533cc5b541be5571f811cb3b13ef6bad8b09" Feb 02 11:41:28 crc kubenswrapper[4901]: I0202 11:41:28.568871 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jssct"] Feb 02 11:41:28 crc kubenswrapper[4901]: I0202 11:41:28.581881 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jssct"] Feb 02 11:41:28 crc kubenswrapper[4901]: I0202 11:41:28.589018 4901 scope.go:117] "RemoveContainer" containerID="e8094347e698969a536a441abf11509db9b0ae12e701b5966af5bc926f9c1cb0" Feb 02 11:41:28 crc kubenswrapper[4901]: I0202 11:41:28.630412 4901 scope.go:117] "RemoveContainer" containerID="de5c44bbd73986c611a177727456c804b901e043877a01d15f7c7d027631d252" Feb 02 11:41:28 crc kubenswrapper[4901]: E0202 11:41:28.631027 4901 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de5c44bbd73986c611a177727456c804b901e043877a01d15f7c7d027631d252\": container with ID starting with de5c44bbd73986c611a177727456c804b901e043877a01d15f7c7d027631d252 not found: ID does not exist" containerID="de5c44bbd73986c611a177727456c804b901e043877a01d15f7c7d027631d252" Feb 02 11:41:28 crc kubenswrapper[4901]: I0202 11:41:28.631064 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de5c44bbd73986c611a177727456c804b901e043877a01d15f7c7d027631d252"} err="failed to get container status \"de5c44bbd73986c611a177727456c804b901e043877a01d15f7c7d027631d252\": rpc error: code = NotFound desc = could not find container \"de5c44bbd73986c611a177727456c804b901e043877a01d15f7c7d027631d252\": container with ID starting with de5c44bbd73986c611a177727456c804b901e043877a01d15f7c7d027631d252 not found: ID does not exist" Feb 02 11:41:28 crc kubenswrapper[4901]: I0202 11:41:28.631086 4901 scope.go:117] "RemoveContainer" containerID="4674df3ed42fceb5b44d943806ad533cc5b541be5571f811cb3b13ef6bad8b09" Feb 02 11:41:28 crc kubenswrapper[4901]: E0202 11:41:28.631320 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4674df3ed42fceb5b44d943806ad533cc5b541be5571f811cb3b13ef6bad8b09\": container with ID starting with 4674df3ed42fceb5b44d943806ad533cc5b541be5571f811cb3b13ef6bad8b09 not found: ID does not exist" containerID="4674df3ed42fceb5b44d943806ad533cc5b541be5571f811cb3b13ef6bad8b09" Feb 02 11:41:28 crc kubenswrapper[4901]: I0202 11:41:28.631347 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4674df3ed42fceb5b44d943806ad533cc5b541be5571f811cb3b13ef6bad8b09"} err="failed to get container status \"4674df3ed42fceb5b44d943806ad533cc5b541be5571f811cb3b13ef6bad8b09\": rpc error: code = NotFound desc = could not find container \"4674df3ed42fceb5b44d943806ad533cc5b541be5571f811cb3b13ef6bad8b09\": container with ID starting with 4674df3ed42fceb5b44d943806ad533cc5b541be5571f811cb3b13ef6bad8b09 not found: ID does not exist" Feb 02 11:41:28 crc kubenswrapper[4901]: I0202 11:41:28.631359 4901 scope.go:117] "RemoveContainer" containerID="e8094347e698969a536a441abf11509db9b0ae12e701b5966af5bc926f9c1cb0" Feb 02 11:41:28 crc kubenswrapper[4901]: E0202 11:41:28.631914 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8094347e698969a536a441abf11509db9b0ae12e701b5966af5bc926f9c1cb0\": container with ID starting with e8094347e698969a536a441abf11509db9b0ae12e701b5966af5bc926f9c1cb0 not found: ID does not exist" containerID="e8094347e698969a536a441abf11509db9b0ae12e701b5966af5bc926f9c1cb0" Feb 02 11:41:28 crc kubenswrapper[4901]: I0202 11:41:28.631937 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8094347e698969a536a441abf11509db9b0ae12e701b5966af5bc926f9c1cb0"} err="failed to get container status \"e8094347e698969a536a441abf11509db9b0ae12e701b5966af5bc926f9c1cb0\": rpc error: code = NotFound desc = could not find container \"e8094347e698969a536a441abf11509db9b0ae12e701b5966af5bc926f9c1cb0\": container with ID starting with e8094347e698969a536a441abf11509db9b0ae12e701b5966af5bc926f9c1cb0 not found: ID does not exist" Feb 02 11:41:29 crc kubenswrapper[4901]: I0202 11:41:29.692610 4901 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="a5fb53d5-b3bf-4018-8a67-0250b54a44fc" path="/var/lib/kubelet/pods/a5fb53d5-b3bf-4018-8a67-0250b54a44fc/volumes" Feb 02 11:41:37 crc kubenswrapper[4901]: I0202 11:41:37.837455 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:41:37 crc kubenswrapper[4901]: I0202 11:41:37.838424 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:42:07 crc kubenswrapper[4901]: I0202 11:42:07.837479 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:42:07 crc kubenswrapper[4901]: I0202 11:42:07.838275 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:42:29 crc kubenswrapper[4901]: I0202 11:42:29.961431 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-84dbcd4d6-strlk_9c2b3b8f-1088-4696-899e-d0d5c3f2bf2d/manager/0.log" Feb 02 11:42:33 crc kubenswrapper[4901]: I0202 11:42:33.599099 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 02 11:42:33 crc kubenswrapper[4901]: I0202 11:42:33.600843 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="dace040f-2ecc-4429-8a62-187c719781bc" containerName="prometheus" containerID="cri-o://004dce19b1d6b44ebfe04d914c139900b5ed50a7fc36eb317cc6a90ed27b7c72" gracePeriod=600 Feb 02 11:42:33 crc kubenswrapper[4901]: I0202 11:42:33.601359 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="dace040f-2ecc-4429-8a62-187c719781bc" containerName="thanos-sidecar" containerID="cri-o://27f8ed051508e4a2b487b45d1ea8a8c1f079961cc586fa4477ca4c57955eacf4" gracePeriod=600 Feb 02 11:42:33 crc kubenswrapper[4901]: I0202 11:42:33.601786 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="dace040f-2ecc-4429-8a62-187c719781bc" containerName="config-reloader" containerID="cri-o://52a12ab497d25afee14f76273934d866a47cd5cadcffe186af735598ea081bd8" gracePeriod=600 Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.228298 4901 generic.go:334] "Generic (PLEG): container finished" podID="dace040f-2ecc-4429-8a62-187c719781bc" containerID="27f8ed051508e4a2b487b45d1ea8a8c1f079961cc586fa4477ca4c57955eacf4" exitCode=0 Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.229041 4901 generic.go:334] "Generic (PLEG): container finished" 
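The two Liveness failures above are plain HTTP GETs against the machine-config-daemon's health endpoint; "connection refused" means nothing was listening on 127.0.0.1:8798 at probe time, as opposed to a slow or non-2xx response. An equivalent check in Go (the 1s timeout is the kubelet's default probe timeout, assumed here rather than read from this log):

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

func main() {
	// Same target as the probe in the log above.
	client := &http.Client{Timeout: 1 * time.Second}
	resp, err := client.Get("http://127.0.0.1:8798/health")
	if err != nil {
		// The logged case: dial tcp 127.0.0.1:8798: connect: connection refused
		fmt.Println("Probe failed:", err)
		return
	}
	defer resp.Body.Close()
	// An HTTP probe succeeds on any status in [200, 400).
	if resp.StatusCode >= 200 && resp.StatusCode < 400 {
		fmt.Println("Probe succeeded:", resp.Status)
	} else {
		fmt.Println("Probe failed:", resp.Status)
	}
}
```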
podID="dace040f-2ecc-4429-8a62-187c719781bc" containerID="52a12ab497d25afee14f76273934d866a47cd5cadcffe186af735598ea081bd8" exitCode=0 Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.229120 4901 generic.go:334] "Generic (PLEG): container finished" podID="dace040f-2ecc-4429-8a62-187c719781bc" containerID="004dce19b1d6b44ebfe04d914c139900b5ed50a7fc36eb317cc6a90ed27b7c72" exitCode=0 Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.229200 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"dace040f-2ecc-4429-8a62-187c719781bc","Type":"ContainerDied","Data":"27f8ed051508e4a2b487b45d1ea8a8c1f079961cc586fa4477ca4c57955eacf4"} Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.229287 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"dace040f-2ecc-4429-8a62-187c719781bc","Type":"ContainerDied","Data":"52a12ab497d25afee14f76273934d866a47cd5cadcffe186af735598ea081bd8"} Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.229373 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"dace040f-2ecc-4429-8a62-187c719781bc","Type":"ContainerDied","Data":"004dce19b1d6b44ebfe04d914c139900b5ed50a7fc36eb317cc6a90ed27b7c72"} Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.753182 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.756691 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/dace040f-2ecc-4429-8a62-187c719781bc-prometheus-metric-storage-rulefiles-1\") pod \"dace040f-2ecc-4429-8a62-187c719781bc\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.757109 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dace040f-2ecc-4429-8a62-187c719781bc-web-config\") pod \"dace040f-2ecc-4429-8a62-187c719781bc\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.757152 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9f98\" (UniqueName: \"kubernetes.io/projected/dace040f-2ecc-4429-8a62-187c719781bc-kube-api-access-w9f98\") pod \"dace040f-2ecc-4429-8a62-187c719781bc\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.757171 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dace040f-2ecc-4429-8a62-187c719781bc-config\") pod \"dace040f-2ecc-4429-8a62-187c719781bc\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.757766 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dace040f-2ecc-4429-8a62-187c719781bc-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "dace040f-2ecc-4429-8a62-187c719781bc" (UID: "dace040f-2ecc-4429-8a62-187c719781bc"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.758433 4901 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/dace040f-2ecc-4429-8a62-187c719781bc-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.763777 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dace040f-2ecc-4429-8a62-187c719781bc-kube-api-access-w9f98" (OuterVolumeSpecName: "kube-api-access-w9f98") pod "dace040f-2ecc-4429-8a62-187c719781bc" (UID: "dace040f-2ecc-4429-8a62-187c719781bc"). InnerVolumeSpecName "kube-api-access-w9f98". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.764271 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dace040f-2ecc-4429-8a62-187c719781bc-config" (OuterVolumeSpecName: "config") pod "dace040f-2ecc-4429-8a62-187c719781bc" (UID: "dace040f-2ecc-4429-8a62-187c719781bc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.859541 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/dace040f-2ecc-4429-8a62-187c719781bc-prometheus-metric-storage-rulefiles-0\") pod \"dace040f-2ecc-4429-8a62-187c719781bc\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.859686 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/dace040f-2ecc-4429-8a62-187c719781bc-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"dace040f-2ecc-4429-8a62-187c719781bc\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.859739 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dace040f-2ecc-4429-8a62-187c719781bc-config-out\") pod \"dace040f-2ecc-4429-8a62-187c719781bc\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.859811 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/dace040f-2ecc-4429-8a62-187c719781bc-thanos-prometheus-http-client-file\") pod \"dace040f-2ecc-4429-8a62-187c719781bc\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.859839 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/dace040f-2ecc-4429-8a62-187c719781bc-prometheus-metric-storage-rulefiles-2\") pod \"dace040f-2ecc-4429-8a62-187c719781bc\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.859904 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/dace040f-2ecc-4429-8a62-187c719781bc-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"dace040f-2ecc-4429-8a62-187c719781bc\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.859940 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dace040f-2ecc-4429-8a62-187c719781bc-secret-combined-ca-bundle\") pod \"dace040f-2ecc-4429-8a62-187c719781bc\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.859987 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/dace040f-2ecc-4429-8a62-187c719781bc-tls-assets\") pod \"dace040f-2ecc-4429-8a62-187c719781bc\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.860114 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/dace040f-2ecc-4429-8a62-187c719781bc-prometheus-metric-storage-db\") pod \"dace040f-2ecc-4429-8a62-187c719781bc\" (UID: \"dace040f-2ecc-4429-8a62-187c719781bc\") " Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.860751 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dace040f-2ecc-4429-8a62-187c719781bc-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "dace040f-2ecc-4429-8a62-187c719781bc" (UID: "dace040f-2ecc-4429-8a62-187c719781bc"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.860837 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9f98\" (UniqueName: \"kubernetes.io/projected/dace040f-2ecc-4429-8a62-187c719781bc-kube-api-access-w9f98\") on node \"crc\" DevicePath \"\"" Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.860859 4901 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/dace040f-2ecc-4429-8a62-187c719781bc-config\") on node \"crc\" DevicePath \"\"" Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.861334 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dace040f-2ecc-4429-8a62-187c719781bc-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "dace040f-2ecc-4429-8a62-187c719781bc" (UID: "dace040f-2ecc-4429-8a62-187c719781bc"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.861973 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dace040f-2ecc-4429-8a62-187c719781bc-prometheus-metric-storage-db" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "dace040f-2ecc-4429-8a62-187c719781bc" (UID: "dace040f-2ecc-4429-8a62-187c719781bc"). InnerVolumeSpecName "prometheus-metric-storage-db". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.863855 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dace040f-2ecc-4429-8a62-187c719781bc-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "dace040f-2ecc-4429-8a62-187c719781bc" (UID: "dace040f-2ecc-4429-8a62-187c719781bc"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.864488 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dace040f-2ecc-4429-8a62-187c719781bc-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "dace040f-2ecc-4429-8a62-187c719781bc" (UID: "dace040f-2ecc-4429-8a62-187c719781bc"). InnerVolumeSpecName "secret-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.865614 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dace040f-2ecc-4429-8a62-187c719781bc-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "dace040f-2ecc-4429-8a62-187c719781bc" (UID: "dace040f-2ecc-4429-8a62-187c719781bc"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.867137 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dace040f-2ecc-4429-8a62-187c719781bc-config-out" (OuterVolumeSpecName: "config-out") pod "dace040f-2ecc-4429-8a62-187c719781bc" (UID: "dace040f-2ecc-4429-8a62-187c719781bc"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.868006 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dace040f-2ecc-4429-8a62-187c719781bc-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "dace040f-2ecc-4429-8a62-187c719781bc" (UID: "dace040f-2ecc-4429-8a62-187c719781bc"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.868602 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dace040f-2ecc-4429-8a62-187c719781bc-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "dace040f-2ecc-4429-8a62-187c719781bc" (UID: "dace040f-2ecc-4429-8a62-187c719781bc"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.888891 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dace040f-2ecc-4429-8a62-187c719781bc-web-config" (OuterVolumeSpecName: "web-config") pod "dace040f-2ecc-4429-8a62-187c719781bc" (UID: "dace040f-2ecc-4429-8a62-187c719781bc"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.962434 4901 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/dace040f-2ecc-4429-8a62-187c719781bc-prometheus-metric-storage-db\") on node \"crc\" DevicePath \"\"" Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.962533 4901 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dace040f-2ecc-4429-8a62-187c719781bc-web-config\") on node \"crc\" DevicePath \"\"" Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.962581 4901 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/dace040f-2ecc-4429-8a62-187c719781bc-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.962605 4901 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/dace040f-2ecc-4429-8a62-187c719781bc-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.962632 4901 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dace040f-2ecc-4429-8a62-187c719781bc-config-out\") on node \"crc\" DevicePath \"\"" Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.962655 4901 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/dace040f-2ecc-4429-8a62-187c719781bc-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.962675 4901 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/dace040f-2ecc-4429-8a62-187c719781bc-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.962694 4901 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/dace040f-2ecc-4429-8a62-187c719781bc-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.962714 4901 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dace040f-2ecc-4429-8a62-187c719781bc-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 11:42:34 crc kubenswrapper[4901]: I0202 11:42:34.962732 4901 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/dace040f-2ecc-4429-8a62-187c719781bc-tls-assets\") on node \"crc\" DevicePath \"\"" Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.246702 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"dace040f-2ecc-4429-8a62-187c719781bc","Type":"ContainerDied","Data":"e745e96d203c63b3fe8bff3f87fcc5606140b8099a4c3e5c9c5b6018bf77140b"} Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.246766 4901 util.go:48] "No ready sandbox for pod can be found. 
Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.246702 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"dace040f-2ecc-4429-8a62-187c719781bc","Type":"ContainerDied","Data":"e745e96d203c63b3fe8bff3f87fcc5606140b8099a4c3e5c9c5b6018bf77140b"}
Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.246766 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.246811 4901 scope.go:117] "RemoveContainer" containerID="27f8ed051508e4a2b487b45d1ea8a8c1f079961cc586fa4477ca4c57955eacf4"
Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.290780 4901 scope.go:117] "RemoveContainer" containerID="52a12ab497d25afee14f76273934d866a47cd5cadcffe186af735598ea081bd8"
Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.293769 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.308597 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.320337 4901 scope.go:117] "RemoveContainer" containerID="004dce19b1d6b44ebfe04d914c139900b5ed50a7fc36eb317cc6a90ed27b7c72"
Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.336504 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"]
Feb 02 11:42:35 crc kubenswrapper[4901]: E0202 11:42:35.337168 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dace040f-2ecc-4429-8a62-187c719781bc" containerName="prometheus"
Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.337197 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="dace040f-2ecc-4429-8a62-187c719781bc" containerName="prometheus"
Feb 02 11:42:35 crc kubenswrapper[4901]: E0202 11:42:35.337212 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5fb53d5-b3bf-4018-8a67-0250b54a44fc" containerName="registry-server"
Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.337221 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5fb53d5-b3bf-4018-8a67-0250b54a44fc" containerName="registry-server"
Feb 02 11:42:35 crc kubenswrapper[4901]: E0202 11:42:35.337240 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dace040f-2ecc-4429-8a62-187c719781bc" containerName="thanos-sidecar"
Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.337249 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="dace040f-2ecc-4429-8a62-187c719781bc" containerName="thanos-sidecar"
Feb 02 11:42:35 crc kubenswrapper[4901]: E0202 11:42:35.337271 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dace040f-2ecc-4429-8a62-187c719781bc" containerName="init-config-reloader"
Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.337279 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="dace040f-2ecc-4429-8a62-187c719781bc" containerName="init-config-reloader"
Feb 02 11:42:35 crc kubenswrapper[4901]: E0202 11:42:35.337286 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5fb53d5-b3bf-4018-8a67-0250b54a44fc" containerName="extract-content"
Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.337293 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5fb53d5-b3bf-4018-8a67-0250b54a44fc" containerName="extract-content"
Feb 02 11:42:35 crc kubenswrapper[4901]: E0202 11:42:35.337307 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dace040f-2ecc-4429-8a62-187c719781bc" containerName="config-reloader"
Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.337313 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="dace040f-2ecc-4429-8a62-187c719781bc" containerName="config-reloader"
Feb 02 11:42:35 crc kubenswrapper[4901]: E0202 11:42:35.337328 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5fb53d5-b3bf-4018-8a67-0250b54a44fc" containerName="extract-utilities"
Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.337334 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5fb53d5-b3bf-4018-8a67-0250b54a44fc" containerName="extract-utilities"
Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.337592 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5fb53d5-b3bf-4018-8a67-0250b54a44fc" containerName="registry-server"
Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.337626 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="dace040f-2ecc-4429-8a62-187c719781bc" containerName="thanos-sidecar"
Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.337641 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="dace040f-2ecc-4429-8a62-187c719781bc" containerName="prometheus"
Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.337652 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="dace040f-2ecc-4429-8a62-187c719781bc" containerName="config-reloader"
Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.339818 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.344737 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-kl9pb"
Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.345355 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1"
Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.345551 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config"
Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.345663 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2"
Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.345734 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc"
Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.345757 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage"
Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.346045 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file"
Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.356160 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0"
Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.357482 4901 scope.go:117] "RemoveContainer" containerID="02a2b9d039704382c3774f513fdae1b4a5a2e322c3db6cf9bacad654bd63116b"
Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.360716 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0"
Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.371108 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2632cf76-38c8-44ca-8ac3-2dd6e635fcdb-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.371150 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2632cf76-38c8-44ca-8ac3-2dd6e635fcdb-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.371177 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/2632cf76-38c8-44ca-8ac3-2dd6e635fcdb-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.371278 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2632cf76-38c8-44ca-8ac3-2dd6e635fcdb-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.371308 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/2632cf76-38c8-44ca-8ac3-2dd6e635fcdb-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.371330 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2632cf76-38c8-44ca-8ac3-2dd6e635fcdb-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.371350 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2632cf76-38c8-44ca-8ac3-2dd6e635fcdb-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.371369 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/2632cf76-38c8-44ca-8ac3-2dd6e635fcdb-prometheus-metric-storage-db\") pod \"prometheus-metric-storage-0\" (UID: \"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.371403 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/2632cf76-38c8-44ca-8ac3-2dd6e635fcdb-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.371862 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/2632cf76-38c8-44ca-8ac3-2dd6e635fcdb-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.372226 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2632cf76-38c8-44ca-8ac3-2dd6e635fcdb-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.372268 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2632cf76-38c8-44ca-8ac3-2dd6e635fcdb-config\") pod \"prometheus-metric-storage-0\" (UID: \"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.372372 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzmjx\" (UniqueName: \"kubernetes.io/projected/2632cf76-38c8-44ca-8ac3-2dd6e635fcdb-kube-api-access-gzmjx\") pod \"prometheus-metric-storage-0\" (UID: \"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.388765 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.473785 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2632cf76-38c8-44ca-8ac3-2dd6e635fcdb-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.473853 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/2632cf76-38c8-44ca-8ac3-2dd6e635fcdb-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.473900 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2632cf76-38c8-44ca-8ac3-2dd6e635fcdb-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.473928 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2632cf76-38c8-44ca-8ac3-2dd6e635fcdb-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 
11:42:35.473960 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/2632cf76-38c8-44ca-8ac3-2dd6e635fcdb-prometheus-metric-storage-db\") pod \"prometheus-metric-storage-0\" (UID: \"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.474000 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/2632cf76-38c8-44ca-8ac3-2dd6e635fcdb-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.474058 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/2632cf76-38c8-44ca-8ac3-2dd6e635fcdb-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.474122 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2632cf76-38c8-44ca-8ac3-2dd6e635fcdb-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.474142 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2632cf76-38c8-44ca-8ac3-2dd6e635fcdb-config\") pod \"prometheus-metric-storage-0\" (UID: \"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.474170 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzmjx\" (UniqueName: \"kubernetes.io/projected/2632cf76-38c8-44ca-8ac3-2dd6e635fcdb-kube-api-access-gzmjx\") pod \"prometheus-metric-storage-0\" (UID: \"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.474217 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2632cf76-38c8-44ca-8ac3-2dd6e635fcdb-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.474240 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2632cf76-38c8-44ca-8ac3-2dd6e635fcdb-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.474273 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: 
\"kubernetes.io/configmap/2632cf76-38c8-44ca-8ac3-2dd6e635fcdb-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.475600 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/2632cf76-38c8-44ca-8ac3-2dd6e635fcdb-prometheus-metric-storage-db\") pod \"prometheus-metric-storage-0\" (UID: \"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.476624 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/2632cf76-38c8-44ca-8ac3-2dd6e635fcdb-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.479669 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/2632cf76-38c8-44ca-8ac3-2dd6e635fcdb-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.483371 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2632cf76-38c8-44ca-8ac3-2dd6e635fcdb-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.583152 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2632cf76-38c8-44ca-8ac3-2dd6e635fcdb-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.583550 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/2632cf76-38c8-44ca-8ac3-2dd6e635fcdb-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.584282 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2632cf76-38c8-44ca-8ac3-2dd6e635fcdb-config\") pod \"prometheus-metric-storage-0\" (UID: \"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.584921 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2632cf76-38c8-44ca-8ac3-2dd6e635fcdb-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb\") " 
pod="openstack/prometheus-metric-storage-0" Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.584603 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2632cf76-38c8-44ca-8ac3-2dd6e635fcdb-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.584397 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/2632cf76-38c8-44ca-8ac3-2dd6e635fcdb-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.585116 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2632cf76-38c8-44ca-8ac3-2dd6e635fcdb-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.587412 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2632cf76-38c8-44ca-8ac3-2dd6e635fcdb-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.591041 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzmjx\" (UniqueName: \"kubernetes.io/projected/2632cf76-38c8-44ca-8ac3-2dd6e635fcdb-kube-api-access-gzmjx\") pod \"prometheus-metric-storage-0\" (UID: \"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb\") " pod="openstack/prometheus-metric-storage-0" Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.676360 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 02 11:42:35 crc kubenswrapper[4901]: I0202 11:42:35.692127 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dace040f-2ecc-4429-8a62-187c719781bc" path="/var/lib/kubelet/pods/dace040f-2ecc-4429-8a62-187c719781bc/volumes" Feb 02 11:42:36 crc kubenswrapper[4901]: W0202 11:42:36.326358 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2632cf76_38c8_44ca_8ac3_2dd6e635fcdb.slice/crio-dfb44165d3db20bf1a2f39a33feaff4172b9b7ed644fa05d6a08d457b3e619e5 WatchSource:0}: Error finding container dfb44165d3db20bf1a2f39a33feaff4172b9b7ed644fa05d6a08d457b3e619e5: Status 404 returned error can't find the container with id dfb44165d3db20bf1a2f39a33feaff4172b9b7ed644fa05d6a08d457b3e619e5 Feb 02 11:42:36 crc kubenswrapper[4901]: I0202 11:42:36.328523 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 02 11:42:37 crc kubenswrapper[4901]: I0202 11:42:37.271656 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb","Type":"ContainerStarted","Data":"dfb44165d3db20bf1a2f39a33feaff4172b9b7ed644fa05d6a08d457b3e619e5"} Feb 02 11:42:37 crc kubenswrapper[4901]: I0202 11:42:37.837698 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:42:37 crc kubenswrapper[4901]: I0202 11:42:37.837762 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:42:37 crc kubenswrapper[4901]: I0202 11:42:37.837817 4901 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" Feb 02 11:42:37 crc kubenswrapper[4901]: I0202 11:42:37.838797 4901 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5b8ce7c16052cec2485b442e2fe3c9307bb04efecfe3e58c9fae3418e6341435"} pod="openshift-machine-config-operator/machine-config-daemon-f29d8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 11:42:37 crc kubenswrapper[4901]: I0202 11:42:37.838846 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" containerID="cri-o://5b8ce7c16052cec2485b442e2fe3c9307bb04efecfe3e58c9fae3418e6341435" gracePeriod=600 Feb 02 11:42:38 crc kubenswrapper[4901]: E0202 11:42:38.024112 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:42:38 crc kubenswrapper[4901]: I0202 11:42:38.285144 4901 generic.go:334] "Generic (PLEG): container finished" podID="756c113d-5d5e-424e-bdf5-494b7774def6" containerID="5b8ce7c16052cec2485b442e2fe3c9307bb04efecfe3e58c9fae3418e6341435" exitCode=0 Feb 02 11:42:38 crc kubenswrapper[4901]: I0202 11:42:38.285281 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" event={"ID":"756c113d-5d5e-424e-bdf5-494b7774def6","Type":"ContainerDied","Data":"5b8ce7c16052cec2485b442e2fe3c9307bb04efecfe3e58c9fae3418e6341435"} Feb 02 11:42:38 crc kubenswrapper[4901]: I0202 11:42:38.285407 4901 scope.go:117] "RemoveContainer" containerID="77ae632c1dc9582b47d9b1c6f2b3855d4f1bb67c07c8c24b776bdf2334353dd1" Feb 02 11:42:38 crc kubenswrapper[4901]: I0202 11:42:38.286452 4901 scope.go:117] "RemoveContainer" containerID="5b8ce7c16052cec2485b442e2fe3c9307bb04efecfe3e58c9fae3418e6341435" Feb 02 11:42:38 crc kubenswrapper[4901]: E0202 11:42:38.286840 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:42:41 crc kubenswrapper[4901]: I0202 11:42:41.322440 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb","Type":"ContainerStarted","Data":"b204f6f87446cb61115975e2603375c6a7cc9a3d4c7f4faa2a13fa29e976ccef"} Feb 02 11:42:49 crc kubenswrapper[4901]: I0202 11:42:49.427848 4901 generic.go:334] "Generic (PLEG): container finished" podID="2632cf76-38c8-44ca-8ac3-2dd6e635fcdb" containerID="b204f6f87446cb61115975e2603375c6a7cc9a3d4c7f4faa2a13fa29e976ccef" exitCode=0 Feb 02 11:42:49 crc kubenswrapper[4901]: I0202 11:42:49.427956 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb","Type":"ContainerDied","Data":"b204f6f87446cb61115975e2603375c6a7cc9a3d4c7f4faa2a13fa29e976ccef"} Feb 02 11:42:50 crc kubenswrapper[4901]: I0202 11:42:50.441361 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb","Type":"ContainerStarted","Data":"08b1999ec12fea22ead09df546f945557eac9fcdd13f0e55f0be3c03dd7def63"} Feb 02 11:42:53 crc kubenswrapper[4901]: I0202 11:42:53.688423 4901 scope.go:117] "RemoveContainer" containerID="5b8ce7c16052cec2485b442e2fe3c9307bb04efecfe3e58c9fae3418e6341435" Feb 02 11:42:53 crc kubenswrapper[4901]: E0202 11:42:53.689601 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:42:54 crc kubenswrapper[4901]: I0202 11:42:54.521026 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/prometheus-metric-storage-0" event={"ID":"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb","Type":"ContainerStarted","Data":"665ef960e6077d60c2c8d14137d08edd310904669cb506381bdc74904402c9ba"} Feb 02 11:42:54 crc kubenswrapper[4901]: I0202 11:42:54.521759 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2632cf76-38c8-44ca-8ac3-2dd6e635fcdb","Type":"ContainerStarted","Data":"489dbcc9ae653400b88fe82150d76fc4130b5780c258157671fa83356708b556"} Feb 02 11:42:54 crc kubenswrapper[4901]: I0202 11:42:54.550249 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=19.550224499 podStartE2EDuration="19.550224499s" podCreationTimestamp="2026-02-02 11:42:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 11:42:54.545426203 +0000 UTC m=+3861.563766319" watchObservedRunningTime="2026-02-02 11:42:54.550224499 +0000 UTC m=+3861.568564595" Feb 02 11:42:55 crc kubenswrapper[4901]: I0202 11:42:55.690678 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 02 11:43:04 crc kubenswrapper[4901]: I0202 11:43:04.677271 4901 scope.go:117] "RemoveContainer" containerID="5b8ce7c16052cec2485b442e2fe3c9307bb04efecfe3e58c9fae3418e6341435" Feb 02 11:43:04 crc kubenswrapper[4901]: E0202 11:43:04.678406 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:43:05 crc kubenswrapper[4901]: I0202 11:43:05.690498 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 02 11:43:05 crc kubenswrapper[4901]: I0202 11:43:05.690782 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 02 11:43:05 crc kubenswrapper[4901]: I0202 11:43:05.697988 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 02 11:43:15 crc kubenswrapper[4901]: I0202 11:43:15.677430 4901 scope.go:117] "RemoveContainer" containerID="5b8ce7c16052cec2485b442e2fe3c9307bb04efecfe3e58c9fae3418e6341435" Feb 02 11:43:15 crc kubenswrapper[4901]: E0202 11:43:15.678545 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:43:28 crc kubenswrapper[4901]: I0202 11:43:28.677498 4901 scope.go:117] "RemoveContainer" containerID="5b8ce7c16052cec2485b442e2fe3c9307bb04efecfe3e58c9fae3418e6341435" Feb 02 11:43:28 crc kubenswrapper[4901]: E0202 11:43:28.679017 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:43:42 crc kubenswrapper[4901]: I0202 11:43:42.679038 4901 scope.go:117] "RemoveContainer" containerID="5b8ce7c16052cec2485b442e2fe3c9307bb04efecfe3e58c9fae3418e6341435" Feb 02 11:43:42 crc kubenswrapper[4901]: E0202 11:43:42.680428 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:43:56 crc kubenswrapper[4901]: I0202 11:43:56.677615 4901 scope.go:117] "RemoveContainer" containerID="5b8ce7c16052cec2485b442e2fe3c9307bb04efecfe3e58c9fae3418e6341435" Feb 02 11:43:56 crc kubenswrapper[4901]: E0202 11:43:56.678610 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:44:10 crc kubenswrapper[4901]: I0202 11:44:10.677222 4901 scope.go:117] "RemoveContainer" containerID="5b8ce7c16052cec2485b442e2fe3c9307bb04efecfe3e58c9fae3418e6341435" Feb 02 11:44:10 crc kubenswrapper[4901]: E0202 11:44:10.681080 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:44:21 crc kubenswrapper[4901]: I0202 11:44:21.677317 4901 scope.go:117] "RemoveContainer" containerID="5b8ce7c16052cec2485b442e2fe3c9307bb04efecfe3e58c9fae3418e6341435" Feb 02 11:44:21 crc kubenswrapper[4901]: E0202 11:44:21.678358 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:44:34 crc kubenswrapper[4901]: I0202 11:44:34.677489 4901 scope.go:117] "RemoveContainer" containerID="5b8ce7c16052cec2485b442e2fe3c9307bb04efecfe3e58c9fae3418e6341435" Feb 02 11:44:34 crc kubenswrapper[4901]: E0202 11:44:34.678362 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:44:48 crc kubenswrapper[4901]: I0202 11:44:48.677473 4901 scope.go:117] "RemoveContainer" containerID="5b8ce7c16052cec2485b442e2fe3c9307bb04efecfe3e58c9fae3418e6341435" Feb 02 11:44:48 crc kubenswrapper[4901]: E0202 11:44:48.678720 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:45:00 crc kubenswrapper[4901]: I0202 11:45:00.224882 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500545-cl86s"] Feb 02 11:45:00 crc kubenswrapper[4901]: I0202 11:45:00.227317 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-cl86s" Feb 02 11:45:00 crc kubenswrapper[4901]: I0202 11:45:00.230409 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 02 11:45:00 crc kubenswrapper[4901]: I0202 11:45:00.230685 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 02 11:45:00 crc kubenswrapper[4901]: I0202 11:45:00.236386 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500545-cl86s"] Feb 02 11:45:00 crc kubenswrapper[4901]: I0202 11:45:00.372385 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t58mn\" (UniqueName: \"kubernetes.io/projected/87c1f865-3761-4230-a98e-450dbfd5446d-kube-api-access-t58mn\") pod \"collect-profiles-29500545-cl86s\" (UID: \"87c1f865-3761-4230-a98e-450dbfd5446d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-cl86s" Feb 02 11:45:00 crc kubenswrapper[4901]: I0202 11:45:00.372461 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87c1f865-3761-4230-a98e-450dbfd5446d-config-volume\") pod \"collect-profiles-29500545-cl86s\" (UID: \"87c1f865-3761-4230-a98e-450dbfd5446d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-cl86s" Feb 02 11:45:00 crc kubenswrapper[4901]: I0202 11:45:00.372914 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/87c1f865-3761-4230-a98e-450dbfd5446d-secret-volume\") pod \"collect-profiles-29500545-cl86s\" (UID: \"87c1f865-3761-4230-a98e-450dbfd5446d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-cl86s" Feb 02 11:45:00 crc kubenswrapper[4901]: I0202 11:45:00.476385 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t58mn\" (UniqueName: \"kubernetes.io/projected/87c1f865-3761-4230-a98e-450dbfd5446d-kube-api-access-t58mn\") pod \"collect-profiles-29500545-cl86s\" (UID: \"87c1f865-3761-4230-a98e-450dbfd5446d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-cl86s" 
Feb 02 11:45:00 crc kubenswrapper[4901]: I0202 11:45:00.476506 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87c1f865-3761-4230-a98e-450dbfd5446d-config-volume\") pod \"collect-profiles-29500545-cl86s\" (UID: \"87c1f865-3761-4230-a98e-450dbfd5446d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-cl86s" Feb 02 11:45:00 crc kubenswrapper[4901]: I0202 11:45:00.476656 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/87c1f865-3761-4230-a98e-450dbfd5446d-secret-volume\") pod \"collect-profiles-29500545-cl86s\" (UID: \"87c1f865-3761-4230-a98e-450dbfd5446d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-cl86s" Feb 02 11:45:00 crc kubenswrapper[4901]: I0202 11:45:00.478637 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87c1f865-3761-4230-a98e-450dbfd5446d-config-volume\") pod \"collect-profiles-29500545-cl86s\" (UID: \"87c1f865-3761-4230-a98e-450dbfd5446d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-cl86s" Feb 02 11:45:00 crc kubenswrapper[4901]: I0202 11:45:00.488630 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/87c1f865-3761-4230-a98e-450dbfd5446d-secret-volume\") pod \"collect-profiles-29500545-cl86s\" (UID: \"87c1f865-3761-4230-a98e-450dbfd5446d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-cl86s" Feb 02 11:45:00 crc kubenswrapper[4901]: I0202 11:45:00.508009 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t58mn\" (UniqueName: \"kubernetes.io/projected/87c1f865-3761-4230-a98e-450dbfd5446d-kube-api-access-t58mn\") pod \"collect-profiles-29500545-cl86s\" (UID: \"87c1f865-3761-4230-a98e-450dbfd5446d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-cl86s" Feb 02 11:45:00 crc kubenswrapper[4901]: I0202 11:45:00.549032 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-cl86s" Feb 02 11:45:01 crc kubenswrapper[4901]: I0202 11:45:01.064956 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500545-cl86s"] Feb 02 11:45:01 crc kubenswrapper[4901]: I0202 11:45:01.677468 4901 scope.go:117] "RemoveContainer" containerID="5b8ce7c16052cec2485b442e2fe3c9307bb04efecfe3e58c9fae3418e6341435" Feb 02 11:45:01 crc kubenswrapper[4901]: E0202 11:45:01.678291 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:45:01 crc kubenswrapper[4901]: I0202 11:45:01.997430 4901 generic.go:334] "Generic (PLEG): container finished" podID="87c1f865-3761-4230-a98e-450dbfd5446d" containerID="572ce06f423813ff1b06ae9e7bd77e21cff11cee8230499a236ef1fd3b0caa08" exitCode=0 Feb 02 11:45:01 crc kubenswrapper[4901]: I0202 11:45:01.997511 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-cl86s" event={"ID":"87c1f865-3761-4230-a98e-450dbfd5446d","Type":"ContainerDied","Data":"572ce06f423813ff1b06ae9e7bd77e21cff11cee8230499a236ef1fd3b0caa08"} Feb 02 11:45:01 crc kubenswrapper[4901]: I0202 11:45:01.997580 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-cl86s" event={"ID":"87c1f865-3761-4230-a98e-450dbfd5446d","Type":"ContainerStarted","Data":"34c15fb3203d9345d5d1a7dd651b3e0469853fee8b2ce13689dbfd00a647f354"} Feb 02 11:45:03 crc kubenswrapper[4901]: I0202 11:45:03.372713 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-cl86s" Feb 02 11:45:03 crc kubenswrapper[4901]: I0202 11:45:03.459416 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87c1f865-3761-4230-a98e-450dbfd5446d-config-volume\") pod \"87c1f865-3761-4230-a98e-450dbfd5446d\" (UID: \"87c1f865-3761-4230-a98e-450dbfd5446d\") " Feb 02 11:45:03 crc kubenswrapper[4901]: I0202 11:45:03.459905 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/87c1f865-3761-4230-a98e-450dbfd5446d-secret-volume\") pod \"87c1f865-3761-4230-a98e-450dbfd5446d\" (UID: \"87c1f865-3761-4230-a98e-450dbfd5446d\") " Feb 02 11:45:03 crc kubenswrapper[4901]: I0202 11:45:03.459999 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t58mn\" (UniqueName: \"kubernetes.io/projected/87c1f865-3761-4230-a98e-450dbfd5446d-kube-api-access-t58mn\") pod \"87c1f865-3761-4230-a98e-450dbfd5446d\" (UID: \"87c1f865-3761-4230-a98e-450dbfd5446d\") " Feb 02 11:45:03 crc kubenswrapper[4901]: I0202 11:45:03.460667 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87c1f865-3761-4230-a98e-450dbfd5446d-config-volume" (OuterVolumeSpecName: "config-volume") pod "87c1f865-3761-4230-a98e-450dbfd5446d" (UID: "87c1f865-3761-4230-a98e-450dbfd5446d"). 
InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:45:03 crc kubenswrapper[4901]: I0202 11:45:03.470207 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87c1f865-3761-4230-a98e-450dbfd5446d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "87c1f865-3761-4230-a98e-450dbfd5446d" (UID: "87c1f865-3761-4230-a98e-450dbfd5446d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:45:03 crc kubenswrapper[4901]: I0202 11:45:03.471546 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87c1f865-3761-4230-a98e-450dbfd5446d-kube-api-access-t58mn" (OuterVolumeSpecName: "kube-api-access-t58mn") pod "87c1f865-3761-4230-a98e-450dbfd5446d" (UID: "87c1f865-3761-4230-a98e-450dbfd5446d"). InnerVolumeSpecName "kube-api-access-t58mn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:45:03 crc kubenswrapper[4901]: I0202 11:45:03.562768 4901 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87c1f865-3761-4230-a98e-450dbfd5446d-config-volume\") on node \"crc\" DevicePath \"\"" Feb 02 11:45:03 crc kubenswrapper[4901]: I0202 11:45:03.563138 4901 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/87c1f865-3761-4230-a98e-450dbfd5446d-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 02 11:45:03 crc kubenswrapper[4901]: I0202 11:45:03.563201 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t58mn\" (UniqueName: \"kubernetes.io/projected/87c1f865-3761-4230-a98e-450dbfd5446d-kube-api-access-t58mn\") on node \"crc\" DevicePath \"\"" Feb 02 11:45:04 crc kubenswrapper[4901]: I0202 11:45:04.024697 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-cl86s" Feb 02 11:45:04 crc kubenswrapper[4901]: I0202 11:45:04.024554 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-cl86s" event={"ID":"87c1f865-3761-4230-a98e-450dbfd5446d","Type":"ContainerDied","Data":"34c15fb3203d9345d5d1a7dd651b3e0469853fee8b2ce13689dbfd00a647f354"} Feb 02 11:45:04 crc kubenswrapper[4901]: I0202 11:45:04.025339 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34c15fb3203d9345d5d1a7dd651b3e0469853fee8b2ce13689dbfd00a647f354" Feb 02 11:45:04 crc kubenswrapper[4901]: I0202 11:45:04.470384 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500500-7xwql"] Feb 02 11:45:04 crc kubenswrapper[4901]: I0202 11:45:04.481689 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500500-7xwql"] Feb 02 11:45:05 crc kubenswrapper[4901]: I0202 11:45:05.695122 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3db2272-ebdc-4bd3-8652-d0d46be00772" path="/var/lib/kubelet/pods/e3db2272-ebdc-4bd3-8652-d0d46be00772/volumes" Feb 02 11:45:14 crc kubenswrapper[4901]: I0202 11:45:14.678758 4901 scope.go:117] "RemoveContainer" containerID="5b8ce7c16052cec2485b442e2fe3c9307bb04efecfe3e58c9fae3418e6341435" Feb 02 11:45:14 crc kubenswrapper[4901]: E0202 11:45:14.679967 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:45:26 crc kubenswrapper[4901]: I0202 11:45:26.676974 4901 scope.go:117] "RemoveContainer" containerID="5b8ce7c16052cec2485b442e2fe3c9307bb04efecfe3e58c9fae3418e6341435" Feb 02 11:45:26 crc kubenswrapper[4901]: E0202 11:45:26.678004 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:45:38 crc kubenswrapper[4901]: I0202 11:45:38.677078 4901 scope.go:117] "RemoveContainer" containerID="5b8ce7c16052cec2485b442e2fe3c9307bb04efecfe3e58c9fae3418e6341435" Feb 02 11:45:38 crc kubenswrapper[4901]: E0202 11:45:38.678058 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:45:50 crc kubenswrapper[4901]: I0202 11:45:50.806216 4901 scope.go:117] "RemoveContainer" containerID="ac6053f444c70e8772050002e1dc959128fa29aa137594e0ae65c27dbdbc577a" Feb 02 11:45:51 crc kubenswrapper[4901]: I0202 
11:45:51.677617 4901 scope.go:117] "RemoveContainer" containerID="5b8ce7c16052cec2485b442e2fe3c9307bb04efecfe3e58c9fae3418e6341435" Feb 02 11:45:51 crc kubenswrapper[4901]: E0202 11:45:51.678186 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:46:02 crc kubenswrapper[4901]: I0202 11:46:02.676932 4901 scope.go:117] "RemoveContainer" containerID="5b8ce7c16052cec2485b442e2fe3c9307bb04efecfe3e58c9fae3418e6341435" Feb 02 11:46:02 crc kubenswrapper[4901]: E0202 11:46:02.678158 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:46:15 crc kubenswrapper[4901]: I0202 11:46:15.677894 4901 scope.go:117] "RemoveContainer" containerID="5b8ce7c16052cec2485b442e2fe3c9307bb04efecfe3e58c9fae3418e6341435" Feb 02 11:46:15 crc kubenswrapper[4901]: E0202 11:46:15.679080 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:46:28 crc kubenswrapper[4901]: I0202 11:46:28.677193 4901 scope.go:117] "RemoveContainer" containerID="5b8ce7c16052cec2485b442e2fe3c9307bb04efecfe3e58c9fae3418e6341435" Feb 02 11:46:28 crc kubenswrapper[4901]: E0202 11:46:28.678458 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:46:32 crc kubenswrapper[4901]: I0202 11:46:32.586371 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-84dbcd4d6-strlk_9c2b3b8f-1088-4696-899e-d0d5c3f2bf2d/manager/0.log" Feb 02 11:46:40 crc kubenswrapper[4901]: I0202 11:46:40.677510 4901 scope.go:117] "RemoveContainer" containerID="5b8ce7c16052cec2485b442e2fe3c9307bb04efecfe3e58c9fae3418e6341435" Feb 02 11:46:40 crc kubenswrapper[4901]: E0202 11:46:40.678632 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" 
podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:46:52 crc kubenswrapper[4901]: I0202 11:46:52.677669 4901 scope.go:117] "RemoveContainer" containerID="5b8ce7c16052cec2485b442e2fe3c9307bb04efecfe3e58c9fae3418e6341435" Feb 02 11:46:52 crc kubenswrapper[4901]: E0202 11:46:52.678739 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:47:05 crc kubenswrapper[4901]: I0202 11:47:05.678751 4901 scope.go:117] "RemoveContainer" containerID="5b8ce7c16052cec2485b442e2fe3c9307bb04efecfe3e58c9fae3418e6341435" Feb 02 11:47:05 crc kubenswrapper[4901]: E0202 11:47:05.679843 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:47:06 crc kubenswrapper[4901]: I0202 11:47:06.698614 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-t9dw4/must-gather-r4654"] Feb 02 11:47:06 crc kubenswrapper[4901]: E0202 11:47:06.699666 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87c1f865-3761-4230-a98e-450dbfd5446d" containerName="collect-profiles" Feb 02 11:47:06 crc kubenswrapper[4901]: I0202 11:47:06.699689 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="87c1f865-3761-4230-a98e-450dbfd5446d" containerName="collect-profiles" Feb 02 11:47:06 crc kubenswrapper[4901]: I0202 11:47:06.699896 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="87c1f865-3761-4230-a98e-450dbfd5446d" containerName="collect-profiles" Feb 02 11:47:06 crc kubenswrapper[4901]: I0202 11:47:06.701310 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t9dw4/must-gather-r4654" Feb 02 11:47:06 crc kubenswrapper[4901]: I0202 11:47:06.703842 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-t9dw4"/"openshift-service-ca.crt" Feb 02 11:47:06 crc kubenswrapper[4901]: I0202 11:47:06.704108 4901 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-t9dw4"/"kube-root-ca.crt" Feb 02 11:47:06 crc kubenswrapper[4901]: I0202 11:47:06.740385 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-t9dw4/must-gather-r4654"] Feb 02 11:47:06 crc kubenswrapper[4901]: I0202 11:47:06.866337 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf5v8\" (UniqueName: \"kubernetes.io/projected/2e79fb4d-bc49-4cff-a362-ef307ef70412-kube-api-access-cf5v8\") pod \"must-gather-r4654\" (UID: \"2e79fb4d-bc49-4cff-a362-ef307ef70412\") " pod="openshift-must-gather-t9dw4/must-gather-r4654" Feb 02 11:47:06 crc kubenswrapper[4901]: I0202 11:47:06.866410 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2e79fb4d-bc49-4cff-a362-ef307ef70412-must-gather-output\") pod \"must-gather-r4654\" (UID: \"2e79fb4d-bc49-4cff-a362-ef307ef70412\") " pod="openshift-must-gather-t9dw4/must-gather-r4654" Feb 02 11:47:06 crc kubenswrapper[4901]: I0202 11:47:06.968906 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cf5v8\" (UniqueName: \"kubernetes.io/projected/2e79fb4d-bc49-4cff-a362-ef307ef70412-kube-api-access-cf5v8\") pod \"must-gather-r4654\" (UID: \"2e79fb4d-bc49-4cff-a362-ef307ef70412\") " pod="openshift-must-gather-t9dw4/must-gather-r4654" Feb 02 11:47:06 crc kubenswrapper[4901]: I0202 11:47:06.968980 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2e79fb4d-bc49-4cff-a362-ef307ef70412-must-gather-output\") pod \"must-gather-r4654\" (UID: \"2e79fb4d-bc49-4cff-a362-ef307ef70412\") " pod="openshift-must-gather-t9dw4/must-gather-r4654" Feb 02 11:47:06 crc kubenswrapper[4901]: I0202 11:47:06.969584 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2e79fb4d-bc49-4cff-a362-ef307ef70412-must-gather-output\") pod \"must-gather-r4654\" (UID: \"2e79fb4d-bc49-4cff-a362-ef307ef70412\") " pod="openshift-must-gather-t9dw4/must-gather-r4654" Feb 02 11:47:06 crc kubenswrapper[4901]: I0202 11:47:06.991953 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf5v8\" (UniqueName: \"kubernetes.io/projected/2e79fb4d-bc49-4cff-a362-ef307ef70412-kube-api-access-cf5v8\") pod \"must-gather-r4654\" (UID: \"2e79fb4d-bc49-4cff-a362-ef307ef70412\") " pod="openshift-must-gather-t9dw4/must-gather-r4654" Feb 02 11:47:07 crc kubenswrapper[4901]: I0202 11:47:07.035534 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t9dw4/must-gather-r4654" Feb 02 11:47:07 crc kubenswrapper[4901]: I0202 11:47:07.661204 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-t9dw4/must-gather-r4654"] Feb 02 11:47:07 crc kubenswrapper[4901]: I0202 11:47:07.664182 4901 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 11:47:08 crc kubenswrapper[4901]: I0202 11:47:08.254637 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t9dw4/must-gather-r4654" event={"ID":"2e79fb4d-bc49-4cff-a362-ef307ef70412","Type":"ContainerStarted","Data":"faff059a53beb4aa648e123f5a53f1f38a2cd6caea49f141d597c10eb10dcd35"} Feb 02 11:47:14 crc kubenswrapper[4901]: I0202 11:47:14.377605 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t9dw4/must-gather-r4654" event={"ID":"2e79fb4d-bc49-4cff-a362-ef307ef70412","Type":"ContainerStarted","Data":"39e747d5c9bfc3db9032a65e1c0d7d365078b45e4b60c820158e1280509fcfb9"} Feb 02 11:47:14 crc kubenswrapper[4901]: I0202 11:47:14.378258 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t9dw4/must-gather-r4654" event={"ID":"2e79fb4d-bc49-4cff-a362-ef307ef70412","Type":"ContainerStarted","Data":"11a7caca7022eaa7cd9389178815a6e4321dd1e73e4a64c9ba15211741a05c37"} Feb 02 11:47:14 crc kubenswrapper[4901]: I0202 11:47:14.406984 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-t9dw4/must-gather-r4654" podStartSLOduration=2.905310091 podStartE2EDuration="8.406961275s" podCreationTimestamp="2026-02-02 11:47:06 +0000 UTC" firstStartedPulling="2026-02-02 11:47:07.664094486 +0000 UTC m=+4114.682434582" lastFinishedPulling="2026-02-02 11:47:13.16574568 +0000 UTC m=+4120.184085766" observedRunningTime="2026-02-02 11:47:14.401293927 +0000 UTC m=+4121.419634033" watchObservedRunningTime="2026-02-02 11:47:14.406961275 +0000 UTC m=+4121.425301371" Feb 02 11:47:16 crc kubenswrapper[4901]: I0202 11:47:16.678037 4901 scope.go:117] "RemoveContainer" containerID="5b8ce7c16052cec2485b442e2fe3c9307bb04efecfe3e58c9fae3418e6341435" Feb 02 11:47:16 crc kubenswrapper[4901]: E0202 11:47:16.679072 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:47:19 crc kubenswrapper[4901]: I0202 11:47:19.171290 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-t9dw4/crc-debug-745k5"] Feb 02 11:47:19 crc kubenswrapper[4901]: I0202 11:47:19.173443 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t9dw4/crc-debug-745k5" Feb 02 11:47:19 crc kubenswrapper[4901]: I0202 11:47:19.178324 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-t9dw4"/"default-dockercfg-q8j2l" Feb 02 11:47:19 crc kubenswrapper[4901]: I0202 11:47:19.342984 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d0b53df4-adcd-40d3-b852-2003230a3e28-host\") pod \"crc-debug-745k5\" (UID: \"d0b53df4-adcd-40d3-b852-2003230a3e28\") " pod="openshift-must-gather-t9dw4/crc-debug-745k5" Feb 02 11:47:19 crc kubenswrapper[4901]: I0202 11:47:19.343540 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cxcj\" (UniqueName: \"kubernetes.io/projected/d0b53df4-adcd-40d3-b852-2003230a3e28-kube-api-access-8cxcj\") pod \"crc-debug-745k5\" (UID: \"d0b53df4-adcd-40d3-b852-2003230a3e28\") " pod="openshift-must-gather-t9dw4/crc-debug-745k5" Feb 02 11:47:19 crc kubenswrapper[4901]: I0202 11:47:19.446145 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d0b53df4-adcd-40d3-b852-2003230a3e28-host\") pod \"crc-debug-745k5\" (UID: \"d0b53df4-adcd-40d3-b852-2003230a3e28\") " pod="openshift-must-gather-t9dw4/crc-debug-745k5" Feb 02 11:47:19 crc kubenswrapper[4901]: I0202 11:47:19.446330 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cxcj\" (UniqueName: \"kubernetes.io/projected/d0b53df4-adcd-40d3-b852-2003230a3e28-kube-api-access-8cxcj\") pod \"crc-debug-745k5\" (UID: \"d0b53df4-adcd-40d3-b852-2003230a3e28\") " pod="openshift-must-gather-t9dw4/crc-debug-745k5" Feb 02 11:47:19 crc kubenswrapper[4901]: I0202 11:47:19.446411 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d0b53df4-adcd-40d3-b852-2003230a3e28-host\") pod \"crc-debug-745k5\" (UID: \"d0b53df4-adcd-40d3-b852-2003230a3e28\") " pod="openshift-must-gather-t9dw4/crc-debug-745k5" Feb 02 11:47:19 crc kubenswrapper[4901]: I0202 11:47:19.478258 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cxcj\" (UniqueName: \"kubernetes.io/projected/d0b53df4-adcd-40d3-b852-2003230a3e28-kube-api-access-8cxcj\") pod \"crc-debug-745k5\" (UID: \"d0b53df4-adcd-40d3-b852-2003230a3e28\") " pod="openshift-must-gather-t9dw4/crc-debug-745k5" Feb 02 11:47:19 crc kubenswrapper[4901]: I0202 11:47:19.495123 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t9dw4/crc-debug-745k5" Feb 02 11:47:20 crc kubenswrapper[4901]: I0202 11:47:20.446145 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t9dw4/crc-debug-745k5" event={"ID":"d0b53df4-adcd-40d3-b852-2003230a3e28","Type":"ContainerStarted","Data":"c94ec554061527e91091b85881a5d821089f4486fbf0dbff323a37dc7c53ef66"} Feb 02 11:47:31 crc kubenswrapper[4901]: I0202 11:47:31.677459 4901 scope.go:117] "RemoveContainer" containerID="5b8ce7c16052cec2485b442e2fe3c9307bb04efecfe3e58c9fae3418e6341435" Feb 02 11:47:31 crc kubenswrapper[4901]: E0202 11:47:31.678417 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" Feb 02 11:47:32 crc kubenswrapper[4901]: I0202 11:47:32.609419 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t9dw4/crc-debug-745k5" event={"ID":"d0b53df4-adcd-40d3-b852-2003230a3e28","Type":"ContainerStarted","Data":"38351da7203a316bb88a9bc4bc9fd5b527c4da7056320b778191ec093675b25c"} Feb 02 11:47:32 crc kubenswrapper[4901]: I0202 11:47:32.635145 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-t9dw4/crc-debug-745k5" podStartSLOduration=1.236011514 podStartE2EDuration="13.635119391s" podCreationTimestamp="2026-02-02 11:47:19 +0000 UTC" firstStartedPulling="2026-02-02 11:47:19.537300414 +0000 UTC m=+4126.555640510" lastFinishedPulling="2026-02-02 11:47:31.936408291 +0000 UTC m=+4138.954748387" observedRunningTime="2026-02-02 11:47:32.627278691 +0000 UTC m=+4139.645618807" watchObservedRunningTime="2026-02-02 11:47:32.635119391 +0000 UTC m=+4139.653459487" Feb 02 11:47:43 crc kubenswrapper[4901]: I0202 11:47:43.684750 4901 scope.go:117] "RemoveContainer" containerID="5b8ce7c16052cec2485b442e2fe3c9307bb04efecfe3e58c9fae3418e6341435" Feb 02 11:47:45 crc kubenswrapper[4901]: I0202 11:47:45.828203 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" event={"ID":"756c113d-5d5e-424e-bdf5-494b7774def6","Type":"ContainerStarted","Data":"ed700bff0964ab4e3cffb061f658a435387232649d99d0524069bea25950bd76"} Feb 02 11:47:46 crc kubenswrapper[4901]: I0202 11:47:46.685515 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jblv6"] Feb 02 11:47:46 crc kubenswrapper[4901]: I0202 11:47:46.688937 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jblv6" Feb 02 11:47:46 crc kubenswrapper[4901]: I0202 11:47:46.727126 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jblv6"] Feb 02 11:47:46 crc kubenswrapper[4901]: I0202 11:47:46.776132 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2721aee1-0262-4464-8387-8744fca3b449-utilities\") pod \"certified-operators-jblv6\" (UID: \"2721aee1-0262-4464-8387-8744fca3b449\") " pod="openshift-marketplace/certified-operators-jblv6" Feb 02 11:47:46 crc kubenswrapper[4901]: I0202 11:47:46.776340 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2721aee1-0262-4464-8387-8744fca3b449-catalog-content\") pod \"certified-operators-jblv6\" (UID: \"2721aee1-0262-4464-8387-8744fca3b449\") " pod="openshift-marketplace/certified-operators-jblv6" Feb 02 11:47:46 crc kubenswrapper[4901]: I0202 11:47:46.776407 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97s7g\" (UniqueName: \"kubernetes.io/projected/2721aee1-0262-4464-8387-8744fca3b449-kube-api-access-97s7g\") pod \"certified-operators-jblv6\" (UID: \"2721aee1-0262-4464-8387-8744fca3b449\") " pod="openshift-marketplace/certified-operators-jblv6" Feb 02 11:47:46 crc kubenswrapper[4901]: I0202 11:47:46.878778 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97s7g\" (UniqueName: \"kubernetes.io/projected/2721aee1-0262-4464-8387-8744fca3b449-kube-api-access-97s7g\") pod \"certified-operators-jblv6\" (UID: \"2721aee1-0262-4464-8387-8744fca3b449\") " pod="openshift-marketplace/certified-operators-jblv6" Feb 02 11:47:46 crc kubenswrapper[4901]: I0202 11:47:46.878971 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2721aee1-0262-4464-8387-8744fca3b449-utilities\") pod \"certified-operators-jblv6\" (UID: \"2721aee1-0262-4464-8387-8744fca3b449\") " pod="openshift-marketplace/certified-operators-jblv6" Feb 02 11:47:46 crc kubenswrapper[4901]: I0202 11:47:46.879049 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2721aee1-0262-4464-8387-8744fca3b449-catalog-content\") pod \"certified-operators-jblv6\" (UID: \"2721aee1-0262-4464-8387-8744fca3b449\") " pod="openshift-marketplace/certified-operators-jblv6" Feb 02 11:47:46 crc kubenswrapper[4901]: I0202 11:47:46.879738 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2721aee1-0262-4464-8387-8744fca3b449-catalog-content\") pod \"certified-operators-jblv6\" (UID: \"2721aee1-0262-4464-8387-8744fca3b449\") " pod="openshift-marketplace/certified-operators-jblv6" Feb 02 11:47:46 crc kubenswrapper[4901]: I0202 11:47:46.879811 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2721aee1-0262-4464-8387-8744fca3b449-utilities\") pod \"certified-operators-jblv6\" (UID: \"2721aee1-0262-4464-8387-8744fca3b449\") " pod="openshift-marketplace/certified-operators-jblv6" Feb 02 11:47:46 crc kubenswrapper[4901]: I0202 11:47:46.911579 4901 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-97s7g\" (UniqueName: \"kubernetes.io/projected/2721aee1-0262-4464-8387-8744fca3b449-kube-api-access-97s7g\") pod \"certified-operators-jblv6\" (UID: \"2721aee1-0262-4464-8387-8744fca3b449\") " pod="openshift-marketplace/certified-operators-jblv6" Feb 02 11:47:47 crc kubenswrapper[4901]: I0202 11:47:47.013953 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jblv6" Feb 02 11:47:47 crc kubenswrapper[4901]: I0202 11:47:47.655128 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jblv6"] Feb 02 11:47:47 crc kubenswrapper[4901]: W0202 11:47:47.688907 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2721aee1_0262_4464_8387_8744fca3b449.slice/crio-955b3874d6cde5fc879b2d7e84dbbbde6e397adf8d62c917c7d10b2a5e20ca91 WatchSource:0}: Error finding container 955b3874d6cde5fc879b2d7e84dbbbde6e397adf8d62c917c7d10b2a5e20ca91: Status 404 returned error can't find the container with id 955b3874d6cde5fc879b2d7e84dbbbde6e397adf8d62c917c7d10b2a5e20ca91 Feb 02 11:47:47 crc kubenswrapper[4901]: I0202 11:47:47.872811 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jblv6" event={"ID":"2721aee1-0262-4464-8387-8744fca3b449","Type":"ContainerStarted","Data":"955b3874d6cde5fc879b2d7e84dbbbde6e397adf8d62c917c7d10b2a5e20ca91"} Feb 02 11:47:48 crc kubenswrapper[4901]: I0202 11:47:48.885630 4901 generic.go:334] "Generic (PLEG): container finished" podID="2721aee1-0262-4464-8387-8744fca3b449" containerID="768dcc6983aa10c5c92a1ad322a16bf572b57f9db8f72e00789db07c8a99804a" exitCode=0 Feb 02 11:47:48 crc kubenswrapper[4901]: I0202 11:47:48.885749 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jblv6" event={"ID":"2721aee1-0262-4464-8387-8744fca3b449","Type":"ContainerDied","Data":"768dcc6983aa10c5c92a1ad322a16bf572b57f9db8f72e00789db07c8a99804a"} Feb 02 11:47:52 crc kubenswrapper[4901]: I0202 11:47:52.932622 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jblv6" event={"ID":"2721aee1-0262-4464-8387-8744fca3b449","Type":"ContainerStarted","Data":"72f769e5eb9ac293933ed417184947151970efbd69b27136cd971ea88da7cc23"} Feb 02 11:47:53 crc kubenswrapper[4901]: I0202 11:47:53.945330 4901 generic.go:334] "Generic (PLEG): container finished" podID="2721aee1-0262-4464-8387-8744fca3b449" containerID="72f769e5eb9ac293933ed417184947151970efbd69b27136cd971ea88da7cc23" exitCode=0 Feb 02 11:47:53 crc kubenswrapper[4901]: I0202 11:47:53.945523 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jblv6" event={"ID":"2721aee1-0262-4464-8387-8744fca3b449","Type":"ContainerDied","Data":"72f769e5eb9ac293933ed417184947151970efbd69b27136cd971ea88da7cc23"} Feb 02 11:47:54 crc kubenswrapper[4901]: I0202 11:47:54.961244 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jblv6" event={"ID":"2721aee1-0262-4464-8387-8744fca3b449","Type":"ContainerStarted","Data":"9be144c4cf41de7de782ef1aae2149c7060b409efe3804f6ade7d658069adf86"} Feb 02 11:47:54 crc kubenswrapper[4901]: I0202 11:47:54.964130 4901 generic.go:334] "Generic (PLEG): container finished" podID="d0b53df4-adcd-40d3-b852-2003230a3e28" 
containerID="38351da7203a316bb88a9bc4bc9fd5b527c4da7056320b778191ec093675b25c" exitCode=0 Feb 02 11:47:54 crc kubenswrapper[4901]: I0202 11:47:54.964202 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t9dw4/crc-debug-745k5" event={"ID":"d0b53df4-adcd-40d3-b852-2003230a3e28","Type":"ContainerDied","Data":"38351da7203a316bb88a9bc4bc9fd5b527c4da7056320b778191ec093675b25c"} Feb 02 11:47:54 crc kubenswrapper[4901]: I0202 11:47:54.994372 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jblv6" podStartSLOduration=3.320246063 podStartE2EDuration="8.994343218s" podCreationTimestamp="2026-02-02 11:47:46 +0000 UTC" firstStartedPulling="2026-02-02 11:47:48.895856845 +0000 UTC m=+4155.914196941" lastFinishedPulling="2026-02-02 11:47:54.569954 +0000 UTC m=+4161.588294096" observedRunningTime="2026-02-02 11:47:54.993280202 +0000 UTC m=+4162.011620338" watchObservedRunningTime="2026-02-02 11:47:54.994343218 +0000 UTC m=+4162.012683314" Feb 02 11:47:56 crc kubenswrapper[4901]: I0202 11:47:56.166656 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t9dw4/crc-debug-745k5" Feb 02 11:47:56 crc kubenswrapper[4901]: I0202 11:47:56.222409 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-t9dw4/crc-debug-745k5"] Feb 02 11:47:56 crc kubenswrapper[4901]: I0202 11:47:56.242306 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-t9dw4/crc-debug-745k5"] Feb 02 11:47:56 crc kubenswrapper[4901]: I0202 11:47:56.250178 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d0b53df4-adcd-40d3-b852-2003230a3e28-host\") pod \"d0b53df4-adcd-40d3-b852-2003230a3e28\" (UID: \"d0b53df4-adcd-40d3-b852-2003230a3e28\") " Feb 02 11:47:56 crc kubenswrapper[4901]: I0202 11:47:56.250264 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d0b53df4-adcd-40d3-b852-2003230a3e28-host" (OuterVolumeSpecName: "host") pod "d0b53df4-adcd-40d3-b852-2003230a3e28" (UID: "d0b53df4-adcd-40d3-b852-2003230a3e28"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 11:47:56 crc kubenswrapper[4901]: I0202 11:47:56.250836 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cxcj\" (UniqueName: \"kubernetes.io/projected/d0b53df4-adcd-40d3-b852-2003230a3e28-kube-api-access-8cxcj\") pod \"d0b53df4-adcd-40d3-b852-2003230a3e28\" (UID: \"d0b53df4-adcd-40d3-b852-2003230a3e28\") " Feb 02 11:47:56 crc kubenswrapper[4901]: I0202 11:47:56.251669 4901 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d0b53df4-adcd-40d3-b852-2003230a3e28-host\") on node \"crc\" DevicePath \"\"" Feb 02 11:47:56 crc kubenswrapper[4901]: I0202 11:47:56.278229 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0b53df4-adcd-40d3-b852-2003230a3e28-kube-api-access-8cxcj" (OuterVolumeSpecName: "kube-api-access-8cxcj") pod "d0b53df4-adcd-40d3-b852-2003230a3e28" (UID: "d0b53df4-adcd-40d3-b852-2003230a3e28"). InnerVolumeSpecName "kube-api-access-8cxcj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:47:56 crc kubenswrapper[4901]: I0202 11:47:56.354266 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cxcj\" (UniqueName: \"kubernetes.io/projected/d0b53df4-adcd-40d3-b852-2003230a3e28-kube-api-access-8cxcj\") on node \"crc\" DevicePath \"\"" Feb 02 11:47:56 crc kubenswrapper[4901]: I0202 11:47:56.987905 4901 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c94ec554061527e91091b85881a5d821089f4486fbf0dbff323a37dc7c53ef66" Feb 02 11:47:56 crc kubenswrapper[4901]: I0202 11:47:56.988064 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t9dw4/crc-debug-745k5" Feb 02 11:47:57 crc kubenswrapper[4901]: I0202 11:47:57.014699 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jblv6" Feb 02 11:47:57 crc kubenswrapper[4901]: I0202 11:47:57.014839 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jblv6" Feb 02 11:47:57 crc kubenswrapper[4901]: I0202 11:47:57.071311 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jblv6" Feb 02 11:47:57 crc kubenswrapper[4901]: I0202 11:47:57.594154 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-t9dw4/crc-debug-qx5th"] Feb 02 11:47:57 crc kubenswrapper[4901]: E0202 11:47:57.594796 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0b53df4-adcd-40d3-b852-2003230a3e28" containerName="container-00" Feb 02 11:47:57 crc kubenswrapper[4901]: I0202 11:47:57.594812 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0b53df4-adcd-40d3-b852-2003230a3e28" containerName="container-00" Feb 02 11:47:57 crc kubenswrapper[4901]: I0202 11:47:57.595060 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0b53df4-adcd-40d3-b852-2003230a3e28" containerName="container-00" Feb 02 11:47:57 crc kubenswrapper[4901]: I0202 11:47:57.595948 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t9dw4/crc-debug-qx5th" Feb 02 11:47:57 crc kubenswrapper[4901]: I0202 11:47:57.599888 4901 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-t9dw4"/"default-dockercfg-q8j2l" Feb 02 11:47:57 crc kubenswrapper[4901]: I0202 11:47:57.691471 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0b53df4-adcd-40d3-b852-2003230a3e28" path="/var/lib/kubelet/pods/d0b53df4-adcd-40d3-b852-2003230a3e28/volumes" Feb 02 11:47:57 crc kubenswrapper[4901]: I0202 11:47:57.691711 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/79a97f32-3041-47b9-b369-2dd0c77b7fba-host\") pod \"crc-debug-qx5th\" (UID: \"79a97f32-3041-47b9-b369-2dd0c77b7fba\") " pod="openshift-must-gather-t9dw4/crc-debug-qx5th" Feb 02 11:47:57 crc kubenswrapper[4901]: I0202 11:47:57.691867 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h9jz\" (UniqueName: \"kubernetes.io/projected/79a97f32-3041-47b9-b369-2dd0c77b7fba-kube-api-access-6h9jz\") pod \"crc-debug-qx5th\" (UID: \"79a97f32-3041-47b9-b369-2dd0c77b7fba\") " pod="openshift-must-gather-t9dw4/crc-debug-qx5th" Feb 02 11:47:57 crc kubenswrapper[4901]: I0202 11:47:57.794214 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/79a97f32-3041-47b9-b369-2dd0c77b7fba-host\") pod \"crc-debug-qx5th\" (UID: \"79a97f32-3041-47b9-b369-2dd0c77b7fba\") " pod="openshift-must-gather-t9dw4/crc-debug-qx5th" Feb 02 11:47:57 crc kubenswrapper[4901]: I0202 11:47:57.794379 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6h9jz\" (UniqueName: \"kubernetes.io/projected/79a97f32-3041-47b9-b369-2dd0c77b7fba-kube-api-access-6h9jz\") pod \"crc-debug-qx5th\" (UID: \"79a97f32-3041-47b9-b369-2dd0c77b7fba\") " pod="openshift-must-gather-t9dw4/crc-debug-qx5th" Feb 02 11:47:57 crc kubenswrapper[4901]: I0202 11:47:57.794806 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/79a97f32-3041-47b9-b369-2dd0c77b7fba-host\") pod \"crc-debug-qx5th\" (UID: \"79a97f32-3041-47b9-b369-2dd0c77b7fba\") " pod="openshift-must-gather-t9dw4/crc-debug-qx5th" Feb 02 11:47:57 crc kubenswrapper[4901]: I0202 11:47:57.825395 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h9jz\" (UniqueName: \"kubernetes.io/projected/79a97f32-3041-47b9-b369-2dd0c77b7fba-kube-api-access-6h9jz\") pod \"crc-debug-qx5th\" (UID: \"79a97f32-3041-47b9-b369-2dd0c77b7fba\") " pod="openshift-must-gather-t9dw4/crc-debug-qx5th" Feb 02 11:47:57 crc kubenswrapper[4901]: I0202 11:47:57.918745 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t9dw4/crc-debug-qx5th" Feb 02 11:47:57 crc kubenswrapper[4901]: W0202 11:47:57.968204 4901 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79a97f32_3041_47b9_b369_2dd0c77b7fba.slice/crio-022e97d8fbe9657464bcfdbd0f116b1d8b2afa91f329debf7982893f0bf36087 WatchSource:0}: Error finding container 022e97d8fbe9657464bcfdbd0f116b1d8b2afa91f329debf7982893f0bf36087: Status 404 returned error can't find the container with id 022e97d8fbe9657464bcfdbd0f116b1d8b2afa91f329debf7982893f0bf36087 Feb 02 11:47:58 crc kubenswrapper[4901]: I0202 11:47:58.020030 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t9dw4/crc-debug-qx5th" event={"ID":"79a97f32-3041-47b9-b369-2dd0c77b7fba","Type":"ContainerStarted","Data":"022e97d8fbe9657464bcfdbd0f116b1d8b2afa91f329debf7982893f0bf36087"} Feb 02 11:47:59 crc kubenswrapper[4901]: I0202 11:47:59.033545 4901 generic.go:334] "Generic (PLEG): container finished" podID="79a97f32-3041-47b9-b369-2dd0c77b7fba" containerID="cfe4cea521774d8b5cf1aa0a94ca744c2cb9946f59a2fe3f36ec4ea52680c186" exitCode=1 Feb 02 11:47:59 crc kubenswrapper[4901]: I0202 11:47:59.033740 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t9dw4/crc-debug-qx5th" event={"ID":"79a97f32-3041-47b9-b369-2dd0c77b7fba","Type":"ContainerDied","Data":"cfe4cea521774d8b5cf1aa0a94ca744c2cb9946f59a2fe3f36ec4ea52680c186"} Feb 02 11:47:59 crc kubenswrapper[4901]: I0202 11:47:59.116754 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-t9dw4/crc-debug-qx5th"] Feb 02 11:47:59 crc kubenswrapper[4901]: I0202 11:47:59.132319 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-t9dw4/crc-debug-qx5th"] Feb 02 11:47:59 crc kubenswrapper[4901]: I0202 11:47:59.135039 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jblv6" Feb 02 11:47:59 crc kubenswrapper[4901]: I0202 11:47:59.236465 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jblv6"] Feb 02 11:48:00 crc kubenswrapper[4901]: I0202 11:48:00.593020 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t9dw4/crc-debug-qx5th" Feb 02 11:48:00 crc kubenswrapper[4901]: I0202 11:48:00.679210 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6h9jz\" (UniqueName: \"kubernetes.io/projected/79a97f32-3041-47b9-b369-2dd0c77b7fba-kube-api-access-6h9jz\") pod \"79a97f32-3041-47b9-b369-2dd0c77b7fba\" (UID: \"79a97f32-3041-47b9-b369-2dd0c77b7fba\") " Feb 02 11:48:00 crc kubenswrapper[4901]: I0202 11:48:00.679278 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/79a97f32-3041-47b9-b369-2dd0c77b7fba-host\") pod \"79a97f32-3041-47b9-b369-2dd0c77b7fba\" (UID: \"79a97f32-3041-47b9-b369-2dd0c77b7fba\") " Feb 02 11:48:00 crc kubenswrapper[4901]: I0202 11:48:00.679342 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79a97f32-3041-47b9-b369-2dd0c77b7fba-host" (OuterVolumeSpecName: "host") pod "79a97f32-3041-47b9-b369-2dd0c77b7fba" (UID: "79a97f32-3041-47b9-b369-2dd0c77b7fba"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 11:48:00 crc kubenswrapper[4901]: I0202 11:48:00.679875 4901 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/79a97f32-3041-47b9-b369-2dd0c77b7fba-host\") on node \"crc\" DevicePath \"\"" Feb 02 11:48:00 crc kubenswrapper[4901]: I0202 11:48:00.691008 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79a97f32-3041-47b9-b369-2dd0c77b7fba-kube-api-access-6h9jz" (OuterVolumeSpecName: "kube-api-access-6h9jz") pod "79a97f32-3041-47b9-b369-2dd0c77b7fba" (UID: "79a97f32-3041-47b9-b369-2dd0c77b7fba"). InnerVolumeSpecName "kube-api-access-6h9jz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:48:00 crc kubenswrapper[4901]: I0202 11:48:00.782391 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6h9jz\" (UniqueName: \"kubernetes.io/projected/79a97f32-3041-47b9-b369-2dd0c77b7fba-kube-api-access-6h9jz\") on node \"crc\" DevicePath \"\"" Feb 02 11:48:01 crc kubenswrapper[4901]: I0202 11:48:01.061107 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jblv6" podUID="2721aee1-0262-4464-8387-8744fca3b449" containerName="registry-server" containerID="cri-o://9be144c4cf41de7de782ef1aae2149c7060b409efe3804f6ade7d658069adf86" gracePeriod=2 Feb 02 11:48:01 crc kubenswrapper[4901]: I0202 11:48:01.061600 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t9dw4/crc-debug-qx5th" Feb 02 11:48:01 crc kubenswrapper[4901]: I0202 11:48:01.062646 4901 scope.go:117] "RemoveContainer" containerID="cfe4cea521774d8b5cf1aa0a94ca744c2cb9946f59a2fe3f36ec4ea52680c186" Feb 02 11:48:01 crc kubenswrapper[4901]: I0202 11:48:01.598742 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jblv6" Feb 02 11:48:01 crc kubenswrapper[4901]: I0202 11:48:01.696037 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79a97f32-3041-47b9-b369-2dd0c77b7fba" path="/var/lib/kubelet/pods/79a97f32-3041-47b9-b369-2dd0c77b7fba/volumes" Feb 02 11:48:01 crc kubenswrapper[4901]: I0202 11:48:01.702807 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2721aee1-0262-4464-8387-8744fca3b449-utilities\") pod \"2721aee1-0262-4464-8387-8744fca3b449\" (UID: \"2721aee1-0262-4464-8387-8744fca3b449\") " Feb 02 11:48:01 crc kubenswrapper[4901]: I0202 11:48:01.702970 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97s7g\" (UniqueName: \"kubernetes.io/projected/2721aee1-0262-4464-8387-8744fca3b449-kube-api-access-97s7g\") pod \"2721aee1-0262-4464-8387-8744fca3b449\" (UID: \"2721aee1-0262-4464-8387-8744fca3b449\") " Feb 02 11:48:01 crc kubenswrapper[4901]: I0202 11:48:01.703022 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2721aee1-0262-4464-8387-8744fca3b449-catalog-content\") pod \"2721aee1-0262-4464-8387-8744fca3b449\" (UID: \"2721aee1-0262-4464-8387-8744fca3b449\") " Feb 02 11:48:01 crc kubenswrapper[4901]: I0202 11:48:01.704734 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2721aee1-0262-4464-8387-8744fca3b449-utilities" (OuterVolumeSpecName: "utilities") pod "2721aee1-0262-4464-8387-8744fca3b449" (UID: "2721aee1-0262-4464-8387-8744fca3b449"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:48:01 crc kubenswrapper[4901]: I0202 11:48:01.721972 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2721aee1-0262-4464-8387-8744fca3b449-kube-api-access-97s7g" (OuterVolumeSpecName: "kube-api-access-97s7g") pod "2721aee1-0262-4464-8387-8744fca3b449" (UID: "2721aee1-0262-4464-8387-8744fca3b449"). InnerVolumeSpecName "kube-api-access-97s7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:48:01 crc kubenswrapper[4901]: I0202 11:48:01.764593 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2721aee1-0262-4464-8387-8744fca3b449-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2721aee1-0262-4464-8387-8744fca3b449" (UID: "2721aee1-0262-4464-8387-8744fca3b449"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:48:01 crc kubenswrapper[4901]: I0202 11:48:01.806174 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2721aee1-0262-4464-8387-8744fca3b449-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:48:01 crc kubenswrapper[4901]: I0202 11:48:01.806213 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97s7g\" (UniqueName: \"kubernetes.io/projected/2721aee1-0262-4464-8387-8744fca3b449-kube-api-access-97s7g\") on node \"crc\" DevicePath \"\"" Feb 02 11:48:01 crc kubenswrapper[4901]: I0202 11:48:01.806225 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2721aee1-0262-4464-8387-8744fca3b449-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:48:02 crc kubenswrapper[4901]: I0202 11:48:02.079295 4901 generic.go:334] "Generic (PLEG): container finished" podID="2721aee1-0262-4464-8387-8744fca3b449" containerID="9be144c4cf41de7de782ef1aae2149c7060b409efe3804f6ade7d658069adf86" exitCode=0 Feb 02 11:48:02 crc kubenswrapper[4901]: I0202 11:48:02.079350 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jblv6" event={"ID":"2721aee1-0262-4464-8387-8744fca3b449","Type":"ContainerDied","Data":"9be144c4cf41de7de782ef1aae2149c7060b409efe3804f6ade7d658069adf86"} Feb 02 11:48:02 crc kubenswrapper[4901]: I0202 11:48:02.079386 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jblv6" event={"ID":"2721aee1-0262-4464-8387-8744fca3b449","Type":"ContainerDied","Data":"955b3874d6cde5fc879b2d7e84dbbbde6e397adf8d62c917c7d10b2a5e20ca91"} Feb 02 11:48:02 crc kubenswrapper[4901]: I0202 11:48:02.079410 4901 scope.go:117] "RemoveContainer" containerID="9be144c4cf41de7de782ef1aae2149c7060b409efe3804f6ade7d658069adf86" Feb 02 11:48:02 crc kubenswrapper[4901]: I0202 11:48:02.079439 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jblv6" Feb 02 11:48:02 crc kubenswrapper[4901]: I0202 11:48:02.118681 4901 scope.go:117] "RemoveContainer" containerID="72f769e5eb9ac293933ed417184947151970efbd69b27136cd971ea88da7cc23" Feb 02 11:48:02 crc kubenswrapper[4901]: I0202 11:48:02.130069 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jblv6"] Feb 02 11:48:02 crc kubenswrapper[4901]: I0202 11:48:02.141596 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jblv6"] Feb 02 11:48:02 crc kubenswrapper[4901]: I0202 11:48:02.169829 4901 scope.go:117] "RemoveContainer" containerID="768dcc6983aa10c5c92a1ad322a16bf572b57f9db8f72e00789db07c8a99804a" Feb 02 11:48:02 crc kubenswrapper[4901]: I0202 11:48:02.222199 4901 scope.go:117] "RemoveContainer" containerID="9be144c4cf41de7de782ef1aae2149c7060b409efe3804f6ade7d658069adf86" Feb 02 11:48:02 crc kubenswrapper[4901]: E0202 11:48:02.223730 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9be144c4cf41de7de782ef1aae2149c7060b409efe3804f6ade7d658069adf86\": container with ID starting with 9be144c4cf41de7de782ef1aae2149c7060b409efe3804f6ade7d658069adf86 not found: ID does not exist" containerID="9be144c4cf41de7de782ef1aae2149c7060b409efe3804f6ade7d658069adf86" Feb 02 11:48:02 crc kubenswrapper[4901]: I0202 11:48:02.223799 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9be144c4cf41de7de782ef1aae2149c7060b409efe3804f6ade7d658069adf86"} err="failed to get container status \"9be144c4cf41de7de782ef1aae2149c7060b409efe3804f6ade7d658069adf86\": rpc error: code = NotFound desc = could not find container \"9be144c4cf41de7de782ef1aae2149c7060b409efe3804f6ade7d658069adf86\": container with ID starting with 9be144c4cf41de7de782ef1aae2149c7060b409efe3804f6ade7d658069adf86 not found: ID does not exist" Feb 02 11:48:02 crc kubenswrapper[4901]: I0202 11:48:02.223842 4901 scope.go:117] "RemoveContainer" containerID="72f769e5eb9ac293933ed417184947151970efbd69b27136cd971ea88da7cc23" Feb 02 11:48:02 crc kubenswrapper[4901]: E0202 11:48:02.224202 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72f769e5eb9ac293933ed417184947151970efbd69b27136cd971ea88da7cc23\": container with ID starting with 72f769e5eb9ac293933ed417184947151970efbd69b27136cd971ea88da7cc23 not found: ID does not exist" containerID="72f769e5eb9ac293933ed417184947151970efbd69b27136cd971ea88da7cc23" Feb 02 11:48:02 crc kubenswrapper[4901]: I0202 11:48:02.224225 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72f769e5eb9ac293933ed417184947151970efbd69b27136cd971ea88da7cc23"} err="failed to get container status \"72f769e5eb9ac293933ed417184947151970efbd69b27136cd971ea88da7cc23\": rpc error: code = NotFound desc = could not find container \"72f769e5eb9ac293933ed417184947151970efbd69b27136cd971ea88da7cc23\": container with ID starting with 72f769e5eb9ac293933ed417184947151970efbd69b27136cd971ea88da7cc23 not found: ID does not exist" Feb 02 11:48:02 crc kubenswrapper[4901]: I0202 11:48:02.224240 4901 scope.go:117] "RemoveContainer" containerID="768dcc6983aa10c5c92a1ad322a16bf572b57f9db8f72e00789db07c8a99804a" Feb 02 11:48:02 crc kubenswrapper[4901]: E0202 11:48:02.225156 4901 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"768dcc6983aa10c5c92a1ad322a16bf572b57f9db8f72e00789db07c8a99804a\": container with ID starting with 768dcc6983aa10c5c92a1ad322a16bf572b57f9db8f72e00789db07c8a99804a not found: ID does not exist" containerID="768dcc6983aa10c5c92a1ad322a16bf572b57f9db8f72e00789db07c8a99804a" Feb 02 11:48:02 crc kubenswrapper[4901]: I0202 11:48:02.225182 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"768dcc6983aa10c5c92a1ad322a16bf572b57f9db8f72e00789db07c8a99804a"} err="failed to get container status \"768dcc6983aa10c5c92a1ad322a16bf572b57f9db8f72e00789db07c8a99804a\": rpc error: code = NotFound desc = could not find container \"768dcc6983aa10c5c92a1ad322a16bf572b57f9db8f72e00789db07c8a99804a\": container with ID starting with 768dcc6983aa10c5c92a1ad322a16bf572b57f9db8f72e00789db07c8a99804a not found: ID does not exist" Feb 02 11:48:03 crc kubenswrapper[4901]: I0202 11:48:03.702807 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2721aee1-0262-4464-8387-8744fca3b449" path="/var/lib/kubelet/pods/2721aee1-0262-4464-8387-8744fca3b449/volumes" Feb 02 11:48:16 crc kubenswrapper[4901]: I0202 11:48:16.359155 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jksz9"] Feb 02 11:48:16 crc kubenswrapper[4901]: E0202 11:48:16.360531 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79a97f32-3041-47b9-b369-2dd0c77b7fba" containerName="container-00" Feb 02 11:48:16 crc kubenswrapper[4901]: I0202 11:48:16.360546 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="79a97f32-3041-47b9-b369-2dd0c77b7fba" containerName="container-00" Feb 02 11:48:16 crc kubenswrapper[4901]: E0202 11:48:16.360580 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2721aee1-0262-4464-8387-8744fca3b449" containerName="extract-content" Feb 02 11:48:16 crc kubenswrapper[4901]: I0202 11:48:16.360588 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="2721aee1-0262-4464-8387-8744fca3b449" containerName="extract-content" Feb 02 11:48:16 crc kubenswrapper[4901]: E0202 11:48:16.360624 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2721aee1-0262-4464-8387-8744fca3b449" containerName="extract-utilities" Feb 02 11:48:16 crc kubenswrapper[4901]: I0202 11:48:16.360631 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="2721aee1-0262-4464-8387-8744fca3b449" containerName="extract-utilities" Feb 02 11:48:16 crc kubenswrapper[4901]: E0202 11:48:16.360649 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2721aee1-0262-4464-8387-8744fca3b449" containerName="registry-server" Feb 02 11:48:16 crc kubenswrapper[4901]: I0202 11:48:16.360655 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="2721aee1-0262-4464-8387-8744fca3b449" containerName="registry-server" Feb 02 11:48:16 crc kubenswrapper[4901]: I0202 11:48:16.360855 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="79a97f32-3041-47b9-b369-2dd0c77b7fba" containerName="container-00" Feb 02 11:48:16 crc kubenswrapper[4901]: I0202 11:48:16.360876 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="2721aee1-0262-4464-8387-8744fca3b449" containerName="registry-server" Feb 02 11:48:16 crc kubenswrapper[4901]: I0202 11:48:16.363234 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jksz9" Feb 02 11:48:16 crc kubenswrapper[4901]: I0202 11:48:16.385180 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jksz9"] Feb 02 11:48:16 crc kubenswrapper[4901]: I0202 11:48:16.449614 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzfr5\" (UniqueName: \"kubernetes.io/projected/bd98ef40-2acb-4d2a-ac3f-61506d6968cc-kube-api-access-wzfr5\") pod \"redhat-marketplace-jksz9\" (UID: \"bd98ef40-2acb-4d2a-ac3f-61506d6968cc\") " pod="openshift-marketplace/redhat-marketplace-jksz9" Feb 02 11:48:16 crc kubenswrapper[4901]: I0202 11:48:16.449708 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd98ef40-2acb-4d2a-ac3f-61506d6968cc-catalog-content\") pod \"redhat-marketplace-jksz9\" (UID: \"bd98ef40-2acb-4d2a-ac3f-61506d6968cc\") " pod="openshift-marketplace/redhat-marketplace-jksz9" Feb 02 11:48:16 crc kubenswrapper[4901]: I0202 11:48:16.449941 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd98ef40-2acb-4d2a-ac3f-61506d6968cc-utilities\") pod \"redhat-marketplace-jksz9\" (UID: \"bd98ef40-2acb-4d2a-ac3f-61506d6968cc\") " pod="openshift-marketplace/redhat-marketplace-jksz9" Feb 02 11:48:16 crc kubenswrapper[4901]: I0202 11:48:16.551417 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd98ef40-2acb-4d2a-ac3f-61506d6968cc-utilities\") pod \"redhat-marketplace-jksz9\" (UID: \"bd98ef40-2acb-4d2a-ac3f-61506d6968cc\") " pod="openshift-marketplace/redhat-marketplace-jksz9" Feb 02 11:48:16 crc kubenswrapper[4901]: I0202 11:48:16.551581 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzfr5\" (UniqueName: \"kubernetes.io/projected/bd98ef40-2acb-4d2a-ac3f-61506d6968cc-kube-api-access-wzfr5\") pod \"redhat-marketplace-jksz9\" (UID: \"bd98ef40-2acb-4d2a-ac3f-61506d6968cc\") " pod="openshift-marketplace/redhat-marketplace-jksz9" Feb 02 11:48:16 crc kubenswrapper[4901]: I0202 11:48:16.551847 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd98ef40-2acb-4d2a-ac3f-61506d6968cc-catalog-content\") pod \"redhat-marketplace-jksz9\" (UID: \"bd98ef40-2acb-4d2a-ac3f-61506d6968cc\") " pod="openshift-marketplace/redhat-marketplace-jksz9" Feb 02 11:48:16 crc kubenswrapper[4901]: I0202 11:48:16.552413 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd98ef40-2acb-4d2a-ac3f-61506d6968cc-catalog-content\") pod \"redhat-marketplace-jksz9\" (UID: \"bd98ef40-2acb-4d2a-ac3f-61506d6968cc\") " pod="openshift-marketplace/redhat-marketplace-jksz9" Feb 02 11:48:16 crc kubenswrapper[4901]: I0202 11:48:16.553954 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd98ef40-2acb-4d2a-ac3f-61506d6968cc-utilities\") pod \"redhat-marketplace-jksz9\" (UID: \"bd98ef40-2acb-4d2a-ac3f-61506d6968cc\") " pod="openshift-marketplace/redhat-marketplace-jksz9" Feb 02 11:48:16 crc kubenswrapper[4901]: I0202 11:48:16.579611 4901 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-wzfr5\" (UniqueName: \"kubernetes.io/projected/bd98ef40-2acb-4d2a-ac3f-61506d6968cc-kube-api-access-wzfr5\") pod \"redhat-marketplace-jksz9\" (UID: \"bd98ef40-2acb-4d2a-ac3f-61506d6968cc\") " pod="openshift-marketplace/redhat-marketplace-jksz9" Feb 02 11:48:16 crc kubenswrapper[4901]: I0202 11:48:16.692586 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jksz9" Feb 02 11:48:17 crc kubenswrapper[4901]: I0202 11:48:17.299577 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jksz9"] Feb 02 11:48:18 crc kubenswrapper[4901]: I0202 11:48:18.243613 4901 generic.go:334] "Generic (PLEG): container finished" podID="bd98ef40-2acb-4d2a-ac3f-61506d6968cc" containerID="b506a103b304e2eb4e08574785bdd7dfb9108e833f0a9314df01e001e5eeca79" exitCode=0 Feb 02 11:48:18 crc kubenswrapper[4901]: I0202 11:48:18.243988 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jksz9" event={"ID":"bd98ef40-2acb-4d2a-ac3f-61506d6968cc","Type":"ContainerDied","Data":"b506a103b304e2eb4e08574785bdd7dfb9108e833f0a9314df01e001e5eeca79"} Feb 02 11:48:18 crc kubenswrapper[4901]: I0202 11:48:18.244021 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jksz9" event={"ID":"bd98ef40-2acb-4d2a-ac3f-61506d6968cc","Type":"ContainerStarted","Data":"fec9ec55ff79af66747bd2d510fda1b33fa75792c85b1a43aea9aad73270ec20"} Feb 02 11:48:19 crc kubenswrapper[4901]: I0202 11:48:19.257057 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jksz9" event={"ID":"bd98ef40-2acb-4d2a-ac3f-61506d6968cc","Type":"ContainerStarted","Data":"85dc7789b38fdcf413eea84e186f186a662563da6bf5f816880d4717babce7cc"} Feb 02 11:48:20 crc kubenswrapper[4901]: I0202 11:48:20.271525 4901 generic.go:334] "Generic (PLEG): container finished" podID="bd98ef40-2acb-4d2a-ac3f-61506d6968cc" containerID="85dc7789b38fdcf413eea84e186f186a662563da6bf5f816880d4717babce7cc" exitCode=0 Feb 02 11:48:20 crc kubenswrapper[4901]: I0202 11:48:20.271600 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jksz9" event={"ID":"bd98ef40-2acb-4d2a-ac3f-61506d6968cc","Type":"ContainerDied","Data":"85dc7789b38fdcf413eea84e186f186a662563da6bf5f816880d4717babce7cc"} Feb 02 11:48:21 crc kubenswrapper[4901]: I0202 11:48:21.285308 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jksz9" event={"ID":"bd98ef40-2acb-4d2a-ac3f-61506d6968cc","Type":"ContainerStarted","Data":"0adc6d22cb5a0966d2cb487784c936833be7072a584372ace05cdd00046e5193"} Feb 02 11:48:26 crc kubenswrapper[4901]: I0202 11:48:26.692873 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jksz9" Feb 02 11:48:26 crc kubenswrapper[4901]: I0202 11:48:26.693383 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jksz9" Feb 02 11:48:26 crc kubenswrapper[4901]: I0202 11:48:26.750752 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jksz9" Feb 02 11:48:26 crc kubenswrapper[4901]: I0202 11:48:26.784047 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jksz9" 
podStartSLOduration=8.277089572 podStartE2EDuration="10.784019198s" podCreationTimestamp="2026-02-02 11:48:16 +0000 UTC" firstStartedPulling="2026-02-02 11:48:18.246523254 +0000 UTC m=+4185.264863350" lastFinishedPulling="2026-02-02 11:48:20.75345288 +0000 UTC m=+4187.771792976" observedRunningTime="2026-02-02 11:48:21.324437969 +0000 UTC m=+4188.342778065" watchObservedRunningTime="2026-02-02 11:48:26.784019198 +0000 UTC m=+4193.802359284" Feb 02 11:48:27 crc kubenswrapper[4901]: I0202 11:48:27.392675 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jksz9" Feb 02 11:48:27 crc kubenswrapper[4901]: I0202 11:48:27.448305 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jksz9"] Feb 02 11:48:29 crc kubenswrapper[4901]: I0202 11:48:29.368381 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jksz9" podUID="bd98ef40-2acb-4d2a-ac3f-61506d6968cc" containerName="registry-server" containerID="cri-o://0adc6d22cb5a0966d2cb487784c936833be7072a584372ace05cdd00046e5193" gracePeriod=2 Feb 02 11:48:29 crc kubenswrapper[4901]: I0202 11:48:29.938841 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jksz9" Feb 02 11:48:30 crc kubenswrapper[4901]: I0202 11:48:30.092459 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd98ef40-2acb-4d2a-ac3f-61506d6968cc-utilities\") pod \"bd98ef40-2acb-4d2a-ac3f-61506d6968cc\" (UID: \"bd98ef40-2acb-4d2a-ac3f-61506d6968cc\") " Feb 02 11:48:30 crc kubenswrapper[4901]: I0202 11:48:30.092828 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd98ef40-2acb-4d2a-ac3f-61506d6968cc-catalog-content\") pod \"bd98ef40-2acb-4d2a-ac3f-61506d6968cc\" (UID: \"bd98ef40-2acb-4d2a-ac3f-61506d6968cc\") " Feb 02 11:48:30 crc kubenswrapper[4901]: I0202 11:48:30.092937 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzfr5\" (UniqueName: \"kubernetes.io/projected/bd98ef40-2acb-4d2a-ac3f-61506d6968cc-kube-api-access-wzfr5\") pod \"bd98ef40-2acb-4d2a-ac3f-61506d6968cc\" (UID: \"bd98ef40-2acb-4d2a-ac3f-61506d6968cc\") " Feb 02 11:48:30 crc kubenswrapper[4901]: I0202 11:48:30.093438 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd98ef40-2acb-4d2a-ac3f-61506d6968cc-utilities" (OuterVolumeSpecName: "utilities") pod "bd98ef40-2acb-4d2a-ac3f-61506d6968cc" (UID: "bd98ef40-2acb-4d2a-ac3f-61506d6968cc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:48:30 crc kubenswrapper[4901]: I0202 11:48:30.108933 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd98ef40-2acb-4d2a-ac3f-61506d6968cc-kube-api-access-wzfr5" (OuterVolumeSpecName: "kube-api-access-wzfr5") pod "bd98ef40-2acb-4d2a-ac3f-61506d6968cc" (UID: "bd98ef40-2acb-4d2a-ac3f-61506d6968cc"). InnerVolumeSpecName "kube-api-access-wzfr5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:48:30 crc kubenswrapper[4901]: I0202 11:48:30.118461 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd98ef40-2acb-4d2a-ac3f-61506d6968cc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bd98ef40-2acb-4d2a-ac3f-61506d6968cc" (UID: "bd98ef40-2acb-4d2a-ac3f-61506d6968cc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:48:30 crc kubenswrapper[4901]: I0202 11:48:30.196122 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd98ef40-2acb-4d2a-ac3f-61506d6968cc-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:48:30 crc kubenswrapper[4901]: I0202 11:48:30.196172 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd98ef40-2acb-4d2a-ac3f-61506d6968cc-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:48:30 crc kubenswrapper[4901]: I0202 11:48:30.196193 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzfr5\" (UniqueName: \"kubernetes.io/projected/bd98ef40-2acb-4d2a-ac3f-61506d6968cc-kube-api-access-wzfr5\") on node \"crc\" DevicePath \"\"" Feb 02 11:48:30 crc kubenswrapper[4901]: I0202 11:48:30.381687 4901 generic.go:334] "Generic (PLEG): container finished" podID="bd98ef40-2acb-4d2a-ac3f-61506d6968cc" containerID="0adc6d22cb5a0966d2cb487784c936833be7072a584372ace05cdd00046e5193" exitCode=0 Feb 02 11:48:30 crc kubenswrapper[4901]: I0202 11:48:30.381742 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jksz9" event={"ID":"bd98ef40-2acb-4d2a-ac3f-61506d6968cc","Type":"ContainerDied","Data":"0adc6d22cb5a0966d2cb487784c936833be7072a584372ace05cdd00046e5193"} Feb 02 11:48:30 crc kubenswrapper[4901]: I0202 11:48:30.381765 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jksz9" Feb 02 11:48:30 crc kubenswrapper[4901]: I0202 11:48:30.381781 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jksz9" event={"ID":"bd98ef40-2acb-4d2a-ac3f-61506d6968cc","Type":"ContainerDied","Data":"fec9ec55ff79af66747bd2d510fda1b33fa75792c85b1a43aea9aad73270ec20"} Feb 02 11:48:30 crc kubenswrapper[4901]: I0202 11:48:30.381803 4901 scope.go:117] "RemoveContainer" containerID="0adc6d22cb5a0966d2cb487784c936833be7072a584372ace05cdd00046e5193" Feb 02 11:48:30 crc kubenswrapper[4901]: I0202 11:48:30.429237 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jksz9"] Feb 02 11:48:30 crc kubenswrapper[4901]: I0202 11:48:30.441628 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jksz9"] Feb 02 11:48:30 crc kubenswrapper[4901]: I0202 11:48:30.448116 4901 scope.go:117] "RemoveContainer" containerID="85dc7789b38fdcf413eea84e186f186a662563da6bf5f816880d4717babce7cc" Feb 02 11:48:30 crc kubenswrapper[4901]: I0202 11:48:30.495376 4901 scope.go:117] "RemoveContainer" containerID="b506a103b304e2eb4e08574785bdd7dfb9108e833f0a9314df01e001e5eeca79" Feb 02 11:48:30 crc kubenswrapper[4901]: I0202 11:48:30.542504 4901 scope.go:117] "RemoveContainer" containerID="0adc6d22cb5a0966d2cb487784c936833be7072a584372ace05cdd00046e5193" Feb 02 11:48:30 crc kubenswrapper[4901]: E0202 11:48:30.543604 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0adc6d22cb5a0966d2cb487784c936833be7072a584372ace05cdd00046e5193\": container with ID starting with 0adc6d22cb5a0966d2cb487784c936833be7072a584372ace05cdd00046e5193 not found: ID does not exist" containerID="0adc6d22cb5a0966d2cb487784c936833be7072a584372ace05cdd00046e5193" Feb 02 11:48:30 crc kubenswrapper[4901]: I0202 11:48:30.543665 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0adc6d22cb5a0966d2cb487784c936833be7072a584372ace05cdd00046e5193"} err="failed to get container status \"0adc6d22cb5a0966d2cb487784c936833be7072a584372ace05cdd00046e5193\": rpc error: code = NotFound desc = could not find container \"0adc6d22cb5a0966d2cb487784c936833be7072a584372ace05cdd00046e5193\": container with ID starting with 0adc6d22cb5a0966d2cb487784c936833be7072a584372ace05cdd00046e5193 not found: ID does not exist" Feb 02 11:48:30 crc kubenswrapper[4901]: I0202 11:48:30.543701 4901 scope.go:117] "RemoveContainer" containerID="85dc7789b38fdcf413eea84e186f186a662563da6bf5f816880d4717babce7cc" Feb 02 11:48:30 crc kubenswrapper[4901]: E0202 11:48:30.544179 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85dc7789b38fdcf413eea84e186f186a662563da6bf5f816880d4717babce7cc\": container with ID starting with 85dc7789b38fdcf413eea84e186f186a662563da6bf5f816880d4717babce7cc not found: ID does not exist" containerID="85dc7789b38fdcf413eea84e186f186a662563da6bf5f816880d4717babce7cc" Feb 02 11:48:30 crc kubenswrapper[4901]: I0202 11:48:30.544217 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85dc7789b38fdcf413eea84e186f186a662563da6bf5f816880d4717babce7cc"} err="failed to get container status \"85dc7789b38fdcf413eea84e186f186a662563da6bf5f816880d4717babce7cc\": rpc error: code = NotFound desc = could not find 
container \"85dc7789b38fdcf413eea84e186f186a662563da6bf5f816880d4717babce7cc\": container with ID starting with 85dc7789b38fdcf413eea84e186f186a662563da6bf5f816880d4717babce7cc not found: ID does not exist" Feb 02 11:48:30 crc kubenswrapper[4901]: I0202 11:48:30.544261 4901 scope.go:117] "RemoveContainer" containerID="b506a103b304e2eb4e08574785bdd7dfb9108e833f0a9314df01e001e5eeca79" Feb 02 11:48:30 crc kubenswrapper[4901]: E0202 11:48:30.544825 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b506a103b304e2eb4e08574785bdd7dfb9108e833f0a9314df01e001e5eeca79\": container with ID starting with b506a103b304e2eb4e08574785bdd7dfb9108e833f0a9314df01e001e5eeca79 not found: ID does not exist" containerID="b506a103b304e2eb4e08574785bdd7dfb9108e833f0a9314df01e001e5eeca79" Feb 02 11:48:30 crc kubenswrapper[4901]: I0202 11:48:30.544865 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b506a103b304e2eb4e08574785bdd7dfb9108e833f0a9314df01e001e5eeca79"} err="failed to get container status \"b506a103b304e2eb4e08574785bdd7dfb9108e833f0a9314df01e001e5eeca79\": rpc error: code = NotFound desc = could not find container \"b506a103b304e2eb4e08574785bdd7dfb9108e833f0a9314df01e001e5eeca79\": container with ID starting with b506a103b304e2eb4e08574785bdd7dfb9108e833f0a9314df01e001e5eeca79 not found: ID does not exist" Feb 02 11:48:31 crc kubenswrapper[4901]: I0202 11:48:31.691026 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd98ef40-2acb-4d2a-ac3f-61506d6968cc" path="/var/lib/kubelet/pods/bd98ef40-2acb-4d2a-ac3f-61506d6968cc/volumes" Feb 02 11:49:03 crc kubenswrapper[4901]: I0202 11:49:03.672317 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_b26cf498-bc66-40f0-bc8f-5f89ac251655/init-config-reloader/0.log" Feb 02 11:49:03 crc kubenswrapper[4901]: I0202 11:49:03.864891 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_b26cf498-bc66-40f0-bc8f-5f89ac251655/alertmanager/0.log" Feb 02 11:49:03 crc kubenswrapper[4901]: I0202 11:49:03.891705 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_b26cf498-bc66-40f0-bc8f-5f89ac251655/init-config-reloader/0.log" Feb 02 11:49:03 crc kubenswrapper[4901]: I0202 11:49:03.921826 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_b26cf498-bc66-40f0-bc8f-5f89ac251655/config-reloader/0.log" Feb 02 11:49:04 crc kubenswrapper[4901]: I0202 11:49:04.151216 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5964cf7bcd-rmmzk_fb942871-a2e5-420e-a570-962659f75886/barbican-api-log/0.log" Feb 02 11:49:04 crc kubenswrapper[4901]: I0202 11:49:04.163917 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5964cf7bcd-rmmzk_fb942871-a2e5-420e-a570-962659f75886/barbican-api/0.log" Feb 02 11:49:04 crc kubenswrapper[4901]: I0202 11:49:04.280097 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7b4d58f5d-kkj8n_7c1d42a9-290c-4bff-b0cb-89507651dae4/barbican-keystone-listener/0.log" Feb 02 11:49:04 crc kubenswrapper[4901]: I0202 11:49:04.402401 4901 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-keystone-listener-7b4d58f5d-kkj8n_7c1d42a9-290c-4bff-b0cb-89507651dae4/barbican-keystone-listener-log/0.log" Feb 02 11:49:04 crc kubenswrapper[4901]: I0202 11:49:04.512778 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-558d599c79-dct6d_1644a512-eb6b-4023-9961-f42ff4bdbbe6/barbican-worker/0.log" Feb 02 11:49:04 crc kubenswrapper[4901]: I0202 11:49:04.595799 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-558d599c79-dct6d_1644a512-eb6b-4023-9961-f42ff4bdbbe6/barbican-worker-log/0.log" Feb 02 11:49:04 crc kubenswrapper[4901]: I0202 11:49:04.803912 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-pjsjw_665b7c24-97eb-482c-9a0b-1492cfa2d84d/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 02 11:49:04 crc kubenswrapper[4901]: I0202 11:49:04.870772 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b/ceilometer-central-agent/0.log" Feb 02 11:49:05 crc kubenswrapper[4901]: I0202 11:49:05.022703 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b/ceilometer-notification-agent/0.log" Feb 02 11:49:05 crc kubenswrapper[4901]: I0202 11:49:05.026908 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b/proxy-httpd/0.log" Feb 02 11:49:05 crc kubenswrapper[4901]: I0202 11:49:05.073835 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2bc3d2b0-5b10-41cb-89d8-2b06a5d4254b/sg-core/0.log" Feb 02 11:49:05 crc kubenswrapper[4901]: I0202 11:49:05.280861 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_ce854019-a345-4a85-9211-eedd7e33dff3/cinder-api-log/0.log" Feb 02 11:49:05 crc kubenswrapper[4901]: I0202 11:49:05.340247 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_ce854019-a345-4a85-9211-eedd7e33dff3/cinder-api/0.log" Feb 02 11:49:06 crc kubenswrapper[4901]: I0202 11:49:06.119403 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_fec3da0b-c2dc-4269-b621-19b380d9b92d/cinder-scheduler/0.log" Feb 02 11:49:06 crc kubenswrapper[4901]: I0202 11:49:06.137686 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_fec3da0b-c2dc-4269-b621-19b380d9b92d/probe/0.log" Feb 02 11:49:06 crc kubenswrapper[4901]: I0202 11:49:06.395556 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-vwmn9_2c354603-4591-439e-b60a-4c46e1b31678/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 02 11:49:06 crc kubenswrapper[4901]: I0202 11:49:06.477139 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-qgw5h_b013b75b-71ef-4701-9f44-298496253710/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 02 11:49:06 crc kubenswrapper[4901]: I0202 11:49:06.599077 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d75f767dc-6pmsr_b0c42695-0f2a-43f2-925b-90f704255c79/init/0.log" Feb 02 11:49:06 crc kubenswrapper[4901]: I0202 11:49:06.832580 4901 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-5d75f767dc-6pmsr_b0c42695-0f2a-43f2-925b-90f704255c79/init/0.log" Feb 02 11:49:06 crc kubenswrapper[4901]: I0202 11:49:06.892501 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-bqz4b_7e499a01-9c9c-44ca-a1d2-e35912fba103/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 02 11:49:06 crc kubenswrapper[4901]: I0202 11:49:06.895463 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d75f767dc-6pmsr_b0c42695-0f2a-43f2-925b-90f704255c79/dnsmasq-dns/0.log" Feb 02 11:49:07 crc kubenswrapper[4901]: I0202 11:49:07.060658 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_5731a76d-bd25-4a51-acfc-7dfd031eef35/glance-httpd/0.log" Feb 02 11:49:07 crc kubenswrapper[4901]: I0202 11:49:07.119184 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_5731a76d-bd25-4a51-acfc-7dfd031eef35/glance-log/0.log" Feb 02 11:49:07 crc kubenswrapper[4901]: I0202 11:49:07.169992 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_4d303ae2-764c-42f1-afaa-d099c91b5ac4/glance-httpd/0.log" Feb 02 11:49:07 crc kubenswrapper[4901]: I0202 11:49:07.330228 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_4d303ae2-764c-42f1-afaa-d099c91b5ac4/glance-log/0.log" Feb 02 11:49:07 crc kubenswrapper[4901]: I0202 11:49:07.704953 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-79dbc4cd68-ltht2_b47d3039-22e1-42c8-b23f-9c5f6dcb51f6/heat-engine/0.log" Feb 02 11:49:07 crc kubenswrapper[4901]: I0202 11:49:07.893796 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-6548466b85-x76qz_5acf34a1-ce29-4301-bd5e-e6792dff572d/heat-api/0.log" Feb 02 11:49:08 crc kubenswrapper[4901]: I0202 11:49:08.035619 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-7dcd8c5f77-l2jdd_ed964b8c-458f-4b9f-8363-55627008bc75/heat-cfnapi/0.log" Feb 02 11:49:08 crc kubenswrapper[4901]: I0202 11:49:08.846934 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-smpsd_cdafb787-54a7-457b-8c61-94ddf99cbb8c/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 02 11:49:08 crc kubenswrapper[4901]: I0202 11:49:08.916764 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-vws9z_1f654258-65f3-42c5-a4d8-c131145a91cc/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 02 11:49:08 crc kubenswrapper[4901]: I0202 11:49:08.953269 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7d7d985874-pzxvf_9c918cc4-c647-4e53-8800-27ea182ea861/keystone-api/0.log" Feb 02 11:49:09 crc kubenswrapper[4901]: I0202 11:49:09.119291 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29500501-cwls7_294b9c02-1398-4377-bb01-27ff64ba9c08/keystone-cron/0.log" Feb 02 11:49:09 crc kubenswrapper[4901]: I0202 11:49:09.204301 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_8faaac48-521a-47fc-b480-d941fd41be94/kube-state-metrics/0.log" Feb 02 11:49:09 crc kubenswrapper[4901]: I0202 11:49:09.389217 4901 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-5ksr7_688ef63a-dd78-45da-84ba-f5bc28c6ae81/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 02 11:49:09 crc kubenswrapper[4901]: I0202 11:49:09.618432 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7d89c9dff9-fzvln_172bde5c-4b76-4f03-b899-4a395581a9f5/neutron-api/0.log" Feb 02 11:49:09 crc kubenswrapper[4901]: I0202 11:49:09.661733 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7d89c9dff9-fzvln_172bde5c-4b76-4f03-b899-4a395581a9f5/neutron-httpd/0.log" Feb 02 11:49:09 crc kubenswrapper[4901]: I0202 11:49:09.794662 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-t79t6_8911f17d-aca5-4056-90a0-f6351983e4bf/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 02 11:49:10 crc kubenswrapper[4901]: I0202 11:49:10.287863 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_93285f41-662a-47d2-a011-676f740a2914/nova-api-log/0.log" Feb 02 11:49:10 crc kubenswrapper[4901]: I0202 11:49:10.331364 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_256480df-da06-439c-9e9b-41c8aba434a6/nova-cell0-conductor-conductor/0.log" Feb 02 11:49:10 crc kubenswrapper[4901]: I0202 11:49:10.540772 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_93285f41-662a-47d2-a011-676f740a2914/nova-api-api/0.log" Feb 02 11:49:10 crc kubenswrapper[4901]: I0202 11:49:10.664957 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_7f1bc22b-987a-4dd9-a0c9-42383e63fd38/nova-cell1-conductor-conductor/0.log" Feb 02 11:49:10 crc kubenswrapper[4901]: I0202 11:49:10.731731 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_24d88cb1-d10e-4ade-91f2-48d3ed40f873/nova-cell1-novncproxy-novncproxy/0.log" Feb 02 11:49:11 crc kubenswrapper[4901]: I0202 11:49:11.020785 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-g4dt6_9bca60a0-6b7b-4513-bec4-c2aa578b5607/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 02 11:49:11 crc kubenswrapper[4901]: I0202 11:49:11.152619 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_4135f49b-1390-4600-8855-c9311c0cdf11/nova-metadata-log/0.log" Feb 02 11:49:11 crc kubenswrapper[4901]: I0202 11:49:11.435876 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_c039a87a-6a86-4dc2-8760-363c1e2f62d8/nova-scheduler-scheduler/0.log" Feb 02 11:49:11 crc kubenswrapper[4901]: I0202 11:49:11.543074 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c56659df-71f1-4dbb-819f-b71277070b0e/mysql-bootstrap/0.log" Feb 02 11:49:11 crc kubenswrapper[4901]: I0202 11:49:11.813014 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c56659df-71f1-4dbb-819f-b71277070b0e/galera/0.log" Feb 02 11:49:11 crc kubenswrapper[4901]: I0202 11:49:11.831523 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c56659df-71f1-4dbb-819f-b71277070b0e/mysql-bootstrap/0.log" Feb 02 11:49:12 crc kubenswrapper[4901]: I0202 11:49:12.090502 4901 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_93ecc7a4-4c23-488f-8d75-8fee0246afe4/mysql-bootstrap/0.log" Feb 02 11:49:12 crc kubenswrapper[4901]: I0202 11:49:12.314692 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_93ecc7a4-4c23-488f-8d75-8fee0246afe4/mysql-bootstrap/0.log" Feb 02 11:49:12 crc kubenswrapper[4901]: I0202 11:49:12.329040 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_93ecc7a4-4c23-488f-8d75-8fee0246afe4/galera/0.log" Feb 02 11:49:12 crc kubenswrapper[4901]: I0202 11:49:12.580270 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_b0d18719-d65d-4624-9696-b876ab4b3e85/openstackclient/0.log" Feb 02 11:49:12 crc kubenswrapper[4901]: I0202 11:49:12.587760 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-nt5zf_0c692cfe-56e1-4bba-b3e7-bf1d713d343d/openstack-network-exporter/0.log" Feb 02 11:49:12 crc kubenswrapper[4901]: I0202 11:49:12.847000 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ckvwh_8984971d-92d8-4bf2-b07f-f8af49f67ece/ovsdb-server-init/0.log" Feb 02 11:49:12 crc kubenswrapper[4901]: I0202 11:49:12.928972 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_4135f49b-1390-4600-8855-c9311c0cdf11/nova-metadata-metadata/0.log" Feb 02 11:49:13 crc kubenswrapper[4901]: I0202 11:49:13.108383 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ckvwh_8984971d-92d8-4bf2-b07f-f8af49f67ece/ovsdb-server-init/0.log" Feb 02 11:49:13 crc kubenswrapper[4901]: I0202 11:49:13.159836 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ckvwh_8984971d-92d8-4bf2-b07f-f8af49f67ece/ovsdb-server/0.log" Feb 02 11:49:13 crc kubenswrapper[4901]: I0202 11:49:13.161115 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-ckvwh_8984971d-92d8-4bf2-b07f-f8af49f67ece/ovs-vswitchd/0.log" Feb 02 11:49:13 crc kubenswrapper[4901]: I0202 11:49:13.371429 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-rtv7m_bdaf682c-0830-4cd9-ba3d-5eb615ac9cb7/ovn-controller/0.log" Feb 02 11:49:13 crc kubenswrapper[4901]: I0202 11:49:13.521904 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-mj2hb_e169f03d-d9b9-4e8e-8847-65e846bf5722/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 02 11:49:13 crc kubenswrapper[4901]: I0202 11:49:13.679312 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_52d82257-56bf-48c9-b8e1-8cdab6278855/openstack-network-exporter/0.log" Feb 02 11:49:13 crc kubenswrapper[4901]: I0202 11:49:13.788371 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_52d82257-56bf-48c9-b8e1-8cdab6278855/ovn-northd/0.log" Feb 02 11:49:14 crc kubenswrapper[4901]: I0202 11:49:14.136760 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_5d81b635-e8be-4199-827b-02ea68a3de3b/openstack-network-exporter/0.log" Feb 02 11:49:14 crc kubenswrapper[4901]: I0202 11:49:14.185860 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_5d81b635-e8be-4199-827b-02ea68a3de3b/ovsdbserver-nb/0.log" Feb 02 11:49:14 crc kubenswrapper[4901]: I0202 11:49:14.350189 4901 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-0_867ad59f-d6b4-42da-90e8-9ac943b12aaf/openstack-network-exporter/0.log" Feb 02 11:49:14 crc kubenswrapper[4901]: I0202 11:49:14.384271 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_867ad59f-d6b4-42da-90e8-9ac943b12aaf/ovsdbserver-sb/0.log" Feb 02 11:49:14 crc kubenswrapper[4901]: I0202 11:49:14.747235 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7ff445bb86-z9hcr_37a5ab35-b3f0-4065-bcbe-5c784dd6d02c/placement-api/0.log" Feb 02 11:49:14 crc kubenswrapper[4901]: I0202 11:49:14.759784 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7ff445bb86-z9hcr_37a5ab35-b3f0-4065-bcbe-5c784dd6d02c/placement-log/0.log" Feb 02 11:49:14 crc kubenswrapper[4901]: I0202 11:49:14.831653 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_2632cf76-38c8-44ca-8ac3-2dd6e635fcdb/init-config-reloader/0.log" Feb 02 11:49:15 crc kubenswrapper[4901]: I0202 11:49:15.105438 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_2632cf76-38c8-44ca-8ac3-2dd6e635fcdb/init-config-reloader/0.log" Feb 02 11:49:15 crc kubenswrapper[4901]: I0202 11:49:15.140990 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_2632cf76-38c8-44ca-8ac3-2dd6e635fcdb/thanos-sidecar/0.log" Feb 02 11:49:15 crc kubenswrapper[4901]: I0202 11:49:15.146250 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_2632cf76-38c8-44ca-8ac3-2dd6e635fcdb/config-reloader/0.log" Feb 02 11:49:15 crc kubenswrapper[4901]: I0202 11:49:15.152991 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_2632cf76-38c8-44ca-8ac3-2dd6e635fcdb/prometheus/0.log" Feb 02 11:49:15 crc kubenswrapper[4901]: I0202 11:49:15.371355 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e36bedd2-6698-4981-b0cf-a278a9ce7258/setup-container/0.log" Feb 02 11:49:15 crc kubenswrapper[4901]: I0202 11:49:15.643318 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e36bedd2-6698-4981-b0cf-a278a9ce7258/setup-container/0.log" Feb 02 11:49:15 crc kubenswrapper[4901]: I0202 11:49:15.735021 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e36bedd2-6698-4981-b0cf-a278a9ce7258/rabbitmq/0.log" Feb 02 11:49:15 crc kubenswrapper[4901]: I0202 11:49:15.735995 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_cacb5793-beb9-49f9-9438-9613ad472c15/setup-container/0.log" Feb 02 11:49:16 crc kubenswrapper[4901]: I0202 11:49:16.007178 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_cacb5793-beb9-49f9-9438-9613ad472c15/setup-container/0.log" Feb 02 11:49:16 crc kubenswrapper[4901]: I0202 11:49:16.027934 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_cacb5793-beb9-49f9-9438-9613ad472c15/rabbitmq/0.log" Feb 02 11:49:16 crc kubenswrapper[4901]: I0202 11:49:16.092781 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-mbbt7_3a2b8c88-d6ea-4fe1-9719-ebb0b9d39798/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 02 11:49:16 crc kubenswrapper[4901]: I0202 11:49:16.383650 4901 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-lpjsg_ab519d3b-c449-4008-b473-ad5e4e5d433e/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 02 11:49:16 crc kubenswrapper[4901]: I0202 11:49:16.420497 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-j9z7t_ec5fa81a-9fd8-4adf-886a-2874c6883bc0/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 02 11:49:16 crc kubenswrapper[4901]: I0202 11:49:16.742730 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-s5msw_62710228-7012-4485-a5d9-16a9bf369635/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 02 11:49:16 crc kubenswrapper[4901]: I0202 11:49:16.811517 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-75pt4_fedb3e3c-4180-4787-9d18-4fd5af89ad69/ssh-known-hosts-edpm-deployment/0.log" Feb 02 11:49:17 crc kubenswrapper[4901]: I0202 11:49:17.029339 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-96f76df57-tdlmx_4d2b7428-ac02-4aed-8a90-30cc198e4cca/proxy-server/0.log" Feb 02 11:49:17 crc kubenswrapper[4901]: I0202 11:49:17.245817 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-96f76df57-tdlmx_4d2b7428-ac02-4aed-8a90-30cc198e4cca/proxy-httpd/0.log" Feb 02 11:49:17 crc kubenswrapper[4901]: I0202 11:49:17.317795 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-tmj6n_882c42d4-7650-4f0d-8973-ba9ddcbb6800/swift-ring-rebalance/0.log" Feb 02 11:49:17 crc kubenswrapper[4901]: I0202 11:49:17.534107 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4b4d5a91-d330-499c-9123-35b58d8c55d5/account-reaper/0.log" Feb 02 11:49:17 crc kubenswrapper[4901]: I0202 11:49:17.542400 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4b4d5a91-d330-499c-9123-35b58d8c55d5/account-auditor/0.log" Feb 02 11:49:17 crc kubenswrapper[4901]: I0202 11:49:17.662466 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4b4d5a91-d330-499c-9123-35b58d8c55d5/account-replicator/0.log" Feb 02 11:49:17 crc kubenswrapper[4901]: I0202 11:49:17.764554 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4b4d5a91-d330-499c-9123-35b58d8c55d5/account-server/0.log" Feb 02 11:49:17 crc kubenswrapper[4901]: I0202 11:49:17.903982 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4b4d5a91-d330-499c-9123-35b58d8c55d5/container-auditor/0.log" Feb 02 11:49:17 crc kubenswrapper[4901]: I0202 11:49:17.922593 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4b4d5a91-d330-499c-9123-35b58d8c55d5/container-replicator/0.log" Feb 02 11:49:17 crc kubenswrapper[4901]: I0202 11:49:17.973715 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4b4d5a91-d330-499c-9123-35b58d8c55d5/container-server/0.log" Feb 02 11:49:18 crc kubenswrapper[4901]: I0202 11:49:18.435909 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4b4d5a91-d330-499c-9123-35b58d8c55d5/container-updater/0.log" Feb 02 11:49:18 crc kubenswrapper[4901]: I0202 11:49:18.480118 4901 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_4b4d5a91-d330-499c-9123-35b58d8c55d5/object-auditor/0.log" Feb 02 11:49:18 crc kubenswrapper[4901]: I0202 11:49:18.538469 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4b4d5a91-d330-499c-9123-35b58d8c55d5/object-replicator/0.log" Feb 02 11:49:18 crc kubenswrapper[4901]: I0202 11:49:18.581326 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4b4d5a91-d330-499c-9123-35b58d8c55d5/object-expirer/0.log" Feb 02 11:49:19 crc kubenswrapper[4901]: I0202 11:49:19.157471 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4b4d5a91-d330-499c-9123-35b58d8c55d5/object-server/0.log" Feb 02 11:49:19 crc kubenswrapper[4901]: I0202 11:49:19.234994 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4b4d5a91-d330-499c-9123-35b58d8c55d5/rsync/0.log" Feb 02 11:49:19 crc kubenswrapper[4901]: I0202 11:49:19.239582 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4b4d5a91-d330-499c-9123-35b58d8c55d5/object-updater/0.log" Feb 02 11:49:19 crc kubenswrapper[4901]: I0202 11:49:19.263222 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4b4d5a91-d330-499c-9123-35b58d8c55d5/swift-recon-cron/0.log" Feb 02 11:49:19 crc kubenswrapper[4901]: I0202 11:49:19.583860 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-7vhs7_e2680d00-6499-42bd-ac33-547a56af2392/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 02 11:49:19 crc kubenswrapper[4901]: I0202 11:49:19.614243 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-wwz5b_5b1278d5-723d-40a0-a7dc-b85ddcdbdde9/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 02 11:49:26 crc kubenswrapper[4901]: I0202 11:49:26.434824 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_e7551d06-91c3-4652-b042-cf8080c36ce2/memcached/0.log" Feb 02 11:49:53 crc kubenswrapper[4901]: I0202 11:49:53.130635 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2138b5f13f1f4a89ba3feab0b777bc0d311f9a649e9d064e0c06ba5ed1rkr5x_18261baf-6180-4a18-9250-f282d455e91a/util/0.log" Feb 02 11:49:53 crc kubenswrapper[4901]: I0202 11:49:53.379383 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2138b5f13f1f4a89ba3feab0b777bc0d311f9a649e9d064e0c06ba5ed1rkr5x_18261baf-6180-4a18-9250-f282d455e91a/util/0.log" Feb 02 11:49:53 crc kubenswrapper[4901]: I0202 11:49:53.385874 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2138b5f13f1f4a89ba3feab0b777bc0d311f9a649e9d064e0c06ba5ed1rkr5x_18261baf-6180-4a18-9250-f282d455e91a/pull/0.log" Feb 02 11:49:53 crc kubenswrapper[4901]: I0202 11:49:53.403656 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2138b5f13f1f4a89ba3feab0b777bc0d311f9a649e9d064e0c06ba5ed1rkr5x_18261baf-6180-4a18-9250-f282d455e91a/pull/0.log" Feb 02 11:49:53 crc kubenswrapper[4901]: I0202 11:49:53.587371 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2138b5f13f1f4a89ba3feab0b777bc0d311f9a649e9d064e0c06ba5ed1rkr5x_18261baf-6180-4a18-9250-f282d455e91a/util/0.log" Feb 02 11:49:53 crc kubenswrapper[4901]: I0202 11:49:53.618066 4901 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_2138b5f13f1f4a89ba3feab0b777bc0d311f9a649e9d064e0c06ba5ed1rkr5x_18261baf-6180-4a18-9250-f282d455e91a/extract/0.log" Feb 02 11:49:53 crc kubenswrapper[4901]: I0202 11:49:53.618250 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2138b5f13f1f4a89ba3feab0b777bc0d311f9a649e9d064e0c06ba5ed1rkr5x_18261baf-6180-4a18-9250-f282d455e91a/pull/0.log" Feb 02 11:49:53 crc kubenswrapper[4901]: I0202 11:49:53.880254 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7b6c4d8c5f-lh4js_a3d4d0f2-f8ac-4b9d-b78c-7a6c63750fdf/manager/0.log" Feb 02 11:49:53 crc kubenswrapper[4901]: I0202 11:49:53.905926 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d874c8fc-kgsvf_50ac57c2-233a-40b9-9377-c8066412240c/manager/0.log" Feb 02 11:49:54 crc kubenswrapper[4901]: I0202 11:49:54.103554 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d9697b7f4-b6wnv_256717cd-84fb-490a-9945-bed0d1f5ec7f/manager/0.log" Feb 02 11:49:54 crc kubenswrapper[4901]: I0202 11:49:54.266551 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-8886f4c47-rjvw7_52cbccb0-76da-4a69-b33f-6efb03721afe/manager/0.log" Feb 02 11:49:54 crc kubenswrapper[4901]: I0202 11:49:54.421301 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5fb775575f-sm2xf_3e09ebb7-1669-4027-a2f9-f65176a6a099/manager/0.log" Feb 02 11:49:54 crc kubenswrapper[4901]: I0202 11:49:54.435038 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69d6db494d-7ssmx_79d346c5-abe4-401c-9aaf-b4814a623c99/manager/0.log" Feb 02 11:49:54 crc kubenswrapper[4901]: I0202 11:49:54.689096 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5f4b8bd54d-wccg8_f5f1df90-8dfb-4eae-b0bb-6128aab24030/manager/0.log" Feb 02 11:49:54 crc kubenswrapper[4901]: I0202 11:49:54.940134 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79955696d6-vd4rg_42ecefa7-b29d-4178-82c0-5520874c1d1a/manager/0.log" Feb 02 11:49:55 crc kubenswrapper[4901]: I0202 11:49:55.006677 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-84f48565d4-f5dcq_3be47ad9-6e38-4b16-9e57-2311ef26ed5b/manager/0.log" Feb 02 11:49:55 crc kubenswrapper[4901]: I0202 11:49:55.599515 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7dd968899f-qnjn5_c7d3103d-1aa3-4337-8e01-f60aed47ca9b/manager/0.log" Feb 02 11:49:55 crc kubenswrapper[4901]: I0202 11:49:55.667249 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67bf948998-rmwtz_4d052f6a-df39-4bd2-aee5-8abd7a1a2882/manager/0.log" Feb 02 11:49:55 crc kubenswrapper[4901]: I0202 11:49:55.873603 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-585dbc889-vf462_2077e455-81ea-4c9a-b4cf-1304d990ee88/manager/0.log" Feb 02 11:49:56 crc kubenswrapper[4901]: I0202 11:49:56.024451 4901 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_nova-operator-controller-manager-55bff696bd-7ktm2_57f12111-0feb-4c93-8e3a-c0d36dee5184/manager/0.log" Feb 02 11:49:56 crc kubenswrapper[4901]: I0202 11:49:56.108854 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6687f8d877-827zp_c92912da-94c2-41b5-b43c-a136f96dbd1e/manager/0.log" Feb 02 11:49:56 crc kubenswrapper[4901]: I0202 11:49:56.169655 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-59c4b45c4dhlwss_4c4d76b0-aadf-4949-a131-a43c226e38a2/manager/0.log" Feb 02 11:49:56 crc kubenswrapper[4901]: I0202 11:49:56.586335 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-8684f8699c-l6tlx_dd2442fb-4d95-49c5-b67a-0195ed05bc10/operator/0.log" Feb 02 11:49:56 crc kubenswrapper[4901]: I0202 11:49:56.798217 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-t9bm2_5cf937bc-31d1-4321-ad76-c0b26b86d044/registry-server/0.log" Feb 02 11:49:57 crc kubenswrapper[4901]: I0202 11:49:57.044406 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-788c46999f-vccwc_3952ac22-26a4-4b08-a45c-9d8db8597333/manager/0.log" Feb 02 11:49:57 crc kubenswrapper[4901]: I0202 11:49:57.184883 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5b964cf4cd-dxjpt_27222341-69e8-4b3c-b6c2-e3d5c644e8c3/manager/0.log" Feb 02 11:49:57 crc kubenswrapper[4901]: I0202 11:49:57.460550 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-6td2q_2d39bba8-c1e7-4247-b938-616c9774c9a7/operator/0.log" Feb 02 11:49:57 crc kubenswrapper[4901]: I0202 11:49:57.637813 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68fc8c869-xkdvn_3676f3d9-d77a-4809-bf9e-0e5ba2bea27c/manager/0.log" Feb 02 11:49:58 crc kubenswrapper[4901]: I0202 11:49:58.004851 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6765c97497-ngsw7_626937bd-8794-43dd-ab0a-77a94440bb05/manager/0.log" Feb 02 11:49:58 crc kubenswrapper[4901]: I0202 11:49:58.425125 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-56f8bfcd9f-vtjmm_93760c22-c570-47d0-a0a8-a0e089ee1461/manager/0.log" Feb 02 11:49:58 crc kubenswrapper[4901]: I0202 11:49:58.563155 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-84dbcd4d6-strlk_9c2b3b8f-1088-4696-899e-d0d5c3f2bf2d/manager/0.log" Feb 02 11:49:58 crc kubenswrapper[4901]: I0202 11:49:58.585011 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-564965969-njrpb_9f38ac2f-605c-413e-8bdc-ae236d52bd55/manager/0.log" Feb 02 11:50:07 crc kubenswrapper[4901]: I0202 11:50:07.837083 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:50:07 crc kubenswrapper[4901]: 
I0202 11:50:07.837817 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:50:21 crc kubenswrapper[4901]: I0202 11:50:21.189316 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-2qp9z_713da16e-91d1-4bba-af10-4e9a06ef7c81/control-plane-machine-set-operator/0.log" Feb 02 11:50:21 crc kubenswrapper[4901]: I0202 11:50:21.369863 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-f28zg_3fe1fd46-e3a8-4729-9528-24f38fe69252/kube-rbac-proxy/0.log" Feb 02 11:50:21 crc kubenswrapper[4901]: I0202 11:50:21.454823 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-f28zg_3fe1fd46-e3a8-4729-9528-24f38fe69252/machine-api-operator/0.log" Feb 02 11:50:35 crc kubenswrapper[4901]: I0202 11:50:35.946257 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-724k6_535bd180-9d21-44be-be15-dbc0f6fa94cf/cert-manager-controller/0.log" Feb 02 11:50:36 crc kubenswrapper[4901]: I0202 11:50:36.194458 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-xcbb8_00e3fdef-9614-441e-b027-ad0d29e7f1a8/cert-manager-cainjector/0.log" Feb 02 11:50:36 crc kubenswrapper[4901]: I0202 11:50:36.229794 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-qcqhh_69fa051d-cf95-4af2-8d67-990aece23a2c/cert-manager-webhook/0.log" Feb 02 11:50:36 crc kubenswrapper[4901]: I0202 11:50:36.359783 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8tr9m"] Feb 02 11:50:36 crc kubenswrapper[4901]: E0202 11:50:36.360718 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd98ef40-2acb-4d2a-ac3f-61506d6968cc" containerName="registry-server" Feb 02 11:50:36 crc kubenswrapper[4901]: I0202 11:50:36.360749 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd98ef40-2acb-4d2a-ac3f-61506d6968cc" containerName="registry-server" Feb 02 11:50:36 crc kubenswrapper[4901]: E0202 11:50:36.360792 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd98ef40-2acb-4d2a-ac3f-61506d6968cc" containerName="extract-utilities" Feb 02 11:50:36 crc kubenswrapper[4901]: I0202 11:50:36.360802 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd98ef40-2acb-4d2a-ac3f-61506d6968cc" containerName="extract-utilities" Feb 02 11:50:36 crc kubenswrapper[4901]: E0202 11:50:36.360850 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd98ef40-2acb-4d2a-ac3f-61506d6968cc" containerName="extract-content" Feb 02 11:50:36 crc kubenswrapper[4901]: I0202 11:50:36.360856 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd98ef40-2acb-4d2a-ac3f-61506d6968cc" containerName="extract-content" Feb 02 11:50:36 crc kubenswrapper[4901]: I0202 11:50:36.361169 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd98ef40-2acb-4d2a-ac3f-61506d6968cc" containerName="registry-server" Feb 02 11:50:36 crc kubenswrapper[4901]: I0202 11:50:36.363673 4901 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8tr9m" Feb 02 11:50:36 crc kubenswrapper[4901]: I0202 11:50:36.374013 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8tr9m"] Feb 02 11:50:36 crc kubenswrapper[4901]: I0202 11:50:36.427121 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc04e691-4309-465a-b7a8-033284141bec-utilities\") pod \"redhat-operators-8tr9m\" (UID: \"cc04e691-4309-465a-b7a8-033284141bec\") " pod="openshift-marketplace/redhat-operators-8tr9m" Feb 02 11:50:36 crc kubenswrapper[4901]: I0202 11:50:36.427366 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc04e691-4309-465a-b7a8-033284141bec-catalog-content\") pod \"redhat-operators-8tr9m\" (UID: \"cc04e691-4309-465a-b7a8-033284141bec\") " pod="openshift-marketplace/redhat-operators-8tr9m" Feb 02 11:50:36 crc kubenswrapper[4901]: I0202 11:50:36.427451 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-288s7\" (UniqueName: \"kubernetes.io/projected/cc04e691-4309-465a-b7a8-033284141bec-kube-api-access-288s7\") pod \"redhat-operators-8tr9m\" (UID: \"cc04e691-4309-465a-b7a8-033284141bec\") " pod="openshift-marketplace/redhat-operators-8tr9m" Feb 02 11:50:36 crc kubenswrapper[4901]: I0202 11:50:36.529595 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc04e691-4309-465a-b7a8-033284141bec-catalog-content\") pod \"redhat-operators-8tr9m\" (UID: \"cc04e691-4309-465a-b7a8-033284141bec\") " pod="openshift-marketplace/redhat-operators-8tr9m" Feb 02 11:50:36 crc kubenswrapper[4901]: I0202 11:50:36.529689 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-288s7\" (UniqueName: \"kubernetes.io/projected/cc04e691-4309-465a-b7a8-033284141bec-kube-api-access-288s7\") pod \"redhat-operators-8tr9m\" (UID: \"cc04e691-4309-465a-b7a8-033284141bec\") " pod="openshift-marketplace/redhat-operators-8tr9m" Feb 02 11:50:36 crc kubenswrapper[4901]: I0202 11:50:36.529835 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc04e691-4309-465a-b7a8-033284141bec-utilities\") pod \"redhat-operators-8tr9m\" (UID: \"cc04e691-4309-465a-b7a8-033284141bec\") " pod="openshift-marketplace/redhat-operators-8tr9m" Feb 02 11:50:36 crc kubenswrapper[4901]: I0202 11:50:36.530234 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc04e691-4309-465a-b7a8-033284141bec-catalog-content\") pod \"redhat-operators-8tr9m\" (UID: \"cc04e691-4309-465a-b7a8-033284141bec\") " pod="openshift-marketplace/redhat-operators-8tr9m" Feb 02 11:50:36 crc kubenswrapper[4901]: I0202 11:50:36.530414 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc04e691-4309-465a-b7a8-033284141bec-utilities\") pod \"redhat-operators-8tr9m\" (UID: \"cc04e691-4309-465a-b7a8-033284141bec\") " pod="openshift-marketplace/redhat-operators-8tr9m" Feb 02 11:50:36 crc kubenswrapper[4901]: I0202 11:50:36.886000 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-288s7\" (UniqueName: \"kubernetes.io/projected/cc04e691-4309-465a-b7a8-033284141bec-kube-api-access-288s7\") pod \"redhat-operators-8tr9m\" (UID: \"cc04e691-4309-465a-b7a8-033284141bec\") " pod="openshift-marketplace/redhat-operators-8tr9m" Feb 02 11:50:36 crc kubenswrapper[4901]: I0202 11:50:36.996627 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8tr9m" Feb 02 11:50:37 crc kubenswrapper[4901]: I0202 11:50:37.574407 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8tr9m"] Feb 02 11:50:37 crc kubenswrapper[4901]: I0202 11:50:37.837938 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:50:37 crc kubenswrapper[4901]: I0202 11:50:37.838443 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:50:37 crc kubenswrapper[4901]: I0202 11:50:37.912091 4901 generic.go:334] "Generic (PLEG): container finished" podID="cc04e691-4309-465a-b7a8-033284141bec" containerID="da8504b3a0c1c0b1b2b4d06ced6ed180539003dca36e701dc0aea61c20525be5" exitCode=0 Feb 02 11:50:37 crc kubenswrapper[4901]: I0202 11:50:37.912149 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8tr9m" event={"ID":"cc04e691-4309-465a-b7a8-033284141bec","Type":"ContainerDied","Data":"da8504b3a0c1c0b1b2b4d06ced6ed180539003dca36e701dc0aea61c20525be5"} Feb 02 11:50:37 crc kubenswrapper[4901]: I0202 11:50:37.912183 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8tr9m" event={"ID":"cc04e691-4309-465a-b7a8-033284141bec","Type":"ContainerStarted","Data":"aa8191f06b8baa60ab3af6c89af213603cc682b8c980d5a393d458e8a2a23e45"} Feb 02 11:50:38 crc kubenswrapper[4901]: I0202 11:50:38.926080 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8tr9m" event={"ID":"cc04e691-4309-465a-b7a8-033284141bec","Type":"ContainerStarted","Data":"07b1d5af1c0d3ebb8fe36c0efd8a278681bd43bef9a19358e61a06f68c572199"} Feb 02 11:50:44 crc kubenswrapper[4901]: I0202 11:50:44.015455 4901 generic.go:334] "Generic (PLEG): container finished" podID="cc04e691-4309-465a-b7a8-033284141bec" containerID="07b1d5af1c0d3ebb8fe36c0efd8a278681bd43bef9a19358e61a06f68c572199" exitCode=0 Feb 02 11:50:44 crc kubenswrapper[4901]: I0202 11:50:44.016122 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8tr9m" event={"ID":"cc04e691-4309-465a-b7a8-033284141bec","Type":"ContainerDied","Data":"07b1d5af1c0d3ebb8fe36c0efd8a278681bd43bef9a19358e61a06f68c572199"} Feb 02 11:50:45 crc kubenswrapper[4901]: I0202 11:50:45.029631 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8tr9m" event={"ID":"cc04e691-4309-465a-b7a8-033284141bec","Type":"ContainerStarted","Data":"b08a1911077d8db316f65e7aa5ac89aefb5e6b99ae63af77cf499dc90dc744c5"} Feb 02 11:50:45 crc kubenswrapper[4901]: I0202 11:50:45.067857 4901 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8tr9m" podStartSLOduration=2.547665035 podStartE2EDuration="9.067825175s" podCreationTimestamp="2026-02-02 11:50:36 +0000 UTC" firstStartedPulling="2026-02-02 11:50:37.914709918 +0000 UTC m=+4324.933050014" lastFinishedPulling="2026-02-02 11:50:44.434870058 +0000 UTC m=+4331.453210154" observedRunningTime="2026-02-02 11:50:45.051572949 +0000 UTC m=+4332.069913065" watchObservedRunningTime="2026-02-02 11:50:45.067825175 +0000 UTC m=+4332.086165271" Feb 02 11:50:46 crc kubenswrapper[4901]: I0202 11:50:46.997726 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8tr9m" Feb 02 11:50:46 crc kubenswrapper[4901]: I0202 11:50:46.998458 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8tr9m" Feb 02 11:50:48 crc kubenswrapper[4901]: I0202 11:50:48.122798 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8tr9m" podUID="cc04e691-4309-465a-b7a8-033284141bec" containerName="registry-server" probeResult="failure" output=< Feb 02 11:50:48 crc kubenswrapper[4901]: timeout: failed to connect service ":50051" within 1s Feb 02 11:50:48 crc kubenswrapper[4901]: > Feb 02 11:50:52 crc kubenswrapper[4901]: I0202 11:50:52.367091 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-sdv4q_037e715d-ec59-4109-9def-cd55e556e9f4/nmstate-console-plugin/0.log" Feb 02 11:50:52 crc kubenswrapper[4901]: I0202 11:50:52.620061 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-ckwfx_9733ec0d-51fa-4932-b0b9-42c3e54bc39e/nmstate-handler/0.log" Feb 02 11:50:52 crc kubenswrapper[4901]: I0202 11:50:52.655055 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-4lrf6_26f249f1-8c84-49ab-b583-948e30dc04f3/kube-rbac-proxy/0.log" Feb 02 11:50:52 crc kubenswrapper[4901]: I0202 11:50:52.791414 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-4lrf6_26f249f1-8c84-49ab-b583-948e30dc04f3/nmstate-metrics/0.log" Feb 02 11:50:52 crc kubenswrapper[4901]: I0202 11:50:52.942810 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-hqzcm_206aa20d-3302-4ae4-a457-6e89f29a4bf7/nmstate-operator/0.log" Feb 02 11:50:53 crc kubenswrapper[4901]: I0202 11:50:53.044864 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-7w8p9_480309f7-ab75-461f-a7bf-075ad02326ca/nmstate-webhook/0.log" Feb 02 11:50:58 crc kubenswrapper[4901]: I0202 11:50:58.056440 4901 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8tr9m" podUID="cc04e691-4309-465a-b7a8-033284141bec" containerName="registry-server" probeResult="failure" output=< Feb 02 11:50:58 crc kubenswrapper[4901]: timeout: failed to connect service ":50051" within 1s Feb 02 11:50:58 crc kubenswrapper[4901]: > Feb 02 11:51:07 crc kubenswrapper[4901]: I0202 11:51:07.056611 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8tr9m" Feb 02 11:51:07 crc kubenswrapper[4901]: I0202 11:51:07.118689 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-8tr9m" Feb 02 11:51:07 crc kubenswrapper[4901]: I0202 11:51:07.550106 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8tr9m"] Feb 02 11:51:07 crc kubenswrapper[4901]: I0202 11:51:07.837702 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:51:07 crc kubenswrapper[4901]: I0202 11:51:07.837788 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:51:07 crc kubenswrapper[4901]: I0202 11:51:07.837870 4901 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" Feb 02 11:51:07 crc kubenswrapper[4901]: I0202 11:51:07.838830 4901 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ed700bff0964ab4e3cffb061f658a435387232649d99d0524069bea25950bd76"} pod="openshift-machine-config-operator/machine-config-daemon-f29d8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 11:51:07 crc kubenswrapper[4901]: I0202 11:51:07.838910 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" containerID="cri-o://ed700bff0964ab4e3cffb061f658a435387232649d99d0524069bea25950bd76" gracePeriod=600 Feb 02 11:51:08 crc kubenswrapper[4901]: I0202 11:51:08.313909 4901 generic.go:334] "Generic (PLEG): container finished" podID="756c113d-5d5e-424e-bdf5-494b7774def6" containerID="ed700bff0964ab4e3cffb061f658a435387232649d99d0524069bea25950bd76" exitCode=0 Feb 02 11:51:08 crc kubenswrapper[4901]: I0202 11:51:08.313973 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" event={"ID":"756c113d-5d5e-424e-bdf5-494b7774def6","Type":"ContainerDied","Data":"ed700bff0964ab4e3cffb061f658a435387232649d99d0524069bea25950bd76"} Feb 02 11:51:08 crc kubenswrapper[4901]: I0202 11:51:08.314497 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" event={"ID":"756c113d-5d5e-424e-bdf5-494b7774def6","Type":"ContainerStarted","Data":"362b45ff1fa8ca1b90b1f169672891d22665c51e33cf128a41e8765df1974acf"} Feb 02 11:51:08 crc kubenswrapper[4901]: I0202 11:51:08.314613 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8tr9m" podUID="cc04e691-4309-465a-b7a8-033284141bec" containerName="registry-server" containerID="cri-o://b08a1911077d8db316f65e7aa5ac89aefb5e6b99ae63af77cf499dc90dc744c5" gracePeriod=2 Feb 02 11:51:08 crc kubenswrapper[4901]: I0202 11:51:08.314554 4901 scope.go:117] "RemoveContainer" containerID="5b8ce7c16052cec2485b442e2fe3c9307bb04efecfe3e58c9fae3418e6341435" Feb 02 11:51:08 crc kubenswrapper[4901]: I0202 11:51:08.870184 4901 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8tr9m" Feb 02 11:51:09 crc kubenswrapper[4901]: I0202 11:51:09.001879 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-288s7\" (UniqueName: \"kubernetes.io/projected/cc04e691-4309-465a-b7a8-033284141bec-kube-api-access-288s7\") pod \"cc04e691-4309-465a-b7a8-033284141bec\" (UID: \"cc04e691-4309-465a-b7a8-033284141bec\") " Feb 02 11:51:09 crc kubenswrapper[4901]: I0202 11:51:09.002437 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc04e691-4309-465a-b7a8-033284141bec-utilities\") pod \"cc04e691-4309-465a-b7a8-033284141bec\" (UID: \"cc04e691-4309-465a-b7a8-033284141bec\") " Feb 02 11:51:09 crc kubenswrapper[4901]: I0202 11:51:09.002620 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc04e691-4309-465a-b7a8-033284141bec-catalog-content\") pod \"cc04e691-4309-465a-b7a8-033284141bec\" (UID: \"cc04e691-4309-465a-b7a8-033284141bec\") " Feb 02 11:51:09 crc kubenswrapper[4901]: I0202 11:51:09.003516 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc04e691-4309-465a-b7a8-033284141bec-utilities" (OuterVolumeSpecName: "utilities") pod "cc04e691-4309-465a-b7a8-033284141bec" (UID: "cc04e691-4309-465a-b7a8-033284141bec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:51:09 crc kubenswrapper[4901]: I0202 11:51:09.018834 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc04e691-4309-465a-b7a8-033284141bec-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:51:09 crc kubenswrapper[4901]: I0202 11:51:09.021938 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc04e691-4309-465a-b7a8-033284141bec-kube-api-access-288s7" (OuterVolumeSpecName: "kube-api-access-288s7") pod "cc04e691-4309-465a-b7a8-033284141bec" (UID: "cc04e691-4309-465a-b7a8-033284141bec"). InnerVolumeSpecName "kube-api-access-288s7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:51:09 crc kubenswrapper[4901]: I0202 11:51:09.120483 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-288s7\" (UniqueName: \"kubernetes.io/projected/cc04e691-4309-465a-b7a8-033284141bec-kube-api-access-288s7\") on node \"crc\" DevicePath \"\"" Feb 02 11:51:09 crc kubenswrapper[4901]: I0202 11:51:09.143278 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc04e691-4309-465a-b7a8-033284141bec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc04e691-4309-465a-b7a8-033284141bec" (UID: "cc04e691-4309-465a-b7a8-033284141bec"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:51:09 crc kubenswrapper[4901]: I0202 11:51:09.222889 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc04e691-4309-465a-b7a8-033284141bec-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:51:09 crc kubenswrapper[4901]: I0202 11:51:09.335456 4901 generic.go:334] "Generic (PLEG): container finished" podID="cc04e691-4309-465a-b7a8-033284141bec" containerID="b08a1911077d8db316f65e7aa5ac89aefb5e6b99ae63af77cf499dc90dc744c5" exitCode=0 Feb 02 11:51:09 crc kubenswrapper[4901]: I0202 11:51:09.335539 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8tr9m" event={"ID":"cc04e691-4309-465a-b7a8-033284141bec","Type":"ContainerDied","Data":"b08a1911077d8db316f65e7aa5ac89aefb5e6b99ae63af77cf499dc90dc744c5"} Feb 02 11:51:09 crc kubenswrapper[4901]: I0202 11:51:09.335647 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8tr9m" event={"ID":"cc04e691-4309-465a-b7a8-033284141bec","Type":"ContainerDied","Data":"aa8191f06b8baa60ab3af6c89af213603cc682b8c980d5a393d458e8a2a23e45"} Feb 02 11:51:09 crc kubenswrapper[4901]: I0202 11:51:09.335656 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8tr9m" Feb 02 11:51:09 crc kubenswrapper[4901]: I0202 11:51:09.335702 4901 scope.go:117] "RemoveContainer" containerID="b08a1911077d8db316f65e7aa5ac89aefb5e6b99ae63af77cf499dc90dc744c5" Feb 02 11:51:09 crc kubenswrapper[4901]: I0202 11:51:09.365716 4901 scope.go:117] "RemoveContainer" containerID="07b1d5af1c0d3ebb8fe36c0efd8a278681bd43bef9a19358e61a06f68c572199" Feb 02 11:51:09 crc kubenswrapper[4901]: I0202 11:51:09.391779 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8tr9m"] Feb 02 11:51:09 crc kubenswrapper[4901]: I0202 11:51:09.405257 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8tr9m"] Feb 02 11:51:09 crc kubenswrapper[4901]: I0202 11:51:09.410804 4901 scope.go:117] "RemoveContainer" containerID="da8504b3a0c1c0b1b2b4d06ced6ed180539003dca36e701dc0aea61c20525be5" Feb 02 11:51:09 crc kubenswrapper[4901]: I0202 11:51:09.452272 4901 scope.go:117] "RemoveContainer" containerID="b08a1911077d8db316f65e7aa5ac89aefb5e6b99ae63af77cf499dc90dc744c5" Feb 02 11:51:09 crc kubenswrapper[4901]: E0202 11:51:09.452972 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b08a1911077d8db316f65e7aa5ac89aefb5e6b99ae63af77cf499dc90dc744c5\": container with ID starting with b08a1911077d8db316f65e7aa5ac89aefb5e6b99ae63af77cf499dc90dc744c5 not found: ID does not exist" containerID="b08a1911077d8db316f65e7aa5ac89aefb5e6b99ae63af77cf499dc90dc744c5" Feb 02 11:51:09 crc kubenswrapper[4901]: I0202 11:51:09.453036 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b08a1911077d8db316f65e7aa5ac89aefb5e6b99ae63af77cf499dc90dc744c5"} err="failed to get container status \"b08a1911077d8db316f65e7aa5ac89aefb5e6b99ae63af77cf499dc90dc744c5\": rpc error: code = NotFound desc = could not find container \"b08a1911077d8db316f65e7aa5ac89aefb5e6b99ae63af77cf499dc90dc744c5\": container with ID starting with b08a1911077d8db316f65e7aa5ac89aefb5e6b99ae63af77cf499dc90dc744c5 not found: ID does not exist" Feb 02 11:51:09 crc 
Feb 02 11:51:09 crc kubenswrapper[4901]: I0202 11:51:09.453071 4901 scope.go:117] "RemoveContainer" containerID="07b1d5af1c0d3ebb8fe36c0efd8a278681bd43bef9a19358e61a06f68c572199"
Feb 02 11:51:09 crc kubenswrapper[4901]: E0202 11:51:09.453491 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07b1d5af1c0d3ebb8fe36c0efd8a278681bd43bef9a19358e61a06f68c572199\": container with ID starting with 07b1d5af1c0d3ebb8fe36c0efd8a278681bd43bef9a19358e61a06f68c572199 not found: ID does not exist" containerID="07b1d5af1c0d3ebb8fe36c0efd8a278681bd43bef9a19358e61a06f68c572199"
Feb 02 11:51:09 crc kubenswrapper[4901]: I0202 11:51:09.453585 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07b1d5af1c0d3ebb8fe36c0efd8a278681bd43bef9a19358e61a06f68c572199"} err="failed to get container status \"07b1d5af1c0d3ebb8fe36c0efd8a278681bd43bef9a19358e61a06f68c572199\": rpc error: code = NotFound desc = could not find container \"07b1d5af1c0d3ebb8fe36c0efd8a278681bd43bef9a19358e61a06f68c572199\": container with ID starting with 07b1d5af1c0d3ebb8fe36c0efd8a278681bd43bef9a19358e61a06f68c572199 not found: ID does not exist"
Feb 02 11:51:09 crc kubenswrapper[4901]: I0202 11:51:09.453627 4901 scope.go:117] "RemoveContainer" containerID="da8504b3a0c1c0b1b2b4d06ced6ed180539003dca36e701dc0aea61c20525be5"
Feb 02 11:51:09 crc kubenswrapper[4901]: E0202 11:51:09.454064 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da8504b3a0c1c0b1b2b4d06ced6ed180539003dca36e701dc0aea61c20525be5\": container with ID starting with da8504b3a0c1c0b1b2b4d06ced6ed180539003dca36e701dc0aea61c20525be5 not found: ID does not exist" containerID="da8504b3a0c1c0b1b2b4d06ced6ed180539003dca36e701dc0aea61c20525be5"
Feb 02 11:51:09 crc kubenswrapper[4901]: I0202 11:51:09.454100 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da8504b3a0c1c0b1b2b4d06ced6ed180539003dca36e701dc0aea61c20525be5"} err="failed to get container status \"da8504b3a0c1c0b1b2b4d06ced6ed180539003dca36e701dc0aea61c20525be5\": rpc error: code = NotFound desc = could not find container \"da8504b3a0c1c0b1b2b4d06ced6ed180539003dca36e701dc0aea61c20525be5\": container with ID starting with da8504b3a0c1c0b1b2b4d06ced6ed180539003dca36e701dc0aea61c20525be5 not found: ID does not exist"
Feb 02 11:51:09 crc kubenswrapper[4901]: I0202 11:51:09.722386 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc04e691-4309-465a-b7a8-033284141bec" path="/var/lib/kubelet/pods/cc04e691-4309-465a-b7a8-033284141bec/volumes"
Feb 02 11:51:12 crc kubenswrapper[4901]: I0202 11:51:12.982156 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-58vm8_624ec5ba-9a1f-4192-a537-c0cc6c8d5c24/prometheus-operator/0.log"
Feb 02 11:51:13 crc kubenswrapper[4901]: I0202 11:51:13.013981 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-57f656956-dt5d8_78d5a33d-6a61-4e38-8b5c-9a8bb8436628/prometheus-operator-admission-webhook/0.log"
Feb 02 11:51:13 crc kubenswrapper[4901]: I0202 11:51:13.283859 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-57f656956-fhbnb_650a29c1-9b38-4b85-9104-d56f42d0d2d9/prometheus-operator-admission-webhook/0.log"
Feb 02 11:51:13 crc kubenswrapper[4901]: I0202 11:51:13.302985 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-7f2km_2d9b21bd-d1dd-4c42-974f-9aa80352637f/operator/0.log"
Feb 02 11:51:13 crc kubenswrapper[4901]: I0202 11:51:13.522898 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-dqgl6_169c8594-4455-4fd1-9602-8dabcd5828de/perses-operator/0.log"
Feb 02 11:51:21 crc kubenswrapper[4901]: I0202 11:51:21.294378 4901 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2w59j"]
Feb 02 11:51:21 crc kubenswrapper[4901]: E0202 11:51:21.295680 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc04e691-4309-465a-b7a8-033284141bec" containerName="registry-server"
Feb 02 11:51:21 crc kubenswrapper[4901]: I0202 11:51:21.295696 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc04e691-4309-465a-b7a8-033284141bec" containerName="registry-server"
Feb 02 11:51:21 crc kubenswrapper[4901]: E0202 11:51:21.295706 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc04e691-4309-465a-b7a8-033284141bec" containerName="extract-utilities"
Feb 02 11:51:21 crc kubenswrapper[4901]: I0202 11:51:21.295714 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc04e691-4309-465a-b7a8-033284141bec" containerName="extract-utilities"
Feb 02 11:51:21 crc kubenswrapper[4901]: E0202 11:51:21.295736 4901 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc04e691-4309-465a-b7a8-033284141bec" containerName="extract-content"
Feb 02 11:51:21 crc kubenswrapper[4901]: I0202 11:51:21.295742 4901 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc04e691-4309-465a-b7a8-033284141bec" containerName="extract-content"
Feb 02 11:51:21 crc kubenswrapper[4901]: I0202 11:51:21.295978 4901 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc04e691-4309-465a-b7a8-033284141bec" containerName="registry-server"
Feb 02 11:51:21 crc kubenswrapper[4901]: I0202 11:51:21.299225 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2w59j"
Need to start a new one" pod="openshift-marketplace/community-operators-2w59j" Feb 02 11:51:21 crc kubenswrapper[4901]: I0202 11:51:21.318323 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2w59j"] Feb 02 11:51:21 crc kubenswrapper[4901]: I0202 11:51:21.447015 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0be299c-d141-479f-af36-a1f6052a0e85-catalog-content\") pod \"community-operators-2w59j\" (UID: \"f0be299c-d141-479f-af36-a1f6052a0e85\") " pod="openshift-marketplace/community-operators-2w59j" Feb 02 11:51:21 crc kubenswrapper[4901]: I0202 11:51:21.447099 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0be299c-d141-479f-af36-a1f6052a0e85-utilities\") pod \"community-operators-2w59j\" (UID: \"f0be299c-d141-479f-af36-a1f6052a0e85\") " pod="openshift-marketplace/community-operators-2w59j" Feb 02 11:51:21 crc kubenswrapper[4901]: I0202 11:51:21.447792 4901 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfthg\" (UniqueName: \"kubernetes.io/projected/f0be299c-d141-479f-af36-a1f6052a0e85-kube-api-access-vfthg\") pod \"community-operators-2w59j\" (UID: \"f0be299c-d141-479f-af36-a1f6052a0e85\") " pod="openshift-marketplace/community-operators-2w59j" Feb 02 11:51:21 crc kubenswrapper[4901]: I0202 11:51:21.549626 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0be299c-d141-479f-af36-a1f6052a0e85-utilities\") pod \"community-operators-2w59j\" (UID: \"f0be299c-d141-479f-af36-a1f6052a0e85\") " pod="openshift-marketplace/community-operators-2w59j" Feb 02 11:51:21 crc kubenswrapper[4901]: I0202 11:51:21.549873 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfthg\" (UniqueName: \"kubernetes.io/projected/f0be299c-d141-479f-af36-a1f6052a0e85-kube-api-access-vfthg\") pod \"community-operators-2w59j\" (UID: \"f0be299c-d141-479f-af36-a1f6052a0e85\") " pod="openshift-marketplace/community-operators-2w59j" Feb 02 11:51:21 crc kubenswrapper[4901]: I0202 11:51:21.549911 4901 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0be299c-d141-479f-af36-a1f6052a0e85-catalog-content\") pod \"community-operators-2w59j\" (UID: \"f0be299c-d141-479f-af36-a1f6052a0e85\") " pod="openshift-marketplace/community-operators-2w59j" Feb 02 11:51:21 crc kubenswrapper[4901]: I0202 11:51:21.550367 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0be299c-d141-479f-af36-a1f6052a0e85-utilities\") pod \"community-operators-2w59j\" (UID: \"f0be299c-d141-479f-af36-a1f6052a0e85\") " pod="openshift-marketplace/community-operators-2w59j" Feb 02 11:51:21 crc kubenswrapper[4901]: I0202 11:51:21.550422 4901 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0be299c-d141-479f-af36-a1f6052a0e85-catalog-content\") pod \"community-operators-2w59j\" (UID: \"f0be299c-d141-479f-af36-a1f6052a0e85\") " pod="openshift-marketplace/community-operators-2w59j" Feb 02 11:51:21 crc kubenswrapper[4901]: I0202 11:51:21.578768 4901 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-vfthg\" (UniqueName: \"kubernetes.io/projected/f0be299c-d141-479f-af36-a1f6052a0e85-kube-api-access-vfthg\") pod \"community-operators-2w59j\" (UID: \"f0be299c-d141-479f-af36-a1f6052a0e85\") " pod="openshift-marketplace/community-operators-2w59j" Feb 02 11:51:21 crc kubenswrapper[4901]: I0202 11:51:21.625626 4901 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2w59j" Feb 02 11:51:22 crc kubenswrapper[4901]: I0202 11:51:22.356088 4901 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2w59j"] Feb 02 11:51:22 crc kubenswrapper[4901]: I0202 11:51:22.473528 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2w59j" event={"ID":"f0be299c-d141-479f-af36-a1f6052a0e85","Type":"ContainerStarted","Data":"3c7f021481c707ddaff4639774328b4488e21497c7e2730d5e1ba1c535b2f14e"} Feb 02 11:51:23 crc kubenswrapper[4901]: I0202 11:51:23.509710 4901 generic.go:334] "Generic (PLEG): container finished" podID="f0be299c-d141-479f-af36-a1f6052a0e85" containerID="535158342a1f86ac27d0e14aee71a23c592053688bd040a1e26e8636882daa75" exitCode=0 Feb 02 11:51:23 crc kubenswrapper[4901]: I0202 11:51:23.510449 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2w59j" event={"ID":"f0be299c-d141-479f-af36-a1f6052a0e85","Type":"ContainerDied","Data":"535158342a1f86ac27d0e14aee71a23c592053688bd040a1e26e8636882daa75"} Feb 02 11:51:24 crc kubenswrapper[4901]: I0202 11:51:24.523979 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2w59j" event={"ID":"f0be299c-d141-479f-af36-a1f6052a0e85","Type":"ContainerStarted","Data":"da512add93f595e8f1f8ea7701e4b557e3f33ea2ee1b2ebe3d9dca4016a189bd"} Feb 02 11:51:26 crc kubenswrapper[4901]: I0202 11:51:26.548783 4901 generic.go:334] "Generic (PLEG): container finished" podID="f0be299c-d141-479f-af36-a1f6052a0e85" containerID="da512add93f595e8f1f8ea7701e4b557e3f33ea2ee1b2ebe3d9dca4016a189bd" exitCode=0 Feb 02 11:51:26 crc kubenswrapper[4901]: I0202 11:51:26.549582 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2w59j" event={"ID":"f0be299c-d141-479f-af36-a1f6052a0e85","Type":"ContainerDied","Data":"da512add93f595e8f1f8ea7701e4b557e3f33ea2ee1b2ebe3d9dca4016a189bd"} Feb 02 11:51:27 crc kubenswrapper[4901]: I0202 11:51:27.567942 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2w59j" event={"ID":"f0be299c-d141-479f-af36-a1f6052a0e85","Type":"ContainerStarted","Data":"7afe7043188e76a156a6a8825222d6d3962bda7689ca3b88b2c2e0c73ca08487"} Feb 02 11:51:27 crc kubenswrapper[4901]: I0202 11:51:27.591208 4901 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2w59j" podStartSLOduration=3.041257291 podStartE2EDuration="6.591185976s" podCreationTimestamp="2026-02-02 11:51:21 +0000 UTC" firstStartedPulling="2026-02-02 11:51:23.518925456 +0000 UTC m=+4370.537265552" lastFinishedPulling="2026-02-02 11:51:27.068854141 +0000 UTC m=+4374.087194237" observedRunningTime="2026-02-02 11:51:27.589878304 +0000 UTC m=+4374.608218410" watchObservedRunningTime="2026-02-02 11:51:27.591185976 +0000 UTC m=+4374.609526072" Feb 02 11:51:31 crc kubenswrapper[4901]: I0202 11:51:31.626784 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-2w59j" Feb 02 11:51:31 crc kubenswrapper[4901]: I0202 11:51:31.627985 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2w59j" Feb 02 11:51:31 crc kubenswrapper[4901]: I0202 11:51:31.699303 4901 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2w59j" Feb 02 11:51:32 crc kubenswrapper[4901]: I0202 11:51:32.120499 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-5ds9w_1c6a7816-6119-496c-9298-fb8121c9e38f/kube-rbac-proxy/0.log" Feb 02 11:51:32 crc kubenswrapper[4901]: I0202 11:51:32.190888 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-5ds9w_1c6a7816-6119-496c-9298-fb8121c9e38f/controller/0.log" Feb 02 11:51:32 crc kubenswrapper[4901]: I0202 11:51:32.505040 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5c4zt_274b5333-5608-463a-a844-de0f51548386/cp-frr-files/0.log" Feb 02 11:51:32 crc kubenswrapper[4901]: I0202 11:51:32.648303 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5c4zt_274b5333-5608-463a-a844-de0f51548386/cp-reloader/0.log" Feb 02 11:51:32 crc kubenswrapper[4901]: I0202 11:51:32.653801 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5c4zt_274b5333-5608-463a-a844-de0f51548386/cp-frr-files/0.log" Feb 02 11:51:32 crc kubenswrapper[4901]: I0202 11:51:32.692909 4901 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2w59j" Feb 02 11:51:32 crc kubenswrapper[4901]: I0202 11:51:32.729212 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5c4zt_274b5333-5608-463a-a844-de0f51548386/cp-metrics/0.log" Feb 02 11:51:32 crc kubenswrapper[4901]: I0202 11:51:32.769297 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2w59j"] Feb 02 11:51:32 crc kubenswrapper[4901]: I0202 11:51:32.793075 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5c4zt_274b5333-5608-463a-a844-de0f51548386/cp-reloader/0.log" Feb 02 11:51:33 crc kubenswrapper[4901]: I0202 11:51:33.033241 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5c4zt_274b5333-5608-463a-a844-de0f51548386/cp-frr-files/0.log" Feb 02 11:51:33 crc kubenswrapper[4901]: I0202 11:51:33.036999 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5c4zt_274b5333-5608-463a-a844-de0f51548386/cp-metrics/0.log" Feb 02 11:51:33 crc kubenswrapper[4901]: I0202 11:51:33.087252 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5c4zt_274b5333-5608-463a-a844-de0f51548386/cp-reloader/0.log" Feb 02 11:51:33 crc kubenswrapper[4901]: I0202 11:51:33.088191 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5c4zt_274b5333-5608-463a-a844-de0f51548386/cp-metrics/0.log" Feb 02 11:51:33 crc kubenswrapper[4901]: I0202 11:51:33.352870 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5c4zt_274b5333-5608-463a-a844-de0f51548386/cp-frr-files/0.log" Feb 02 11:51:33 crc kubenswrapper[4901]: I0202 11:51:33.357963 4901 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-5c4zt_274b5333-5608-463a-a844-de0f51548386/cp-reloader/0.log" Feb 02 11:51:33 crc kubenswrapper[4901]: I0202 11:51:33.424405 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5c4zt_274b5333-5608-463a-a844-de0f51548386/controller/0.log" Feb 02 11:51:33 crc kubenswrapper[4901]: I0202 11:51:33.427786 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5c4zt_274b5333-5608-463a-a844-de0f51548386/cp-metrics/0.log" Feb 02 11:51:33 crc kubenswrapper[4901]: I0202 11:51:33.623547 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5c4zt_274b5333-5608-463a-a844-de0f51548386/frr-metrics/0.log" Feb 02 11:51:34 crc kubenswrapper[4901]: I0202 11:51:34.068354 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5c4zt_274b5333-5608-463a-a844-de0f51548386/kube-rbac-proxy/0.log" Feb 02 11:51:34 crc kubenswrapper[4901]: I0202 11:51:34.078185 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5c4zt_274b5333-5608-463a-a844-de0f51548386/kube-rbac-proxy-frr/0.log" Feb 02 11:51:34 crc kubenswrapper[4901]: I0202 11:51:34.251837 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5c4zt_274b5333-5608-463a-a844-de0f51548386/reloader/0.log" Feb 02 11:51:34 crc kubenswrapper[4901]: I0202 11:51:34.338653 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-rr5bw_7389b305-4824-4f3b-820e-9466214ea9b1/frr-k8s-webhook-server/0.log" Feb 02 11:51:34 crc kubenswrapper[4901]: I0202 11:51:34.641896 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2w59j" podUID="f0be299c-d141-479f-af36-a1f6052a0e85" containerName="registry-server" containerID="cri-o://7afe7043188e76a156a6a8825222d6d3962bda7689ca3b88b2c2e0c73ca08487" gracePeriod=2 Feb 02 11:51:34 crc kubenswrapper[4901]: I0202 11:51:34.701492 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6fc654b48b-g84wt_9dcb06dd-41b2-459b-9fe0-b55fc062f05c/manager/0.log" Feb 02 11:51:34 crc kubenswrapper[4901]: I0202 11:51:34.937663 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-8c4f569b7-2bbgt_a5d5e21a-619a-4ae6-b36e-bb61a555a29b/webhook-server/0.log" Feb 02 11:51:35 crc kubenswrapper[4901]: I0202 11:51:35.033341 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-dppcw_5c595fcf-9396-4426-bff4-84cd6eda9dc5/kube-rbac-proxy/0.log" Feb 02 11:51:35 crc kubenswrapper[4901]: I0202 11:51:35.235092 4901 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2w59j" Feb 02 11:51:35 crc kubenswrapper[4901]: I0202 11:51:35.402462 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0be299c-d141-479f-af36-a1f6052a0e85-catalog-content\") pod \"f0be299c-d141-479f-af36-a1f6052a0e85\" (UID: \"f0be299c-d141-479f-af36-a1f6052a0e85\") " Feb 02 11:51:35 crc kubenswrapper[4901]: I0202 11:51:35.403090 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfthg\" (UniqueName: \"kubernetes.io/projected/f0be299c-d141-479f-af36-a1f6052a0e85-kube-api-access-vfthg\") pod \"f0be299c-d141-479f-af36-a1f6052a0e85\" (UID: \"f0be299c-d141-479f-af36-a1f6052a0e85\") " Feb 02 11:51:35 crc kubenswrapper[4901]: I0202 11:51:35.403151 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0be299c-d141-479f-af36-a1f6052a0e85-utilities\") pod \"f0be299c-d141-479f-af36-a1f6052a0e85\" (UID: \"f0be299c-d141-479f-af36-a1f6052a0e85\") " Feb 02 11:51:35 crc kubenswrapper[4901]: I0202 11:51:35.405546 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0be299c-d141-479f-af36-a1f6052a0e85-utilities" (OuterVolumeSpecName: "utilities") pod "f0be299c-d141-479f-af36-a1f6052a0e85" (UID: "f0be299c-d141-479f-af36-a1f6052a0e85"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:51:35 crc kubenswrapper[4901]: I0202 11:51:35.402660 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5c4zt_274b5333-5608-463a-a844-de0f51548386/frr/0.log" Feb 02 11:51:35 crc kubenswrapper[4901]: I0202 11:51:35.433803 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0be299c-d141-479f-af36-a1f6052a0e85-kube-api-access-vfthg" (OuterVolumeSpecName: "kube-api-access-vfthg") pod "f0be299c-d141-479f-af36-a1f6052a0e85" (UID: "f0be299c-d141-479f-af36-a1f6052a0e85"). InnerVolumeSpecName "kube-api-access-vfthg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:51:35 crc kubenswrapper[4901]: I0202 11:51:35.505659 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfthg\" (UniqueName: \"kubernetes.io/projected/f0be299c-d141-479f-af36-a1f6052a0e85-kube-api-access-vfthg\") on node \"crc\" DevicePath \"\"" Feb 02 11:51:35 crc kubenswrapper[4901]: I0202 11:51:35.505955 4901 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0be299c-d141-479f-af36-a1f6052a0e85-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:51:35 crc kubenswrapper[4901]: I0202 11:51:35.510195 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0be299c-d141-479f-af36-a1f6052a0e85-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f0be299c-d141-479f-af36-a1f6052a0e85" (UID: "f0be299c-d141-479f-af36-a1f6052a0e85"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:51:35 crc kubenswrapper[4901]: I0202 11:51:35.608150 4901 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0be299c-d141-479f-af36-a1f6052a0e85-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:51:35 crc kubenswrapper[4901]: I0202 11:51:35.697257 4901 generic.go:334] "Generic (PLEG): container finished" podID="f0be299c-d141-479f-af36-a1f6052a0e85" containerID="7afe7043188e76a156a6a8825222d6d3962bda7689ca3b88b2c2e0c73ca08487" exitCode=0 Feb 02 11:51:35 crc kubenswrapper[4901]: I0202 11:51:35.697443 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2w59j" Feb 02 11:51:35 crc kubenswrapper[4901]: I0202 11:51:35.715477 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2w59j" event={"ID":"f0be299c-d141-479f-af36-a1f6052a0e85","Type":"ContainerDied","Data":"7afe7043188e76a156a6a8825222d6d3962bda7689ca3b88b2c2e0c73ca08487"} Feb 02 11:51:35 crc kubenswrapper[4901]: I0202 11:51:35.715557 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2w59j" event={"ID":"f0be299c-d141-479f-af36-a1f6052a0e85","Type":"ContainerDied","Data":"3c7f021481c707ddaff4639774328b4488e21497c7e2730d5e1ba1c535b2f14e"} Feb 02 11:51:35 crc kubenswrapper[4901]: I0202 11:51:35.715621 4901 scope.go:117] "RemoveContainer" containerID="7afe7043188e76a156a6a8825222d6d3962bda7689ca3b88b2c2e0c73ca08487" Feb 02 11:51:35 crc kubenswrapper[4901]: I0202 11:51:35.786399 4901 scope.go:117] "RemoveContainer" containerID="da512add93f595e8f1f8ea7701e4b557e3f33ea2ee1b2ebe3d9dca4016a189bd" Feb 02 11:51:35 crc kubenswrapper[4901]: I0202 11:51:35.800754 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2w59j"] Feb 02 11:51:35 crc kubenswrapper[4901]: I0202 11:51:35.809562 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2w59j"] Feb 02 11:51:35 crc kubenswrapper[4901]: I0202 11:51:35.825460 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-dppcw_5c595fcf-9396-4426-bff4-84cd6eda9dc5/speaker/0.log" Feb 02 11:51:35 crc kubenswrapper[4901]: I0202 11:51:35.833832 4901 scope.go:117] "RemoveContainer" containerID="535158342a1f86ac27d0e14aee71a23c592053688bd040a1e26e8636882daa75" Feb 02 11:51:35 crc kubenswrapper[4901]: I0202 11:51:35.902286 4901 scope.go:117] "RemoveContainer" containerID="7afe7043188e76a156a6a8825222d6d3962bda7689ca3b88b2c2e0c73ca08487" Feb 02 11:51:35 crc kubenswrapper[4901]: E0202 11:51:35.902941 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7afe7043188e76a156a6a8825222d6d3962bda7689ca3b88b2c2e0c73ca08487\": container with ID starting with 7afe7043188e76a156a6a8825222d6d3962bda7689ca3b88b2c2e0c73ca08487 not found: ID does not exist" containerID="7afe7043188e76a156a6a8825222d6d3962bda7689ca3b88b2c2e0c73ca08487" Feb 02 11:51:35 crc kubenswrapper[4901]: I0202 11:51:35.902989 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7afe7043188e76a156a6a8825222d6d3962bda7689ca3b88b2c2e0c73ca08487"} err="failed to get container status \"7afe7043188e76a156a6a8825222d6d3962bda7689ca3b88b2c2e0c73ca08487\": rpc error: code = NotFound desc = could not find container 
\"7afe7043188e76a156a6a8825222d6d3962bda7689ca3b88b2c2e0c73ca08487\": container with ID starting with 7afe7043188e76a156a6a8825222d6d3962bda7689ca3b88b2c2e0c73ca08487 not found: ID does not exist" Feb 02 11:51:35 crc kubenswrapper[4901]: I0202 11:51:35.903021 4901 scope.go:117] "RemoveContainer" containerID="da512add93f595e8f1f8ea7701e4b557e3f33ea2ee1b2ebe3d9dca4016a189bd" Feb 02 11:51:35 crc kubenswrapper[4901]: E0202 11:51:35.903958 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da512add93f595e8f1f8ea7701e4b557e3f33ea2ee1b2ebe3d9dca4016a189bd\": container with ID starting with da512add93f595e8f1f8ea7701e4b557e3f33ea2ee1b2ebe3d9dca4016a189bd not found: ID does not exist" containerID="da512add93f595e8f1f8ea7701e4b557e3f33ea2ee1b2ebe3d9dca4016a189bd" Feb 02 11:51:35 crc kubenswrapper[4901]: I0202 11:51:35.903987 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da512add93f595e8f1f8ea7701e4b557e3f33ea2ee1b2ebe3d9dca4016a189bd"} err="failed to get container status \"da512add93f595e8f1f8ea7701e4b557e3f33ea2ee1b2ebe3d9dca4016a189bd\": rpc error: code = NotFound desc = could not find container \"da512add93f595e8f1f8ea7701e4b557e3f33ea2ee1b2ebe3d9dca4016a189bd\": container with ID starting with da512add93f595e8f1f8ea7701e4b557e3f33ea2ee1b2ebe3d9dca4016a189bd not found: ID does not exist" Feb 02 11:51:35 crc kubenswrapper[4901]: I0202 11:51:35.904008 4901 scope.go:117] "RemoveContainer" containerID="535158342a1f86ac27d0e14aee71a23c592053688bd040a1e26e8636882daa75" Feb 02 11:51:35 crc kubenswrapper[4901]: E0202 11:51:35.904333 4901 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"535158342a1f86ac27d0e14aee71a23c592053688bd040a1e26e8636882daa75\": container with ID starting with 535158342a1f86ac27d0e14aee71a23c592053688bd040a1e26e8636882daa75 not found: ID does not exist" containerID="535158342a1f86ac27d0e14aee71a23c592053688bd040a1e26e8636882daa75" Feb 02 11:51:35 crc kubenswrapper[4901]: I0202 11:51:35.904364 4901 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"535158342a1f86ac27d0e14aee71a23c592053688bd040a1e26e8636882daa75"} err="failed to get container status \"535158342a1f86ac27d0e14aee71a23c592053688bd040a1e26e8636882daa75\": rpc error: code = NotFound desc = could not find container \"535158342a1f86ac27d0e14aee71a23c592053688bd040a1e26e8636882daa75\": container with ID starting with 535158342a1f86ac27d0e14aee71a23c592053688bd040a1e26e8636882daa75 not found: ID does not exist" Feb 02 11:51:37 crc kubenswrapper[4901]: I0202 11:51:37.691314 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0be299c-d141-479f-af36-a1f6052a0e85" path="/var/lib/kubelet/pods/f0be299c-d141-479f-af36-a1f6052a0e85/volumes" Feb 02 11:51:53 crc kubenswrapper[4901]: I0202 11:51:53.412852 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc7w9pj_2b710f66-8d54-4133-ae8f-9a05af592ada/util/0.log" Feb 02 11:51:54 crc kubenswrapper[4901]: I0202 11:51:54.338539 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc7w9pj_2b710f66-8d54-4133-ae8f-9a05af592ada/pull/0.log" Feb 02 11:51:54 crc kubenswrapper[4901]: I0202 11:51:54.351975 4901 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc7w9pj_2b710f66-8d54-4133-ae8f-9a05af592ada/util/0.log" Feb 02 11:51:54 crc kubenswrapper[4901]: I0202 11:51:54.376171 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc7w9pj_2b710f66-8d54-4133-ae8f-9a05af592ada/pull/0.log" Feb 02 11:51:54 crc kubenswrapper[4901]: I0202 11:51:54.589218 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc7w9pj_2b710f66-8d54-4133-ae8f-9a05af592ada/extract/0.log" Feb 02 11:51:54 crc kubenswrapper[4901]: I0202 11:51:54.616151 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc7w9pj_2b710f66-8d54-4133-ae8f-9a05af592ada/pull/0.log" Feb 02 11:51:54 crc kubenswrapper[4901]: I0202 11:51:54.657545 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc7w9pj_2b710f66-8d54-4133-ae8f-9a05af592ada/util/0.log" Feb 02 11:51:54 crc kubenswrapper[4901]: I0202 11:51:54.819845 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jtlrf_635aac29-bf3a-4517-bde5-4dd65f084a22/util/0.log" Feb 02 11:51:55 crc kubenswrapper[4901]: I0202 11:51:55.037559 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jtlrf_635aac29-bf3a-4517-bde5-4dd65f084a22/util/0.log" Feb 02 11:51:55 crc kubenswrapper[4901]: I0202 11:51:55.058670 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jtlrf_635aac29-bf3a-4517-bde5-4dd65f084a22/pull/0.log" Feb 02 11:51:55 crc kubenswrapper[4901]: I0202 11:51:55.106399 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jtlrf_635aac29-bf3a-4517-bde5-4dd65f084a22/pull/0.log" Feb 02 11:51:55 crc kubenswrapper[4901]: I0202 11:51:55.276689 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jtlrf_635aac29-bf3a-4517-bde5-4dd65f084a22/pull/0.log" Feb 02 11:51:55 crc kubenswrapper[4901]: I0202 11:51:55.309092 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jtlrf_635aac29-bf3a-4517-bde5-4dd65f084a22/extract/0.log" Feb 02 11:51:55 crc kubenswrapper[4901]: I0202 11:51:55.320053 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713jtlrf_635aac29-bf3a-4517-bde5-4dd65f084a22/util/0.log" Feb 02 11:51:55 crc kubenswrapper[4901]: I0202 11:51:55.464585 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rbjw8_01f21b27-c12d-47f6-bf3c-9976e9ef968e/util/0.log" Feb 02 11:51:55 crc kubenswrapper[4901]: I0202 11:51:55.724972 4901 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rbjw8_01f21b27-c12d-47f6-bf3c-9976e9ef968e/util/0.log" Feb 02 11:51:55 crc kubenswrapper[4901]: I0202 11:51:55.737695 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rbjw8_01f21b27-c12d-47f6-bf3c-9976e9ef968e/pull/0.log" Feb 02 11:51:55 crc kubenswrapper[4901]: I0202 11:51:55.803896 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rbjw8_01f21b27-c12d-47f6-bf3c-9976e9ef968e/pull/0.log" Feb 02 11:51:56 crc kubenswrapper[4901]: I0202 11:51:56.012354 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rbjw8_01f21b27-c12d-47f6-bf3c-9976e9ef968e/pull/0.log" Feb 02 11:51:56 crc kubenswrapper[4901]: I0202 11:51:56.053770 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rbjw8_01f21b27-c12d-47f6-bf3c-9976e9ef968e/extract/0.log" Feb 02 11:51:56 crc kubenswrapper[4901]: I0202 11:51:56.062715 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08rbjw8_01f21b27-c12d-47f6-bf3c-9976e9ef968e/util/0.log" Feb 02 11:51:56 crc kubenswrapper[4901]: I0202 11:51:56.221483 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g7pnc_86b5a97c-571b-418c-af97-26b04dec66ac/extract-utilities/0.log" Feb 02 11:51:56 crc kubenswrapper[4901]: I0202 11:51:56.464709 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g7pnc_86b5a97c-571b-418c-af97-26b04dec66ac/extract-utilities/0.log" Feb 02 11:51:56 crc kubenswrapper[4901]: I0202 11:51:56.504352 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g7pnc_86b5a97c-571b-418c-af97-26b04dec66ac/extract-content/0.log" Feb 02 11:51:56 crc kubenswrapper[4901]: I0202 11:51:56.541185 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g7pnc_86b5a97c-571b-418c-af97-26b04dec66ac/extract-content/0.log" Feb 02 11:51:56 crc kubenswrapper[4901]: I0202 11:51:56.818438 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g7pnc_86b5a97c-571b-418c-af97-26b04dec66ac/extract-content/0.log" Feb 02 11:51:56 crc kubenswrapper[4901]: I0202 11:51:56.819528 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g7pnc_86b5a97c-571b-418c-af97-26b04dec66ac/extract-utilities/0.log" Feb 02 11:51:57 crc kubenswrapper[4901]: I0202 11:51:57.417955 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nmgnj_4d1986be-5828-4d64-9da9-ffe87c0eb7ff/extract-utilities/0.log" Feb 02 11:51:57 crc kubenswrapper[4901]: I0202 11:51:57.582671 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g7pnc_86b5a97c-571b-418c-af97-26b04dec66ac/registry-server/0.log" Feb 02 11:51:57 crc kubenswrapper[4901]: I0202 11:51:57.616652 4901 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-nmgnj_4d1986be-5828-4d64-9da9-ffe87c0eb7ff/extract-utilities/0.log" Feb 02 11:51:57 crc kubenswrapper[4901]: I0202 11:51:57.728409 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nmgnj_4d1986be-5828-4d64-9da9-ffe87c0eb7ff/extract-content/0.log" Feb 02 11:51:57 crc kubenswrapper[4901]: I0202 11:51:57.749893 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nmgnj_4d1986be-5828-4d64-9da9-ffe87c0eb7ff/extract-content/0.log" Feb 02 11:51:57 crc kubenswrapper[4901]: I0202 11:51:57.942663 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nmgnj_4d1986be-5828-4d64-9da9-ffe87c0eb7ff/extract-utilities/0.log" Feb 02 11:51:57 crc kubenswrapper[4901]: I0202 11:51:57.987002 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nmgnj_4d1986be-5828-4d64-9da9-ffe87c0eb7ff/extract-content/0.log" Feb 02 11:51:58 crc kubenswrapper[4901]: I0202 11:51:58.048779 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-b9c2g_eaf0b215-2c7f-4cf7-9682-983acfa5ccb3/marketplace-operator/0.log" Feb 02 11:51:58 crc kubenswrapper[4901]: I0202 11:51:58.189864 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j4fbj_a3aac041-08a3-4d4e-aada-69ada2387b41/extract-utilities/0.log" Feb 02 11:51:58 crc kubenswrapper[4901]: I0202 11:51:58.522947 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j4fbj_a3aac041-08a3-4d4e-aada-69ada2387b41/extract-utilities/0.log" Feb 02 11:51:58 crc kubenswrapper[4901]: I0202 11:51:58.611816 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j4fbj_a3aac041-08a3-4d4e-aada-69ada2387b41/extract-content/0.log" Feb 02 11:51:58 crc kubenswrapper[4901]: I0202 11:51:58.649713 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nmgnj_4d1986be-5828-4d64-9da9-ffe87c0eb7ff/registry-server/0.log" Feb 02 11:51:58 crc kubenswrapper[4901]: I0202 11:51:58.682473 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j4fbj_a3aac041-08a3-4d4e-aada-69ada2387b41/extract-content/0.log" Feb 02 11:51:58 crc kubenswrapper[4901]: I0202 11:51:58.777715 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j4fbj_a3aac041-08a3-4d4e-aada-69ada2387b41/extract-content/0.log" Feb 02 11:51:58 crc kubenswrapper[4901]: I0202 11:51:58.833192 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j4fbj_a3aac041-08a3-4d4e-aada-69ada2387b41/extract-utilities/0.log" Feb 02 11:51:58 crc kubenswrapper[4901]: I0202 11:51:58.969424 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-95qfs_7a5d5ce8-6c11-4a6a-8e35-1b4809458b8e/extract-utilities/0.log" Feb 02 11:51:58 crc kubenswrapper[4901]: I0202 11:51:58.993817 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j4fbj_a3aac041-08a3-4d4e-aada-69ada2387b41/registry-server/0.log" Feb 02 11:51:59 crc kubenswrapper[4901]: I0202 11:51:59.134769 4901 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-95qfs_7a5d5ce8-6c11-4a6a-8e35-1b4809458b8e/extract-content/0.log" Feb 02 11:51:59 crc kubenswrapper[4901]: I0202 11:51:59.197769 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-95qfs_7a5d5ce8-6c11-4a6a-8e35-1b4809458b8e/extract-utilities/0.log" Feb 02 11:51:59 crc kubenswrapper[4901]: I0202 11:51:59.253921 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-95qfs_7a5d5ce8-6c11-4a6a-8e35-1b4809458b8e/extract-content/0.log" Feb 02 11:51:59 crc kubenswrapper[4901]: I0202 11:51:59.420262 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-95qfs_7a5d5ce8-6c11-4a6a-8e35-1b4809458b8e/extract-utilities/0.log" Feb 02 11:51:59 crc kubenswrapper[4901]: I0202 11:51:59.464318 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-95qfs_7a5d5ce8-6c11-4a6a-8e35-1b4809458b8e/extract-content/0.log" Feb 02 11:52:00 crc kubenswrapper[4901]: I0202 11:52:00.229970 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-95qfs_7a5d5ce8-6c11-4a6a-8e35-1b4809458b8e/registry-server/0.log" Feb 02 11:52:16 crc kubenswrapper[4901]: I0202 11:52:16.608637 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-57f656956-dt5d8_78d5a33d-6a61-4e38-8b5c-9a8bb8436628/prometheus-operator-admission-webhook/0.log" Feb 02 11:52:16 crc kubenswrapper[4901]: I0202 11:52:16.609006 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-57f656956-fhbnb_650a29c1-9b38-4b85-9104-d56f42d0d2d9/prometheus-operator-admission-webhook/0.log" Feb 02 11:52:16 crc kubenswrapper[4901]: I0202 11:52:16.619396 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-58vm8_624ec5ba-9a1f-4192-a537-c0cc6c8d5c24/prometheus-operator/0.log" Feb 02 11:52:16 crc kubenswrapper[4901]: I0202 11:52:16.839348 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-7f2km_2d9b21bd-d1dd-4c42-974f-9aa80352637f/operator/0.log" Feb 02 11:52:16 crc kubenswrapper[4901]: I0202 11:52:16.843658 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-dqgl6_169c8594-4455-4fd1-9602-8dabcd5828de/perses-operator/0.log" Feb 02 11:52:20 crc kubenswrapper[4901]: E0202 11:52:20.798971 4901 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.227:46530->38.102.83.227:43245: write tcp 38.102.83.227:46530->38.102.83.227:43245: write: broken pipe Feb 02 11:53:37 crc kubenswrapper[4901]: I0202 11:53:37.837863 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:53:37 crc kubenswrapper[4901]: I0202 11:53:37.838536 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
Feb 02 11:53:51 crc kubenswrapper[4901]: I0202 11:53:51.173847 4901 scope.go:117] "RemoveContainer" containerID="38351da7203a316bb88a9bc4bc9fd5b527c4da7056320b778191ec093675b25c"
Feb 02 11:54:04 crc kubenswrapper[4901]: I0202 11:54:04.453362 4901 generic.go:334] "Generic (PLEG): container finished" podID="2e79fb4d-bc49-4cff-a362-ef307ef70412" containerID="11a7caca7022eaa7cd9389178815a6e4321dd1e73e4a64c9ba15211741a05c37" exitCode=0
Feb 02 11:54:04 crc kubenswrapper[4901]: I0202 11:54:04.453495 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t9dw4/must-gather-r4654" event={"ID":"2e79fb4d-bc49-4cff-a362-ef307ef70412","Type":"ContainerDied","Data":"11a7caca7022eaa7cd9389178815a6e4321dd1e73e4a64c9ba15211741a05c37"}
Feb 02 11:54:04 crc kubenswrapper[4901]: I0202 11:54:04.455208 4901 scope.go:117] "RemoveContainer" containerID="11a7caca7022eaa7cd9389178815a6e4321dd1e73e4a64c9ba15211741a05c37"
Feb 02 11:54:04 crc kubenswrapper[4901]: I0202 11:54:04.949168 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-t9dw4_must-gather-r4654_2e79fb4d-bc49-4cff-a362-ef307ef70412/gather/0.log"
Feb 02 11:54:07 crc kubenswrapper[4901]: I0202 11:54:07.838173 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 02 11:54:07 crc kubenswrapper[4901]: I0202 11:54:07.839075 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 02 11:54:13 crc kubenswrapper[4901]: I0202 11:54:13.370149 4901 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-t9dw4/must-gather-r4654"]
Feb 02 11:54:13 crc kubenswrapper[4901]: I0202 11:54:13.370796 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-t9dw4/must-gather-r4654" podUID="2e79fb4d-bc49-4cff-a362-ef307ef70412" containerName="copy" containerID="cri-o://39e747d5c9bfc3db9032a65e1c0d7d365078b45e4b60c820158e1280509fcfb9" gracePeriod=2
Feb 02 11:54:13 crc kubenswrapper[4901]: I0202 11:54:13.382147 4901 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-t9dw4/must-gather-r4654"]
Feb 02 11:54:13 crc kubenswrapper[4901]: I0202 11:54:13.548869 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-t9dw4_must-gather-r4654_2e79fb4d-bc49-4cff-a362-ef307ef70412/copy/0.log"
Feb 02 11:54:13 crc kubenswrapper[4901]: I0202 11:54:13.549787 4901 generic.go:334] "Generic (PLEG): container finished" podID="2e79fb4d-bc49-4cff-a362-ef307ef70412" containerID="39e747d5c9bfc3db9032a65e1c0d7d365078b45e4b60c820158e1280509fcfb9" exitCode=143
Feb 02 11:54:14 crc kubenswrapper[4901]: I0202 11:54:14.052526 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-t9dw4_must-gather-r4654_2e79fb4d-bc49-4cff-a362-ef307ef70412/copy/0.log"
Feb 02 11:54:14 crc kubenswrapper[4901]: I0202 11:54:14.059545 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t9dw4/must-gather-r4654"
Feb 02 11:54:14 crc kubenswrapper[4901]: I0202 11:54:14.126616 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cf5v8\" (UniqueName: \"kubernetes.io/projected/2e79fb4d-bc49-4cff-a362-ef307ef70412-kube-api-access-cf5v8\") pod \"2e79fb4d-bc49-4cff-a362-ef307ef70412\" (UID: \"2e79fb4d-bc49-4cff-a362-ef307ef70412\") "
Feb 02 11:54:14 crc kubenswrapper[4901]: I0202 11:54:14.126814 4901 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2e79fb4d-bc49-4cff-a362-ef307ef70412-must-gather-output\") pod \"2e79fb4d-bc49-4cff-a362-ef307ef70412\" (UID: \"2e79fb4d-bc49-4cff-a362-ef307ef70412\") "
Feb 02 11:54:14 crc kubenswrapper[4901]: I0202 11:54:14.136370 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e79fb4d-bc49-4cff-a362-ef307ef70412-kube-api-access-cf5v8" (OuterVolumeSpecName: "kube-api-access-cf5v8") pod "2e79fb4d-bc49-4cff-a362-ef307ef70412" (UID: "2e79fb4d-bc49-4cff-a362-ef307ef70412"). InnerVolumeSpecName "kube-api-access-cf5v8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 11:54:14 crc kubenswrapper[4901]: I0202 11:54:14.228714 4901 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cf5v8\" (UniqueName: \"kubernetes.io/projected/2e79fb4d-bc49-4cff-a362-ef307ef70412-kube-api-access-cf5v8\") on node \"crc\" DevicePath \"\""
Feb 02 11:54:14 crc kubenswrapper[4901]: I0202 11:54:14.307103 4901 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e79fb4d-bc49-4cff-a362-ef307ef70412-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "2e79fb4d-bc49-4cff-a362-ef307ef70412" (UID: "2e79fb4d-bc49-4cff-a362-ef307ef70412"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 11:54:14 crc kubenswrapper[4901]: I0202 11:54:14.331137 4901 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2e79fb4d-bc49-4cff-a362-ef307ef70412-must-gather-output\") on node \"crc\" DevicePath \"\""
Feb 02 11:54:14 crc kubenswrapper[4901]: I0202 11:54:14.563051 4901 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-t9dw4_must-gather-r4654_2e79fb4d-bc49-4cff-a362-ef307ef70412/copy/0.log"
Feb 02 11:54:14 crc kubenswrapper[4901]: I0202 11:54:14.563661 4901 scope.go:117] "RemoveContainer" containerID="39e747d5c9bfc3db9032a65e1c0d7d365078b45e4b60c820158e1280509fcfb9"
Feb 02 11:54:14 crc kubenswrapper[4901]: I0202 11:54:14.563702 4901 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t9dw4/must-gather-r4654"
Feb 02 11:54:14 crc kubenswrapper[4901]: I0202 11:54:14.590458 4901 scope.go:117] "RemoveContainer" containerID="11a7caca7022eaa7cd9389178815a6e4321dd1e73e4a64c9ba15211741a05c37"
Feb 02 11:54:15 crc kubenswrapper[4901]: I0202 11:54:15.692296 4901 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e79fb4d-bc49-4cff-a362-ef307ef70412" path="/var/lib/kubelet/pods/2e79fb4d-bc49-4cff-a362-ef307ef70412/volumes"
Feb 02 11:54:37 crc kubenswrapper[4901]: I0202 11:54:37.837344 4901 patch_prober.go:28] interesting pod/machine-config-daemon-f29d8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 02 11:54:37 crc kubenswrapper[4901]: I0202 11:54:37.838093 4901 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 02 11:54:37 crc kubenswrapper[4901]: I0202 11:54:37.838164 4901 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f29d8"
Feb 02 11:54:37 crc kubenswrapper[4901]: I0202 11:54:37.839396 4901 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"362b45ff1fa8ca1b90b1f169672891d22665c51e33cf128a41e8765df1974acf"} pod="openshift-machine-config-operator/machine-config-daemon-f29d8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 02 11:54:37 crc kubenswrapper[4901]: I0202 11:54:37.839449 4901 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6" containerName="machine-config-daemon" containerID="cri-o://362b45ff1fa8ca1b90b1f169672891d22665c51e33cf128a41e8765df1974acf" gracePeriod=600
Feb 02 11:54:38 crc kubenswrapper[4901]: E0202 11:54:38.316205 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6"
Feb 02 11:54:38 crc kubenswrapper[4901]: I0202 11:54:38.897696 4901 generic.go:334] "Generic (PLEG): container finished" podID="756c113d-5d5e-424e-bdf5-494b7774def6" containerID="362b45ff1fa8ca1b90b1f169672891d22665c51e33cf128a41e8765df1974acf" exitCode=0
Feb 02 11:54:38 crc kubenswrapper[4901]: I0202 11:54:38.897752 4901 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" event={"ID":"756c113d-5d5e-424e-bdf5-494b7774def6","Type":"ContainerDied","Data":"362b45ff1fa8ca1b90b1f169672891d22665c51e33cf128a41e8765df1974acf"}
Feb 02 11:54:38 crc kubenswrapper[4901]: I0202 11:54:38.897852 4901 scope.go:117] "RemoveContainer" containerID="ed700bff0964ab4e3cffb061f658a435387232649d99d0524069bea25950bd76"
Feb 02 11:54:38 crc kubenswrapper[4901]: I0202 11:54:38.898775 4901 scope.go:117] "RemoveContainer" containerID="362b45ff1fa8ca1b90b1f169672891d22665c51e33cf128a41e8765df1974acf"
Feb 02 11:54:38 crc kubenswrapper[4901]: E0202 11:54:38.899208 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6"
Feb 02 11:54:54 crc kubenswrapper[4901]: I0202 11:54:54.676774 4901 scope.go:117] "RemoveContainer" containerID="362b45ff1fa8ca1b90b1f169672891d22665c51e33cf128a41e8765df1974acf"
Feb 02 11:54:54 crc kubenswrapper[4901]: E0202 11:54:54.677687 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6"
Feb 02 11:55:09 crc kubenswrapper[4901]: I0202 11:55:09.677655 4901 scope.go:117] "RemoveContainer" containerID="362b45ff1fa8ca1b90b1f169672891d22665c51e33cf128a41e8765df1974acf"
Feb 02 11:55:09 crc kubenswrapper[4901]: E0202 11:55:09.680506 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6"
Feb 02 11:55:24 crc kubenswrapper[4901]: I0202 11:55:24.677526 4901 scope.go:117] "RemoveContainer" containerID="362b45ff1fa8ca1b90b1f169672891d22665c51e33cf128a41e8765df1974acf"
Feb 02 11:55:24 crc kubenswrapper[4901]: E0202 11:55:24.678492 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6"
Feb 02 11:55:37 crc kubenswrapper[4901]: I0202 11:55:37.720261 4901 scope.go:117] "RemoveContainer" containerID="362b45ff1fa8ca1b90b1f169672891d22665c51e33cf128a41e8765df1974acf"
Feb 02 11:55:37 crc kubenswrapper[4901]: E0202 11:55:37.721932 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6"
Feb 02 11:55:52 crc kubenswrapper[4901]: I0202 11:55:52.677200 4901 scope.go:117] "RemoveContainer" containerID="362b45ff1fa8ca1b90b1f169672891d22665c51e33cf128a41e8765df1974acf"
Feb 02 11:55:52 crc kubenswrapper[4901]: E0202 11:55:52.678657 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6"
Feb 02 11:56:05 crc kubenswrapper[4901]: I0202 11:56:05.677988 4901 scope.go:117] "RemoveContainer" containerID="362b45ff1fa8ca1b90b1f169672891d22665c51e33cf128a41e8765df1974acf"
Feb 02 11:56:05 crc kubenswrapper[4901]: E0202 11:56:05.678985 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6"
Feb 02 11:56:17 crc kubenswrapper[4901]: I0202 11:56:17.677676 4901 scope.go:117] "RemoveContainer" containerID="362b45ff1fa8ca1b90b1f169672891d22665c51e33cf128a41e8765df1974acf"
Feb 02 11:56:17 crc kubenswrapper[4901]: E0202 11:56:17.678952 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6"
Feb 02 11:56:30 crc kubenswrapper[4901]: I0202 11:56:30.676247 4901 scope.go:117] "RemoveContainer" containerID="362b45ff1fa8ca1b90b1f169672891d22665c51e33cf128a41e8765df1974acf"
Feb 02 11:56:30 crc kubenswrapper[4901]: E0202 11:56:30.677296 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6"
Feb 02 11:56:43 crc kubenswrapper[4901]: I0202 11:56:43.679353 4901 scope.go:117] "RemoveContainer" containerID="362b45ff1fa8ca1b90b1f169672891d22665c51e33cf128a41e8765df1974acf"
Feb 02 11:56:43 crc kubenswrapper[4901]: E0202 11:56:43.680366 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6"
Feb 02 11:56:58 crc kubenswrapper[4901]: I0202 11:56:58.678135 4901 scope.go:117] "RemoveContainer" containerID="362b45ff1fa8ca1b90b1f169672891d22665c51e33cf128a41e8765df1974acf"
Feb 02 11:56:58 crc kubenswrapper[4901]: E0202 11:56:58.679737 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6"
Feb 02 11:57:09 crc kubenswrapper[4901]: I0202 11:57:09.677976 4901 scope.go:117] "RemoveContainer" containerID="362b45ff1fa8ca1b90b1f169672891d22665c51e33cf128a41e8765df1974acf"
Feb 02 11:57:09 crc kubenswrapper[4901]: E0202 11:57:09.681264 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6"
Feb 02 11:57:24 crc kubenswrapper[4901]: I0202 11:57:24.678087 4901 scope.go:117] "RemoveContainer" containerID="362b45ff1fa8ca1b90b1f169672891d22665c51e33cf128a41e8765df1974acf"
Feb 02 11:57:24 crc kubenswrapper[4901]: E0202 11:57:24.679270 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6"
Feb 02 11:57:35 crc kubenswrapper[4901]: I0202 11:57:35.677235 4901 scope.go:117] "RemoveContainer" containerID="362b45ff1fa8ca1b90b1f169672891d22665c51e33cf128a41e8765df1974acf"
Feb 02 11:57:35 crc kubenswrapper[4901]: E0202 11:57:35.679576 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6"
Feb 02 11:57:47 crc kubenswrapper[4901]: I0202 11:57:47.677360 4901 scope.go:117] "RemoveContainer" containerID="362b45ff1fa8ca1b90b1f169672891d22665c51e33cf128a41e8765df1974acf"
Feb 02 11:57:47 crc kubenswrapper[4901]: E0202 11:57:47.678269 4901 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f29d8_openshift-machine-config-operator(756c113d-5d5e-424e-bdf5-494b7774def6)\"" pod="openshift-machine-config-operator/machine-config-daemon-f29d8" podUID="756c113d-5d5e-424e-bdf5-494b7774def6"